

NAVAL POSTGRADUATE SCHOOL
Monterey, California

THESIS

ALGORITHMS AND HEURISTICS FOR TIME-WINDOW-CONSTRAINED TRAVELING SALESMAN PROBLEMS

by

Chun, Bock Jin
and
Lee, Sang Heon

September 1985

Thesis Advisor: Richard E. Rosenthal

Approved for public release; distribution is unlimited

REPORT DOCUMENTATION PAGE

Title: Algorithms and Heuristics for Time-Window-Constrained Traveling Salesman Problems
Type of Report and Period Covered: Master's Thesis; September 1985
Authors: Chun, Bock Jin and Lee, Sang Heon
Performing Organization and Controlling Office: Naval Postgraduate School, Monterey, California 93943-5100
Report Date: September 1985
Number of Pages: 103
Distribution Statement: Approved for public release; distribution is unlimited.
Key Words: Heuristic, Algorithm, Time Window, Hard Time Window, Soft Time Window, Slack, Branch and Bound, Nearest Neighbor, Penalty Cost, Traveling Salesman Problem, State-Space Relaxation
Abstract: This thesis reports on methods for solving traveling salesman problems with time-window constraints. Two types of time windows are considered: hard time windows, which are inviolable, and soft time windows, which are violable at a cost. For both cases, we develop several heuristic procedures, including some that are based on Stewart's [Ref. 6] effective heuristics for the traveling salesman problem without time-window constraints. In addition, we develop exact algorithms for each case, which are based on the state-space relaxation dynamic programming method of Christofides, Mingozzi, and Toth [Ref. 5]. Computational experience is reported for all the heuristics and algorithms we develop.

Approved for public release; distribution is unlimited

Algorithms and Heuristics for Time-Window-Constrained Traveling Salesman Problems

by

Chun, Bock Jin
Major, Republic of Korea Air Force
B.S., Korea Air Force Academy, 1976

and

Lee, Sang Heon
Major, Republic of Korea Army
B.S., Korea Military Academy, 1977

Submitted in partial fulfillment of the requirements for the degree of

MASTER OF SCIENCE IN OPERATIONS RESEARCH

from the

NAVAL POSTGRADUATE SCHOOL
September 1985

ABSTRACT

This thesis reports on methods for solving traveling salesman problems with time-window constraints. Two types of time windows are considered: hard time windows, which are inviolable, and soft time windows, which are violable at a cost. For both cases, we develop several heuristic procedures, including some that are based on Stewart's [Ref. 6] effective heuristics for the traveling salesman problem without time-window constraints. In addition, we develop exact algorithms for each case, which are based on the state-space relaxation dynamic programming method of Christofides, Mingozzi, and Toth [Ref. 5]. Computational experience is reported for all the heuristics and algorithms we develop.

TABLE OF CONTENTS

I. INTRODUCTION
   A. OVERVIEW
   B. THE TRAVELING SALESMAN PROBLEM
   C. TSP WITH TIME WINDOW CONSTRAINTS

II. HEURISTIC TSP SOLUTION
   A. OVERVIEW
      1. Tour Construction Procedures
      2. Tour Improvement Procedures
      3. Composite Procedure
   B. CCAO
      1. Algorithm
      2. Example
      3. Computational Results

III. THE TSP WITH HARD TIME WINDOW CONSTRAINTS
   A. INTRODUCTION
   B. HEURISTIC SOLUTION TECHNIQUES FOR HARD TIME WINDOWS
      1. Nearest Neighbor
      2. SCCO
      3. SCAO
      4. SLACK
   C. EXACT SOLUTION TECHNIQUES FOR HARD TIME WINDOWS
      1. State-Space Relaxation Procedure
      2. Additional Condition
      3. Branch and Bound Procedure

IV. THE TSP WITH SOFT TIME WINDOW CONSTRAINTS
   A. INTRODUCTION
   B. HEURISTIC SOLUTION TECHNIQUES FOR SOFT TIME WINDOWS
      1. Nearest Neighbor
      2. SCCO
      3. SCAO
   C. EXACT SOLUTION TECHNIQUES FOR SOFT TIME WINDOWS
      1. State-Space Relaxation Procedure
      2. Additional Condition
      3. Branch and Bound Procedure

V. COMPUTATIONAL EXPERIENCE
   A. TEST PROBLEMS
   B. COMPUTATIONAL RESULTS
      1. Hard Time Windows
      2. Soft Time Windows

VI. CONCLUSIONS AND RECOMMENDATIONS

APPENDIX A: TEST PROBLEM [1]
APPENDIX B: TEST PROBLEM [2]
APPENDIX C: TEST PROBLEM [3]
APPENDIX D: TEST PROBLEM [5]
APPENDIX E: TEST PROBLEM [6]
APPENDIX F: TEST PROBLEM FOR THE SCCO
APPENDIX G: TEST PROBLEM [1-1]
APPENDIX H: TEST PROBLEM [1-2]
APPENDIX I: TEST PROBLEM [1-3]
APPENDIX J: TEST PROBLEM [1-4]
APPENDIX K: TEST PROBLEM [2-1]
APPENDIX L: TEST PROBLEM [2-2]
APPENDIX M: TEST PROBLEM [2-3]
APPENDIX N: TEST PROBLEM [2-4]
APPENDIX O: TEST PROBLEM [3-1]
APPENDIX P: TEST PROBLEM [3-2]
APPENDIX Q: TEST PROBLEM [3-3]
APPENDIX R: TEST PROBLEM [3-4]
APPENDIX S: TEST PROBLEM [4-1]
APPENDIX T: TEST PROBLEM [4-2]
APPENDIX U: TEST PROBLEM [4-3]
APPENDIX V: TEST PROBLEM [4-4]

LIST OF REFERENCES

INITIAL DISTRIBUTION LIST

LIST OF TABLES

I    Computational Results of CCCO, CCAO
II   Computational Results of the Hard Time Windows
III  Computational Results of the Soft Time Windows

LIST OF FIGURES

1.1  Example of Nonconvexity of (1.10) in Two Dimensions
2.1  Concept of the Clarke-Wright Savings Heuristic
2.2  Initial Subtour and Insertion
2.3  Intermediate Subtour and Insertions
2.4  Final Tour of CCAO
3.1  Diagram for Hard Time Window Case
3.2  Current Tour before Modified-Oropt
3.3  Improved Tour after Modified-Oropt
3.4  Subtour in SCCO Procedure
3.5  Intermediate Subtour in SCCO Procedure
3.6  Final Subtour for SCCO
3.7  Optimal Route of Four Nodes Problem
4.1  Diagram for Soft Time Window Case
5.1  Unconstrained Solution Obtained by CCAO
5.2  Unconstrained Solution Obtained by Nearest Neighbor Heuristic

I. INTRODUCTION

A. OVERVIEW

Consider a traveling salesman who must visit n cities or customers. He starts from a depot, needs to visit each of the other n-1 cities exactly once, and then returns to the depot. The cost of traveling between any pair of cities (expressed in terms of distance, time, or cost), say c_ij from city i to city j, is given in a cost matrix C. The problem is to design a tour through the n cities that minimizes the total cost of the tour. This is known as the Traveling Salesman Problem (TSP), a well-known classical operations research problem.


The TSP is called Euclidean when the cities that must be visited all lie in the same plane and the cost of traveling between any pair of cities is the Euclidean distance between them. The TSP is an NP-complete problem [Refs. 1, 2].

All known exact solution methods have a rate of growth of computation time which is exponential in n. On the other hand, the heuristic solution methods have a rate of growth of computation time which is a low order polynomial in n, and they have been experimentally observed to perform well. For this reason, there has been an extensive amount of research directed at TSP heuristics.

In this thesis we consider adding time window constraints to the TSP.


That is, if t_i is the time that the salesman visits city i, then t_i must satisfy l_i ≤ t_i ≤ u_i, where l_i and u_i are the specified lower and upper bounds of the time window. This problem is not as well studied as the unconstrained TSP, but there have been a few approaches used on the problem.

Psaraftis [Ref. 3] has presented a dynamic programming model and solution procedure for two dial-a-ride problems, which are similar to time-window-constrained TSPs. Baker [Ref. 4] has presented an exact algorithm using a branch and bound procedure which is effective for very small n. Christofides et al. [Ref. 5] have presented a state-space relaxation dynamic programming procedure to compute bounding information within a branch and bound algorithm.

The objective of this study is to develop exact and heuristic algorithms which will provide an optimal or near-optimal tour that visits each city in its given time window. We are given a depot location, x,y co-ordinates for n cities, and a set of time windows.

A common application of the TSP is in vehicle routing problems. Given a set of customer orders, the orders must be partitioned among several vehicles. Given a partition, the problem then decomposes into one TSP for each vehicle. Because of this prospective application, and in deference to the difficulty of large TSPs (with or without time constraints), we confine our research and computation to small-scale problems.

We consider two different kinds of time window constraints: hard time windows and soft time windows. Hard windows cannot be violated. Soft windows can be violated, but a penalty cost must be paid when they are. The penalties can be defined individually for each customer, and they can differ for early and late arrivals. Generally, the penalty for arriving before the lower time window bound is much less than the penalty for arriving after the upper bound. In Chapter III we present the hard time window approach, and in Chapter IV we present the soft time window approach.

We developed several Fortran programs for solving the TSP and the time-constrained TSP. For the TSP, we use Stewart's [Ref. 6] recent heuristics, CCCO and CCAO. For the time-constrained TSP problems, we develop some new heuristics, some of which are modifications of Stewart's heuristics for the unconstrained problem. We also developed exact algorithms for both hard and soft windows using Christofides et al.'s [Ref. 5] method of state-space-relaxation dynamic programming and branch and bound. This is described in Chapters III and IV. Finally, we describe a hybrid of the heuristic and exact programs. The hybrid uses the overall structure of the exact program, but the upper bounds are obtained with the heuristic. This is discussed in Chapters III and IV.

B. THE TRAVELING SALESMAN PROBLEM

A tour is a chain which passes through all of the n nodes and in which the first and the last nodes coincide. A tour is also known as a Hamiltonian cycle. Let a tour be denoted by t = (i_1, i_2, ..., i_n, i_1) and the cost of this tour be

    C(t) = Σ_{j=1}^{n-1} c_{i_j i_{j+1}} + c_{i_n i_1}.

Here i_1, i_2, ..., i_n is a permutation of the integers from 1 to n, giving the order in which the cities are visited.
The Traveling Salesman Problem can be defined as follows. Given a graph G = {N,A} composed of a set of nodes N, a set of arcs A connecting these nodes, and a cost (distance) c_ij associated with each arc (i,j) in A, the TSP is the problem of finding the minimum cost tour of the nodes in N. The following mathematical formulation of the TSP is from Stewart [Ref. 6].

    MIN   Σ_i Σ_j c_ij x_ij                                  (1.1)

    S.T.  Σ_i x_ij = 1,               j = 1,...,n            (1.2)
          Σ_j x_ij = 1,               i = 1,...,n            (1.3)
          Σ_j y_ij − Σ_j y_ji = 1,    i = 2,...,n            (1.4)
          y_ij ≤ (n−1) x_ij,          i = 2,...,n; j = 1,...,n; i ≠ j   (1.5)
          x_ij = 0 or 1,              for all (i,j)          (1.6)
          y_ij ≥ 0,                   for all (i,j)          (1.7)

where x_ij = 1 if arc (i,j) is on the tour and 0 otherwise; the y_ij are continuous variables that force the final solution to be one route (i.e., to include every node in the same tour); and n is the number of nodes in the set N. The constraints (1.2) and (1.3) ensure that each node will be visited exactly once, while constraints (1.4) and (1.5) force the final solution to be a single tour that starts and ends at node 1 (the depot). This formulation is not used directly in our TSP programs, but it is of interest in a general discussion of the problem.

C. TSP WITH TIME WINDOW CONSTRAINTS

The time-constrained Traveling Salesman Problem is a variation of the TSP that includes time window constraints on the time to visit some of the cities. The hard time-constrained TSP is to find the minimum cost tour subject to visiting each city within its time window.

For the time-constrained TSP model, we define a continuous nonnegative variable, t_i, to be the time that the salesman visits city i. Since the salesman must return to city 1 (the depot) at the end of the tour, the formulation includes an additional variable, t_{n+1}, the total time required to complete the tour. We assume that a complete, symmetric, nonnegative distance matrix, [c_ij], is known and that time is a scalar transformation of distance. Thus time and distance may be used interchangeably. The following mathematical formulation of the TSP with time constraints is from Baker [Ref. 4].
    MIN   t_{n+1}                                            (1.8)

    S.T.  t_i − t_1 ≥ c_{1i},        i = 2,3,...,n           (1.9)
          |t_i − t_j| ≥ c_{ij},      i = 3,4,...,n; 2 ≤ j < i   (1.10)
          t_{n+1} − t_i ≥ c_{i1},    i = 2,3,...,n           (1.11)
          t_i ≥ 0,                   i = 1,2,...,n+1         (1.12)
          l_i ≤ t_i ≤ u_i,           i = 2,3,...,n           (1.13)
where

    t_i   = the time that the salesman visits city i,
    |x|   = the absolute value of x,
    c_ij  = the shortest time required to travel from city i to city j,
    l_i   = the lower bound on the time window for the salesman to visit city i (by assumption all l_i ≥ 0),
    u_i   = the upper bound on the time window for the salesman to visit city i, with u_i > l_i for all i.

The constraints (1.9) through (1.12) ensure that a nonnegative arrival time at city i, t_i, is obtained for each city (node) on the tour. The constraint (1.9) guarantees that t_1, the time that the salesman leaves node 1 (the depot), will be the smallest value. The absolute value constraint (1.10) ensures that the arrival times at any two cities differ by an amount of time sufficient to allow the salesman to travel between the two cities. The constraint (1.11) guarantees that t_{n+1}, the time the salesman returns to the depot, will be the largest value. The inequalities (1.12) and (1.13) are the nonnegativity and time window constraints, respectively.

Unfortunately, Baker's proposed model for the time-constrained TSP is very difficult to solve, because constraint (1.10) is nonconvex. Therefore, we will not use this formulation directly in our programs. Figure 1.1 illustrates the nonconvexity of constraints (1.10) for the example of one i,j pair, |t_i − t_j| ≥ 5. The feasible region for this constraint is the union of two disjoint sets. Taken all together, constraints (1.10) define 2^m disjoint sets, where m = (n-1)(n-2)/2, which are very difficult to work with. We can see that the time-constrained TSP is very different from the TSP, even in formulation.

Figure 1.1: Example of Nonconvexity of (1.10) in Two Dimensions.

II. HEURISTIC TSP SOLUTION

A. OVERVIEW

Many heuristic procedures have been developed for solving the TSP. Our purposes in this chapter are to examine some of the well-known heuristics, to review Stewart's [Ref. 6] recent heuristic, and to compare these approximate techniques on the basis of efficiency and accuracy on a small number of examples.
In general, heuristic procedures are categorized by three broad classes: tour construction procedures, tour improvement procedures, and composite procedures [Ref. 7]. Tour construction procedures start with a single node and successively add nodes until a tour is built. Tour improvement procedures attempt to find a better tour given an initial tour. Composite procedures construct a starting tour from one of the tour construction procedures and then try to find a better tour using one or more of the tour improvement procedures.

1. Tour Construction Procedures

There are many methods available for constructing an initial tour. Procedures which have been generally used are given below.

a. Nearest Neighbor (Rosenkrantz et al. [Ref. 8])

Step 1. Start with any node as the beginning of a subtour.
Step 2. Find the node closest to the last node added to the subtour. Add this node to the current subtour.
Step 3. Repeat step 2 until all nodes are contained in the tour. Then, join the first and last node.
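For concreteness, a minimal sketch of this construction is given below. The thesis' own programs are written in Fortran; this Python fragment is only illustrative, and the symmetric cost-matrix interface (c indexed from 0) is an assumption of the sketch.

    def nearest_neighbor_tour(c, start=0):
        n = len(c)
        tour = [start]
        unvisited = set(range(n)) - {start}
        while unvisited:
            last = tour[-1]
            # Step 2: pick the closest remaining node to the last node added.
            nxt = min(unvisited, key=lambda j: c[last][j])
            tour.append(nxt)
            unvisited.remove(nxt)
        tour.append(start)          # Step 3: close the cycle.
        cost = sum(c[tour[k]][tour[k + 1]] for k in range(n))
        return tour, cost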

b. Clarke and Wright Savings (Clarke and Wright [Ref. 9])

Step 1. Select any node as the central depot, which we denote as node 1.
Step 2. Compute the savings s_ij = c_1i + c_1j − c_ij for i,j = 2,3,...,n, i ≠ j.
Step 3. Order the savings from largest to smallest.
Step 4. Starting with the largest savings on the list, subtours are assembled such that the next node added has the largest remaining savings, provided that a constraint is not violated. Once a pair of nodes i and j have been linked, they remain linked. Repeat until all nodes have been assigned.

Here, the quantity s_ij is the amount of travel saved if node j is visited directly after i, as opposed to having separate trips from the depot to nodes i and j. Figure 2.1 demonstrates the procedure for two nodes i and j.
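A small illustrative fragment for steps 2 and 3 is shown below; it is a Python sketch (not the thesis' code), and the use of index 0 as the depot is an assumption of the illustration.

    def savings_list(c):
        n = len(c)
        savings = []
        for i in range(1, n):
            for j in range(i + 1, n):
                s = c[0][i] + c[0][j] - c[i][j]   # travel saved by linking i and j
                savings.append((s, i, j))
        savings.sort(reverse=True)                # Step 3: largest savings first
        return savings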
c. Insertion Procedures (Rosenkrantz et al. [Ref. 8])

An insertion algorithm constructs a feasible tour by successively adding one node to an existing subtour. This procedure takes a subtour of k nodes at iteration k and attempts to determine which node not in the subtour should join the subtour next (the Selection step). Then it determines where in the subtour it should be inserted (the Insertion step).

Figure 2.1: Concept of the Clarke-Wright Savings Heuristic (before linking and after linking).

Stewart [Ref. 6] presented the following general algorithmic structure.

Step 1. (Initial Subtour) Obtain a TSP tour for a subset N' ⊂ N of the nodes in G.
Step 2. (Selection Step) Find a node k ∈ N−N' to be added to the existing subtour.
Step 3. (Insertion Step) Choose an arc (i,j) in the subtour on N'. Insert node k between i and j, and add k to N'.
Step 4. If N' = N, then stop. (We have a Hamiltonian cycle.) Otherwise, return to step 2.

There are many variations on this algorithmic structure depending on the procedures chosen for executing steps 1, 2 and 3. Wiorkowski and McElvain [Ref. 10], Or [Ref. 11], Stewart [Ref. 12] and Norback and Love [Ref. 13] all present insertion algorithms that use the convex hull of the set of nodes N for the initial subset N'. Nemhauser and Hardgrave [Ref. 14] have shown that there exists an optimal tour for every Euclidean TSP in which the relative order of the nodes on the boundary of the convex hull is preserved. This means that the optimal tour visits the nodes on the boundary of the convex hull in the same order as if the boundary itself were followed. Further justification for the use of the convex hull for the initial subtour is shown empirically by Stewart's [Ref. 6] computational experiment. He compared several insertion heuristics both with and without the convex hull as the starting solution. The results show that all the insertion algorithms are improved by the use of the convex hull. Some are improved substantially, others only moderately.

Many criteria have been suggested for the selection of the node to be inserted in an insertion procedure.

(1) Nearest Neighbor (Rosenkrantz et al. [Ref. 8]). Choose the node k that is nearest to a node in the current tour. I.e., find k = argmin_m c_im s.t. m ∈ N−N', i ∈ N'.

(2) Cheapest Insertion (Rosenkrantz et al. [Ref. 8]). Choose the node k that may be inserted at minimal increased cost. I.e., find k = argmin_m (c_im + c_mj − c_ij) s.t. m ∈ N−N', i,j ∈ N'.

(3) Farthest Insertion (Rosenkrantz et al. [Ref. 8]). Choose the node k that is farthest from a node in the current subtour. I.e., find k = argmax_m c_im s.t. m ∈ N−N', i ∈ N'.

(4) Arbitrary Insertion (Rosenkrantz et al. [Ref. 8]). Choose node k randomly from among N−N'.

(5) Ratio Insertion (Stewart [Ref. 12]). Choose the node k such that the proportional increase in cost is minimal. I.e., find k = argmin_m (c_im + c_mj) / c_ij s.t. m ∈ N−N', i,j ∈ N'.

(6) Perpendicular Distance (Wiorkowski and McElvain [Ref. 10]). Choose the node k that is closest to an arc in the current subtour.

(7) Ratio Times Distance (Or [Ref. 11]). Choose the node k such that the product of ratio and distance is minimized. I.e., find k = argmin_m ((c_im + c_mj) / c_ij) × (c_im + c_mj − c_ij) s.t. m ∈ N−N', i,j ∈ N'.

(8) Greatest Angle (Norback and Love [Ref. 13]). Choose the node k and arc (i,j) such that the angle formed by the two arcs (i,k) and (k,j) is a maximum. I.e., find k = argmax_m angle{arc(i,m), arc(m,j)} s.t. m ∈ N−N', i,j ∈ N'.

The insertion criteria that have been used fall into two categories [Ref. 6].

1. Cheapest Insertion: Insert the node k ∈ N−N' between those two connected nodes i, j ∈ N' that minimize the quantity c_ik + c_kj − c_ij.
2. Identical Insertion and Selection: Do selection and insertion in the same step.

2. Tour Improvement Procedures

The best known procedures of this type for the TSP are the branch exchange heuristics [Ref. 7]. These branch exchange heuristics work as follows.

Step 1. Find an initial tour. This tour may be chosen randomly from the set of all possible tours, or it may be generated by one of the tour building procedures above.
Step 2. Improve the tour using one of the branch exchange heuristics.
Step 3. Continue step 2 until no additional improvement can be made.


For a given tour, we define a k-change of a tour as consisting of the deletion of k branches in a tour and their replacement by k other branches to form a new tour. A tour is k-opt if it is not possible to improve the tour via a k-change. In general, the larger the value of k, the more likely it is that the k-opt solution will be optimal. Unfortunately, the number of operations necessary to test all k-exchanges is proportional to n^k, where n is the number of nodes in the TSP. Due to this complexity, values of k = 2 and k = 3 are most commonly used [Ref. 7]. The 2-opt and 3-opt heuristics were introduced by Lin [Ref. 15], and the k-opt procedure, for k > 3, was presented by Lin and Kernighan [Ref. 16].

Or [Ref. 11] has designed a modified 3-opt that considers only a small percentage of the 3-branch exchanges. This modified 3-opt, called Oropt by Stewart [Ref. 6], considers only those branch exchanges which are composed of a string of one, two, or three adjacent nodes being inserted between two other nodes in the current tour. By limiting the number of exchanges that are considered in this way, Oropt requires many fewer calculations than a full 3-opt.
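As an illustration of the simplest branch exchange, the following sketch performs one improvement pass of 2-changes on a closed tour. It is illustrative Python rather than the thesis' Fortran, and it assumes a symmetric cost matrix.

    def two_opt_pass(tour, c):
        # One pass of 2-change improvement over a closed tour given as a node list.
        n = len(tour)
        improved = False
        for a in range(n - 1):
            for b in range(a + 2, n):
                if a == 0 and b == n - 1:
                    continue                      # this pair shares a node; skip it
                i, j = tour[a], tour[a + 1]
                k, l = tour[b], tour[(b + 1) % n]
                delta = (c[i][k] + c[j][l]) - (c[i][j] + c[k][l])
                if delta < 0:                     # replace branches (i,j),(k,l) by (i,k),(j,l)
                    tour[a + 1:b + 1] = reversed(tour[a + 1:b + 1])
                    improved = True
        return improved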
Stewart [Ref. 6] made an experiment of the convex hull, cheapest insertion, angle selection algorithm (CCA), which will be discussed in the next section, as a stand-alone algorithm and with each of three post-processors. The algorithms are designated CCA, CCA2, CCAO, and CCA3 for the convex hull cheapest insertion stand-alone, with 2-opt, with Oropt, and with 3-opt, respectively. He drew two conclusions from his computational results. First, the 2-opt or the 3-opt requires substantially more time than the Oropt. Second, the 2-opt is dominated by the Oropt and the 3-opt in quality of solution.

In computation time, Oropt looks at only approximately 3n^2 of the n^3 possible exchanges on each pass. There are n ways to select the first branch, times n ways to select the second branch, times n−2 ways to select the third branch. This accounts for the fairly close times for the 2-opt and the Oropt. The quality of CCAO solutions dominates the CCA2 solutions. On the other hand, there is little or no difference between the Oropt and the 3-opt in terms of solution quality.

Stewart's main conclusion from the above experiment is that the Oropt performs as well as a 3-opt in a small percentage of the computer time required by 3-opt, and it should be preferred to both the 2-opt and the 3-opt for Euclidean TSPs.

3. Composite Procedure

The basic composite procedure is a combination of the tour construction and branch exchange procedures. It is obtained by appending a branch exchange procedure to the tour construction algorithm as a post-processor. The procedure can be stated as follows [Ref. 17].

Step 1. Obtain an initial tour using one of the tour construction procedures.
Step 2. Apply a branch exchange procedure to the solution produced by step 1. Stop when no further improvement can be made.

The composite procedure is relatively fast computationally and gives good results [Ref. 18].

B. CCAO

1. Algorithm

The CCAO algorithm designed by Stewart [Ref. 6] uses the convex hull of the nodes in N for its initial subtour. Then it inserts the nodes not currently in the subtour where they may be inserted most cheaply (the Cheapest Insertion criterion). It selects the node k to be inserted at each iteration according to how large an angle is formed by the two arcs that must be added to the current subtour in order to insert k (the Selection criterion). Finally it uses an Oropt to make local improvements on the tour constructed in the first stage. CCAO means Convex Hull, Cheapest Insertion, Angle Selection, Oropt.

Algorithm CCAO:
Input: Number of nodes, x and y co-ordinates of all nodes.
Output: Ordered list of tour, total cost.

Step 1. (Initial Subtour) Find the convex hull of the set of nodes N. Call the set of nodes on the boundary N'. Let the initial subtour be the nodes of N' in the same order as they appear on the convex hull.
Step 2. (Cheapest Insertion) For each node m ∈ N−N', find (i_m, j_m) = argmin_{i,j ∈ N'} (c_im + c_mj − c_ij) s.t. i and j are connected.
Step 3. (Greatest Angle Selection) For the next insertion, select the node k that maximizes the angle between the arcs (i_m, m) and (m, j_m) over all m ∈ N−N'. I.e., find k = argmax_m angle{(i_m, m), (m, j_m)} s.t. m ∈ N−N'. Insert k between i_k and j_k, and add k to N'.
Step 4. If N' = N, go to step 5. Otherwise, return to step 2.
Step 5. Apply an Oropt to the current tour. Stop when no further improvements can be found.

End of algorithm CCAO
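The following fragment sketches the selection and insertion loop of steps 2 through 4, assuming coords[i] holds the (x,y) co-ordinates, c is the cost matrix, and subtour is the ordered convex-hull subtour produced by step 1. It is an illustration of the idea in Python, not the thesis' Fortran implementation.

    import math

    def angle_at(m, i, j, coords):
        # Interior angle at node m formed by the arcs (i,m) and (m,j); assumes distinct points.
        ax, ay = coords[i][0] - coords[m][0], coords[i][1] - coords[m][1]
        bx, by = coords[j][0] - coords[m][0], coords[j][1] - coords[m][1]
        dot = ax * bx + ay * by
        na, nb = math.hypot(ax, ay), math.hypot(bx, by)
        return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

    def ccao_insertion_phase(subtour, coords, c):
        remaining = set(range(len(c))) - set(subtour)
        while remaining:
            best = None
            for m in remaining:
                # Step 2: cheapest insertion position for m on the current subtour.
                pos, (i, j) = min(
                    ((p, (subtour[p], subtour[(p + 1) % len(subtour)]))
                     for p in range(len(subtour))),
                    key=lambda t: c[t[1][0]][m] + c[m][t[1][1]] - c[t[1][0]][t[1][1]])
                # Step 3: prefer the candidate forming the largest angle (i,m),(m,j).
                a = angle_at(m, i, j, coords)
                if best is None or a > best[0]:
                    best = (a, m, pos)
            _, k, pos = best
            subtour.insert(pos + 1, k)
            remaining.remove(k)
        return subtour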
2. Example

Figures 2.2 through 2.4 illustrate the above algorithm on the TSP defined as test problem [1] in Appendix A. First the convex hull is generated for an initial starting subtour. This subtour consists of nodes 2, 13, 12, 14, 5, 15, 7, 4. A solid line marks the boundary of the convex hull in Figure 2.2.

Figure 2.2: Initial Subtour and Insertion.

In step 2, each of the interior nodes (1, 3, 6, 8, 9, 10, 11, 16) is associated with a pair of connected nodes on the initial subtour (the dashed lines in Figure 2.2). In step 3, the dashed lines that form the greatest angle (closest to 180 degrees) identify the node to be inserted (node 10 in this example).

Figure 2.3 shows the problem after the first three insertions (node 10, node 1, and node 8, in that order). Notice that some nodes not in the subtour are now associated with new node pairs.

Figure 2.3: Intermediate Subtour and Insertions.

Figure 2.4 shows the final tour for stage one. This tour is now passed to an Oropt post-processor. In this case the tour from stage one appears from inspection to be optimal, and Oropt will find no improvement.

Figure 2.4: Final Tour of CCAO.

3. Computational Results

In addition to CCAO, CCCO (Convex hull, Cheapest insertion, Cheapest selection, Oropt) has been coded for the purpose of comparison. The only difference between CCAO and CCCO is that CCCO uses the cheapest selection criterion instead of the greatest angle selection of CCAO. We used Sedgewick's [Ref. 19] package wrapping algorithm for finding the convex hull (initial subtour).

Starting with some point that is guaranteed to be on the convex hull (called the anchor; say the one with the smallest y co-ordinate), take a horizontal ray in the positive direction and sweep it upward until hitting another point. This point is on the hull. Then start at that point and continue sweeping until hitting another point, etc. The package is completely wrapped when the first point is included again. The following algorithm finds the convex hull of an array L(1,...,n) of nodes; the node L(n+1) is used as a sentinel, that is, a copy of the first node which is used to signal completion of the procedure. The variable NH is maintained as the number of nodes so far included on the hull.

Algorithm Package Wrapping:
Input: Number of nodes, x and y co-ordinates of all nodes.
Output: Ordered list of the convex hull and the number of nodes included on the convex hull.

Step 1. (Initialization) Find and duplicate the anchor. I.e., find NMIN = argmin_i y(L(i)), set NH = 0, and set L(n+1) = L(NMIN).
Step 2. (Swap nodes NH and NMIN) Put the last node found into the hull by exchanging it with the NHth node: NH = NH + 1; TEMP = L(NH); L(NH) = L(NMIN); L(NMIN) = TEMP.
Step 3. (Compute angle) Compute the angle from the horizontal made by the line between L(NH) and each of the nodes not yet included on the hull.
Step 4. (Find next hull node) Find the node whose angle is smallest among those with angles bigger than the current 'sweep' angle (the angle from the horizontal to the line between L(NH-1) and L(NH)). Stop when the first point is encountered again, i.e., L(n+1) = L(NMIN). Otherwise, go to step 2.

End of algorithm Package Wrapping
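An illustrative Python rendering of the package wrapping procedure is sketched below. It uses the pseudo-angle THETA discussed next in place of a true angle; the list-of-points interface and helper names are assumptions of the sketch, and the thesis' own implementation is in Fortran.

    def theta(p1, p2):
        # Pseudo angle in [0, 4): same ordering as the true angle of the ray p1 -> p2.
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        ax, ay = abs(dx), abs(dy)
        t = 0.0 if ax + ay == 0 else dy / (ax + ay)
        if dx < 0:
            t = 2.0 - t
        elif dy < 0:
            t = 4.0 + t
        return t

    def package_wrap(points):
        # Return the indices of the convex hull, starting from the lowest point.
        n = len(points)
        anchor = min(range(n), key=lambda i: points[i][1])   # smallest y co-ordinate
        hull, sweep = [anchor], 0.0                           # start with a horizontal ray
        while True:
            last = hull[-1]
            best, best_t = None, None
            for j in range(n):
                if j == last or (j in hull and j != anchor):
                    continue
                t = theta(points[last], points[j])
                if t >= sweep and (best is None or t < best_t):
                    best, best_t = j, t                       # smallest angle >= sweep angle
            if best is None or best == anchor:                # wrapped back to the anchor: done
                return hull
            hull.append(best)
            sweep = best_t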

We used Sedgewick's Pseudo Angle for finding the smallest angle in step 3, which is coded as the 'THETA' function. This function returns a real number between 0.0 and 4.0 that is not the angle made by L1 and L2 with the horizontal, but which has the same order properties as the true angle. If dx and dy are the delta x and delta y distances from some node to the anchor node, then the angle needed in this algorithm is the arctangent of dy/dx. However, the arctangent function is likely to be slow, and it leads to at least two annoying extra conditions to compute: whether dx is zero, and which quadrant the point is in. In this algorithm we only need to be able to compare angles, not measure them. Thus it makes sense to use a function that is much easier to compute than the true angle but has the same ordering properties as the true angle. A good candidate for such a function is simply dy / (dy + dx). Testing for exceptional conditions is still necessary, but simpler.

Function THETA: Pseudo Angle
Input: the two nodes L1 and L2.
Output: Pseudo angle made by L1 and L2 with the horizontal line.

begin
    dx = x(L2) - x(L1);  ax = abs(dx)
    dy = y(L2) - y(L1);  ay = abs(dy)
    if (dx = 0 and dy = 0) then t = 0.0
    else t = dy / (ax + ay)
    if dx < 0 then t = 2.0 - t
    else if dy < 0 then t = 4.0 + t
end

End of function THETA

Figure 2.2 shows how the hull is discovered in this way. We also used Sedgewick's Pseudo Angle for finding the greatest angle selection point. The data for our test problems is given in the Appendix.

The computational results are summarized in Table I. As can be seen in Table I, CCAO is faster than CCCO on the small-scale test problems (below 30 nodes), but CCCO is faster than CCAO on the moderately large sized problems (over 50 nodes). Generally, the accuracy is almost identical in both cases. Stewart [Ref. 6] showed that the CCAO algorithm outperforms any other insertion and selection algorithms on large scale problems. Thus, we are highly motivated to use a modification of the CCAO algorithm for solving the time-window constrained TSP.

TABLE I
Computational Results of CCCO, CCAO

                                          CCCO                       CCAO
Problem   Number of   Best Known    % Over    CPU Time     % Over    CPU Time
Number    Nodes n     Solution      Best      (sec)        Best      (sec)
[1]       16          66.6039       0.00      0.0133       0.00      0.0066
[2]       22          469.0288      0.00      0.0233       0.00      0.0100
[3]       22          278.4371      0.00      0.0166       0.00      0.0066
[5]       51          429.7000      3.94      0.1897       2.72      0.2562
[6]       76          552.9000      1.64      0.5857       1.54      0.6889

CPU times in seconds on IBM 3033.

III. THE TSP WITH HARD TIME WINDOW CONSTRAINTS

A. INTRODUCTION

The first time-constrained TSP we consider is the case in which late arrivals are not allowed, and early arrivals must wait for the opening of the time window before they can begin to serve the customer. This is called the hard time window case, and it is illustrated in Figure 3.1.

Figure 3.1: Diagram for Hard Time Window Case.

The hard time window case corresponds to military operations and to some civilian distribution problems. Meeting a deadline is considered a critical factor in this case. The soft time window case will be discussed in the next chapter.

Consider a graph G = {N,A} composed of a set of nodes N and a set of arcs A connecting these nodes. We now define some notation to be used throughout our discussion of the time-window-constrained TSP.

l_i     = Lower bound on the time window at node i (earliest allowable arrival time at city i).
u_i     = Upper bound on the time window at node i (latest allowable arrival time at city i).
d_i     = Time required to spend at node i (service time at city i).
SPEED   = Constant speed at which the vehicle travels.
dist_ij = Distance from i to j.
c_ij    = Travel time from i to j; c_ij = dist_ij / SPEED. (Note: we use c_ij and c(i,j) interchangeably.)
depot   = Depot (home) node.
L       = (L(1), L(2), ..., L(n)) = tour with n stops, visited in the order L(1), L(2), ..., L(n).
ARRVT_i = Arrival time at city i.
WAIT_i  = Waiting time at node i for the hard time window.

We also use l(i), u(i), d(i), ARRVT(i), and WAIT(i) interchangeably with l_i, u_i, d_i, ARRVT_i, and WAIT_i.

B. HEURISTIC SOLUTION TECHNIQUES FOR HARD TIME WINDOWS

1. Nearest Neighbor

The following is a Nearest Neighbor heuristic similar to the one used in the unconstrained TSP. At each iteration we add a new node to the end of the subtour. It is the first node that can be visited from the last node of the subtour, taking into account any waiting time that might be necessary due to the lower time window bounds.

Algorithm Nearest Neighbor:
Input: Number of nodes, x and y co-ordinates of all nodes, time windows for all nodes.
Output: Ordered list of tour, total travel time.

Step 1. (Initialization) Start at the depot. Let i = depot, N' = {i}.
Step 2. Compute ARRVT_k for all nodes k ∈ N−N' that can be visited directly after i: ARRVT_k = max{ARRVT_i, l_i} + d_i + c_ik.
Step 3. If ARRVT_k > u_k for all such k, then stop (no feasible solution).
Step 4. If ARRVT_k < l_k, then cost_k = l_k. Otherwise, cost_k = ARRVT_k.
Step 5. (Nearest Neighbor Selection) Choose the node k ∈ N−N' such that cost_k is a minimum. I.e., find k = argmin_j cost_j s.t. j ∈ N−N'.
Step 6. (Insertion) Insert k after i, add k to the subtour N', and let i = k.
Step 7. If N' = N, go to the next step. Otherwise, return to step 2.
Step 8. Compute the total travel time, then stop. Total travel time = max{ARRVT_k, l_k} + d_k + c_{k,depot}, where k is the last node visited.

End of algorithm Nearest Neighbor
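A compact sketch of this heuristic is given below. It is illustrative Python rather than the thesis' Fortran, and the data layout (cost matrix c, window bounds l and u, service times d, depot index 0) is assumed.

    INF = float("inf")

    def nn_hard_windows(c, l, u, d, depot=0):
        n = len(c)
        tour, i, arrv = [depot], depot, {depot: 0.0}
        unvisited = set(range(n)) - {depot}
        while unvisited:
            depart = max(arrv[i], l[i]) + d[i]          # leave i after waiting and service
            best, best_cost = None, INF
            for k in unvisited:
                t = depart + c[i][k]                    # arrival time at k
                if t > u[k]:                            # late arrivals are not allowed
                    continue
                cost = max(t, l[k])                     # wait for the window to open
                if cost < best_cost:
                    best, best_cost, best_arrv = k, cost, t
            if best is None:
                return None                             # the heuristic fails: no feasible move
            arrv[best] = best_arrv
            tour.append(best)
            unvisited.remove(best)
            i = best
        total = max(arrv[i], l[i]) + d[i] + c[i][depot]  # step 8: return to the depot
        return tour + [depot], total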


This solution is constructed by starting at the depot and moving to the nearest neighboring customer that has not yet been visited, as long as the upper bound is not violated. This heuristic may fail to solve the problem.

2. SCCO

This algorithm is designed for the case when some of the nodes do not have time windows. We call these nodes "time free". SCCO is similar to the cheapest selection, cheapest insertion method for the unconstrained TSP, except that the nodes with time windows are treated differently from the time free nodes. The nodes with windows are inserted in order of increasing upper time window bound. The time free nodes are inserted between these nodes by cheapest selection and cheapest insertion, for as long as the upper bound time windows will allow. In the end, a Modified Oropt is used to improve the solution.

There is one possible difficulty with this approach. It may become impossible to reach some of the time-constrained nodes before their upper bound. In this case, we must delete some node(s) from the subtour. Whenever we see an upper bound that cannot be satisfied, we select a node to delete by the following criteria. The first criterion is the width of the time window; hence, time-free nodes are considered first. Then, if several nodes in the subtour are tied for the widest time window, we select for deletion the node that results in the greatest time saved. The algorithm is summarized as follows.

Algorithm Successive Cheapest Cheapest Oropt (SCCO):
Input: Number of nodes, x and y co-ordinates of all nodes, time windows for all nodes.
Output: Ordered list of tour, total travel time.

Step 1. (Initialization) Start at the depot. Let i = depot, N' = {i}.
Step 2. Set k = argmin_j u_j s.t. j ∈ N−N'. If k is a time free node, then set k = depot.
Step 3. Calculate ARRVT_k = max{ARRVT_i, l_i} + d_i + c_ik.
Step 4. If ARRVT_k ≤ u_k, then go to step 5. Otherwise, select the time free node m ∈ N' which results in the greatest time saved for deletion. Delete node m from N', and go to step 3.
Step 5. Add node k to the subtour N'.
Step 6. Insert time free nodes j ∈ N−N' between nodes i and k by cheapest selection and cheapest insertion (same as CCCO) for as long as ARRVT_k does not exceed u_k. If ARRVT_k < l_k, then set ARRVT_k = l_k.
Step 7. If N' = N, then go to the next step. Otherwise, let i = k and go to step 2.
Step 8. Apply the Modified Oropt procedure to the current tour. Stop when no further improvements can be found.

End of algorithm SCCO

"Successive" means
the smallest

select the node


In the SCCO

upper

bound.

successively by algorithm, if the

36

salesman arrives before the lower bound of the time window, we set the arrival time equal to the adding waiting time,
lower bound.
The Modified Oropt procedure for- improving the solu-

This procedure consider only those exchanges that would result in a node being inserted between two other nodes in the current tour.
tion is described below.

Figure 3.2: Current Tour before Modified-Oropt.

Figures 3.2 and 3.3 are helpful for understanding how the procedure works. In both figures, i, j, k, l, and m are nodes in the current tour. Nodes l and m are considered to be adjacent to node k. A test is then conducted to determine if node k can be located between two other nodes, such as i and j, so that it results in reduced total travel time. If it can, we make the appropriate arc exchanges, then update the total cost and route orders.

Figure 3.3: Improved Tour after Modified-Oropt.

In this example, the three arcs (i,j), (k,l), and (k,m) are removed and replaced by (i,k), (j,k), and (l,m). When no further exchanges improve the solution, the algorithm terminates.
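One Modified Oropt test can be sketched as follows. The fragment only evaluates the travel-time change of relocating a single node k between two adjacent tour nodes i and j (in the time-window setting the feasibility of the resulting arrival times must also be checked), and the list-based tour representation is an assumption of this illustration.

    def try_relocate(tour, k_pos, i_pos, c):
        # tour is a list of node indices on a closed route; k_pos/i_pos index positions in it.
        n = len(tour)
        l, k, m = tour[k_pos - 1], tour[k_pos], tour[(k_pos + 1) % n]
        i, j = tour[i_pos], tour[(i_pos + 1) % n]
        if k in (i, j):
            return tour, 0.0
        removed = c[i][j] + c[l][k] + c[k][m]
        added = c[i][k] + c[k][j] + c[l][m]
        delta = added - removed
        if delta < 0:                           # the move shortens the tour: apply it
            new = [x for x in tour if x != k]
            new.insert(new.index(i) + 1, k)
            return new, delta
        return tour, 0.0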

Figure 3.4: Subtour in SCCO Procedure.

Figures 3.4 through 3.6 illustrate the SCCO algorithm for the TSP with hard time windows given in Appendix F, as in test problem [1]. In this problem 10 of the 16 nodes have time windows; the other 6 nodes are time free. First, the subtour starts at the depot (node 16) and we insert the node with the smallest upper bound (node 12). We examine all nodes which could be inserted between 16 and 12 as long as the upper bound on node 12 is observed.

Figure 3.5: Intermediate Subtour in SCCO Procedure.

In this case there is no such node. Then we select the next smallest upper bound (node 14), add it to the tour, look for nodes to insert before it, and continue in this manner. Now we have formed the partial tour 16, 12, 14, 11, 6, as shown in Figure 3.4.


As shown in Figure 3.5, we can insert time free nodes between node 6 and node 3. These insertions are made according to the cheapest insertion and cheapest selection criteria. We do not make any further insertions because they would cause a time window violation at node 3.

Figure 3.6: Final Subtour for SCCO.

Figure 3.6 shows the final tour for the SCCO heuristic. This tour is passed to a Modified Oropt, but in this case it will find no improvement.

3. SCAO

This heuristic is identical to SCCO except for the use of the greatest angle selection criterion for the time-free nodes, instead of cheapest selection.

Algorithm Successive Cheapest Angle Oropt (SCAO):
Input: Number of nodes, x and y co-ordinates of all nodes, time windows for all nodes.
Output: Ordered list of tour, total travel time.

Step 1. (Initialization) Start at the depot. Let i = depot, N' = {i}.
Step 2. Set k = argmin_j u_j s.t. j ∈ N−N'. If k is a time free node, then set k = depot.
Step 3. Calculate ARRVT_k = max{ARRVT_i, l_i} + d_i + c_ik.
Step 4. If ARRVT_k ≤ u_k, then go to step 5. Otherwise, select the time free node m ∈ N' which results in the greatest time saved for deletion. Delete node m from N', and go to step 3.
Step 5. Add node k to the subtour N'.
Step 6. Insert time free nodes j ∈ N−N' between nodes i and k by cheapest insertion and greatest angle selection (same as CCAO) for as long as ARRVT_k does not exceed u_k. If ARRVT_k < l_k, then set ARRVT_k = l_k.
Step 7. Let i = k. If N' = N, then go to the next step. Otherwise, go to step 2.
Step 8. Apply the Modified Oropt procedure to the current tour. Stop when no further improvement can be found.

End of algorithm SCAO

This algorithm is the same as SCCO except that greatest angle selection is used instead of the cheapest selection used in SCCO.


4. SLACK

This heuristic was suggested by Professor Rosenthal. It is designed for the case when the widths between the upper and lower bounds of the time windows are relatively large. In this heuristic, the SLACK is the most important concept. SLACK(i) can be defined as the maximum amount of time by which an arrival at node i can be delayed without causing an upper bound to be violated for a node currently on the tour. The SLACK function can be defined as a recursive function as follows:

    SLACK(L(i)) = min{ u(L(i)) − ARRVT(L(i)), SLACK(L(i+1)) + WAIT(L(i+1)) },

where WAIT(L(i)) = max{ 0, l(L(i)) − ARRVT(L(i)) }.

The first element of this recursive function is the difference between the upper bound and the arrival time at node L(i). The second one is the sum of the next node's SLACK and the waiting time at the next node L(i+1). The minimum of these two elements is the possible delay of the arrival time at node L(i) without violating the upper bound of any node after L(i) in the current tour.

The advantage of this recursive function is that it makes it easy to calculate a possible delay time without calculating new arrival times for all nodes after L(i). The algorithm is summarized as follows.

Algorithm SLACK:
Input: Number of nodes, x and y co-ordinates of all nodes, time windows for all nodes.
Output: Ordered list of tour, total travel time.

Step 1. (Initialization) Start at the depot. Let N' = {depot}.
Step 2. Sort the upper time windows: U = (u_1, u_2, ..., u_n) s.t. u_1 ≤ u_2 ≤ ... ≤ u_n.
Step 3. Set k = argmin_i u_i s.t. i ∈ N−N'.
Step 4. Find a node L(ISTAR) after which node k can be inserted in the current sequence, if such a node exists. (The criteria by which we determine whether an insertion can be made are given below.) Go to step 7.
Step 5. If there is no such place to insert node k, then try to find a node L(ISWAP) in the current sequence such that k can replace L(ISWAP) and L(ISWAP) has a good chance of being reinserted somewhere else. Select the ISWAP which has the largest time window width among the candidates for ISWAP. If there is no candidate, then stop (no feasible solution).
Step 6. Do the swap (add k to N', delete L(ISWAP) from N', and set k = L(ISWAP)), then update the slack and arrival times. Go to step 3.
Step 7. Select the node L(I) which results in the minimum additional travel, i.e., the node which minimizes the quantity c(L(I),k) + c(k,L(I+1)) − c(L(I),L(I+1)).
Step 8. Insert k after L(ISTAR), add k to N', and update the slack and arrival times.
Step 9. If N' = N, then stop. Otherwise, go to step 3.

End of algorithm SLACK

This procedure starts by sorting u_1, u_2, ..., u_n into ascending order using a heapsort [Ref. 20]. This array is used to select a node in ascending order for insertion. Then we find a node L(I) after which node k can be inserted in the current sequence, if such a node exists. Since the upper bound cannot be violated, this step must be performed. There are two tests which are administered to determine if k can be inserted after L(I). First, the arrival time at node k if k succeeds L(I), which is called TEST1, must not be greater than the upper bound u_k. Second, the resulting delay in arrival at L(I+1) if k precedes L(I+1), which is called TEST2, must not be greater than SLACK(L(I+1)). We can calculate TEST1 and TEST2 as follows.

    TEST1 = arrival time at node k if k succeeds L(i)
          = max{ ARRVT(L(i)), l(L(i)) } + d(L(i)) + c(L(i),k).

    TEST2 = delay in arrival at L(i+1) if k precedes L(i+1)
          = max{ TEST1, l(k) } + d(k) + c(k,L(i+1)) − ARRVT(L(i+1)).
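The SLACK bookkeeping and the two insertion tests can be sketched as follows; the dictionary-based data layout is an assumption of this Python illustration, not the thesis' Fortran code.

    def compute_slack(tour, arrv, l, u):
        # SLACK computed backwards along the tour via the recursion given above.
        slack = {}
        nxt = float("inf")                           # nothing constrains the last stop from behind
        for node in reversed(tour):
            wait = max(0.0, l[node] - arrv[node])
            slack[node] = min(u[node] - arrv[node], nxt)
            nxt = slack[node] + wait                 # SLACK(L(i+1)) + WAIT(L(i+1)) for the node before
        return slack

    def can_insert_after(i_node, next_node, k, arrv, l, u, d, c, slack):
        # TEST1/TEST2 feasibility check for inserting k between i_node and next_node.
        test1 = max(arrv[i_node], l[i_node]) + d[i_node] + c[i_node][k]
        if test1 > u[k]:                             # TEST1: k itself would arrive too late
            return False
        test2 = max(test1, l[k]) + d[k] + c[k][next_node] - arrv[next_node]
        return test2 <= slack[next_node]             # TEST2: the delay is absorbed by the slack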

If there exists more than one node L(I) after which node k can be inserted, we select L(I) according to the criterion of least additional travel time. This additional travel time, called TEST3, is given by

    TEST3 = c(L(I),k) + c(k,L(I+1)) − c(L(I),L(I+1)).

When we insert node k after L(I), we update the arrival times and SLACKs. In the updating process, we compute updated values of SLACK only for the nodes whose SLACK actually changes as a result of the insertion. If there is no place to insert node k, we call a subroutine called 'TSWAP'. TSWAP tries to find a node L(ISWAP) in the current sequence such that k can replace L(ISWAP) and L(ISWAP) has a good chance of being reinserted somewhere else. TSWAP uses TEST1 and TEST2 to find candidates for ISWAP and then uses the largest time window width to select ISWAP. If such an ISWAP exists, then we do the swap, update SLACKs and arrival times, and try to insert again.

C. EXACT SOLUTION TECHNIQUES FOR HARD TIME WINDOWS

1. State-Space Relaxation Procedure

A dynamic programming model of the time-constrained TSP has been developed by Christofides et al. [Ref. 5]. We applied their approach to compute bounding information within a branch and bound algorithm.

Consider the TSP with time window constraints defined on the graph G = {N,A}, where N is the set of all nodes of G and A is a set of arcs. Let R(j) be the set of nodes from which it is possible to go directly to node j. We can initially set R(j) = { i ∈ N−{j} : l_i + d_i + c_ij ≤ u_j }, because it is impossible to go directly from node i to node j if the earliest possible arrival time at node j exceeds the upper time window of node j.

Let f(S,j) be the duration of the least time path starting at node 1, passing through every node of S, and finishing at node j, given j ∈ S and S ⊆ S' = N − {1}. We can calculate a minimum arrival time at node j as

    T(S,j) = min_{i ∈ S−{j}} [ f(S−{j},i) + d_i + c_ij ].      (3.1)

Then

    f(S,j) = T(S,j)  if l_j ≤ T(S,j) ≤ u_j,
    f(S,j) = l_j     if T(S,j) < l_j,
    f(S,j) = ∞       if T(S,j) > u_j,

with the initialization:

    f({j},j) = c(1,j)  if l_j ≤ c_1j ≤ u_j,
    f({j},j) = l_j     if c_1j < l_j,
    f({j},j) = ∞       if c_1j > u_j.

In equation (3.1) the minimum arrival time at node j, passing through the nodes in the set S, is described as the sum of three terms: the first is the duration of the least time path passing through the nodes in the set S−{j} and ending at node i, the second is the time required to spend at node i, and the third is the travel time from node i to node j. The f(S,j) can be calculated for all subsets S of S' and for all nodes j by using equation (3.1) recursively. Finally, the optimum solution can be calculated as

    min_{i ∈ S'} [ f(S',i) + d_i + c_i1 ].

Since the computer storage requirements increase exponentially with the size of the problem, this method is limited to small problems. The total number of f(S,j) values, when S' contains n−1 nodes, is Σ_{k=1}^{n-1} k (n−1 choose k), since f(S,j) must be calculated for all subsets S of S', and since each node in S must be considered as a possible end-node j. Therefore the storage requirement for f(S,j), in an n node problem, is given by [Ref. 21]

    Σ_{k=1}^{n-1} k (n−1 choose k) = (n−1) 2^{n−2}.      (3.2)

The storage requirements to solve a 22 node problem exceed 22,020,096. For relaxing this limitation, Christofides et al. [Ref. 5] proposed a state space relaxation procedure, which is analogous to Lagrangean relaxation [Ref. 22] in integer programming.

The state space associated with a given dynamic programming recursion is relaxed in such a way that the solution to the relaxed recursion provides a bound which can be embedded in a general branch and bound method [Ref. 23]. We describe Christofides et al.'s method for doing this below.

Consider the dynamic programming formulation (3.1). The state variable in that formulation is (S,j), and the stage is the cardinality of S. Let g(S) be a mapping from the domain of (S,j) to some other vector space (g(S),j), and let

    H(g(S),j) = { (g(S−{j}),i) : i ∈ (S−{j}) ∩ R(j) }.      (3.3)

Since we are interested in lower bounds to the TSP with time constraints, H(g(S),j) in (3.3) may be replaced by any larger set that is easier to compute. Thus, H(g(S),j) can be defined by the following equation:

    H(g(S),j) = { (g(S−{j}),i) : i ∈ E(g(S),j) },      (3.4)

where E(g(S),j) ⊇ (S−{j}) ∩ R(j). For calculating the lower bound of the problem, equation (3.1) can be changed to the following equations:

    T(g(S),j) = min_{(g(S−{j}),i) ∈ H(g(S),j)} [ f(g(S−{j}),i) + d_i + c_ij ]      (3.5)
              = min_{i ∈ E(g(S),j)} [ f(g(S−{j}),i) + d_i + c_ij ].      (3.6)

This gives us:

    f(g(S),j) = T(g(S),j)  if l_j ≤ T(g(S),j) ≤ u_j,
    f(g(S),j) = l_j        if T(g(S),j) < l_j,
    f(g(S),j) = ∞          if T(g(S),j) > u_j,

with the initialization:

    f(g({j}),j) = c_1j  if l_j ≤ c_1j ≤ u_j,
    f(g({j}),j) = l_j   if c_1j < l_j,
    f(g({j}),j) = ∞     if c_1j > u_j.

Finally, the optimum solution can be calculated as

    min_{i ∈ E(g(S'),1)} [ f(g(S'),i) + d_i + c_i1 ].

The mapping can be selected from any separable function. Christofides et al. used the following mapping function:

    g(S) = |S|.      (3.7)

Then equation (3.6) becomes

    T(k,j) = min_{i ∈ E(k,j)} [ f(k−1,i) + d_i + c_ij ],      (3.8)

where k = |S|. This gives us:

    f(k,j) = T(k,j)  if l_j ≤ T(k,j) ≤ u_j,
    f(k,j) = l_j     if T(k,j) < l_j,
    f(k,j) = ∞       if T(k,j) > u_j,

with the initialization:

    f(1,j) = c(1,j)  if l_j ≤ c_1j ≤ u_j,
    f(1,j) = l_j     if c_1j < l_j,
    f(1,j) = ∞       if c_1j > u_j.

Finally, the optimum solution can be calculated as

    min_{i ∈ E(|N|,1)} [ f(|S'|,i) + d_i + c_i1 ].
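The relaxed recursion with the cardinality mapping g(S) = |S| can be sketched as follows. The fragment takes E(k,j) to be every customer node i ≠ j (the coarsest admissible choice) and ignores the additional condition of the next section; it illustrates the bound computation only, in Python rather than the thesis' Fortran.

    from math import inf

    def relaxed_lower_bound(c, l, u, d):
        n = len(c)                                  # node 0 plays the role of node 1 (the depot)
        f = [[inf] * n for _ in range(n)]
        for j in range(1, n):                       # k = 1: direct paths depot -> j
            t = c[0][j]
            f[1][j] = inf if t > u[j] else max(t, l[j])
        for k in range(2, n):                       # k = |S|: one more customer on the path
            for j in range(1, n):
                t = min((f[k - 1][i] + d[i] + c[i][j]
                         for i in range(1, n) if i != j), default=inf)
                f[k][j] = inf if t > u[j] else max(t, l[j])   # wait until the window opens
        # Close the route: all n-1 customers visited, then return to the depot.
        return min(f[n - 1][i] + d[i] + c[i][0] for i in range(1, n))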

2. Additional Condition

In the previous section we discussed Christofides et al.'s state space relaxation procedure, which provides a lower bound on the TSP by reducing the state space in the dynamic programming. This lower bound is effective in branch and bound only if it is a tight bound. This is similar to the case in integer programming, where the effectiveness of the Lagrangean relaxation in producing bounds is relative to the integer programming formulation. A redundant state-space condition can be helpful for getting a better bound. For this purpose, an additional condition was used by Christofides et al. to avoid loops formed by three consecutive nodes [Ref. 5]. This can be done in the following way.

Let f(k,j,1) be the duration of the least time path from the initial state to state (k,j) without loops formed by three consecutive nodes. Let f(k,j,2) be the duration of the second least time path from the initial state to state (k,j) without loops formed by three consecutive nodes. Let p(k,j,m) be the predecessor of j on the path corresponding to f(k,j,m). With the above definitions, recursion (3.8) becomes:

    T(k,j,1) = min_{i ∈ E(k,j)} [ f(k−1,i,m) + d_i + c_ij ],      (3.9)

where m = 1 if p(k−1,i,1) ≠ j, and m = 2 otherwise. This gives us:

    f(k,j,1) = T(k,j,1)  if l_j ≤ T(k,j,1) ≤ u_j,      (3.10)
    f(k,j,1) = l_j       if T(k,j,1) < l_j,
    f(k,j,1) = ∞         if T(k,j,1) > u_j.

The recursion for f(k,j,2) can be written in the following way:

    T(k,j,2) = min_{i ∈ E(k,j), i ≠ p(k,j,1)} [ f(k−1,i,m) + d_i + c_ij ],      (3.11)

where m = 1 if p(k−1,i,1) ≠ j, and m = 2 otherwise. This gives us:

    f(k,j,2) = T(k,j,2)  if l_j ≤ T(k,j,2) ≤ u_j,      (3.12)
    f(k,j,2) = l_j       if T(k,j,2) < l_j,
    f(k,j,2) = ∞         if T(k,j,2) > u_j.

The initialization is

    f(1,i,1) = c(1,i)  if l_i ≤ c_1i ≤ u_i,      (3.13)
    f(1,i,1) = l_i     if c_1i < l_i,
    f(1,i,1) = ∞       if c_1i > u_i,

and

    f(1,i,2) = ∞.      (3.14)

Finally, the optimum solution can be calculated as

    min_{i ∈ E(|N|,1)} [ f(|S'|,i,1) + d_i + c_i1 ].      (3.15)

Since the additional condition can avoid consideration of a useful lower bound, we consider f(k−1,i,2) in recursions (3.9) and (3.11) only when the predecessor on the path corresponding to f(k−1,i,1) is j. If we do not consider the second least time path in the case p(k−1,i,1) = j, then f(g(S),j) does not guarantee a lower bound on f(S,j).

For this example, let's consider a four node TSP with time constraints. Node A is the starting node, and node D is the time free node. The lower bound of node B is 9 and the upper bound of node B is 11; the lower bound of node C is 19 and the upper bound of node C is 21. Suppose the service time at each node is zero. Figure 3.7 shows an optimal route for this problem.

Figure 3.7: Optimal Route of Four Nodes Problem.

From equation (3.13) we can get:

    f(1,B,1) = 10,   f(1,C,1) = 19,   f(1,D,1) = 7.07.

Now, applying equation (3.9) recursively with m = 1, for k = 2 we can get:

    f(2,B,1) = ∞,   f(2,C,1) = 19,   f(2,D,1) = 17.07.

Similarly, for k = 3:

    f(3,B,1) = ∞,   f(3,C,1) = ∞,   f(3,D,1) = ∞.

We can see easily that f(3,D,1) is not a lower bound of f({B,C,D},D).
3. Branch and Bound Procedure

In this section we introduce the branch and bound enumeration which is used to eliminate subtours in the solution of the state space relaxation procedure. Since the state space relaxation procedure is a relaxation of the TSP with time constraints, the solution to the state space relaxation procedure provides a lower bound on the optimal value of the TSP with time constraints. Any heuristic solution can provide an upper bound. We define some notation to explain this algorithm as follows.

FLBD    = The lower bound, which is the optimal solution to the state space relaxation procedure, on the optimal solution to the TSP with time constraints given the restrictions at the current node.
Z       = Current upper bound.
STACK   = Array which represents the decision tree. It contains the arc lists which have the same head node in the optimal tour of the state space relaxation procedure given the restrictions at the current node.
[c'_ij] = Travel time matrix given the restrictions at the current node.

There are two types of tree search.  One is depth-first search, the other is breadth-first search [Ref. 24].  We used depth-first search since breadth-first search requires substantially more storage.  Depth-first search simply means that when a separation is defined, one of the nodes created by the separation is immediately selected to be the next subproblem, and when the node is fathomed, the enumeration always backtracks to the most recently created live node.
One of the most important requirements of any branch and bound algorithm is tight bounds.  The closer the bounds are to the optimal solution, the fewer nodes must be enumerated.  We used the SCCO heuristic, which was described in section B.2, as an initial upper bound.  The lower bound is obtained from equation (3.15).  To save computing time we need a criterion to decide whether or not the branching should be continued.  If FLBD is greater than Z, then the node is fathomed, since explicit enumeration need not be extended below the current node.

For branching we consider the arcs which have the same head node in the directed graph, since each arc in the TSP solution must have a different head.  If there is no such arc, then that solution is a feasible solution.  After all nodes of the tree are fathomed, a feasible solution which has the same value as the upper bound is an exact solution to the TSP with time window constraints.


The following branch and bound algorithm is used in the programs written for the exact solution.

Algorithm:  Branch and Bound Procedure

Input:   Total travel time of heuristic, travel time matrix.
Output:  Ordered list of tour, total travel time.

Step 1.  (Initialization)
         Let Z = the optimal solution of SCCO, STACK = empty, and [c'_ij] = [c_ij].
Step 2.  Compute FLBD given restrictions defined by [c'_ij].
         If FLBD > Z, go to step 5.
Step 3.  (Construct the tree)
         Put all arcs (i,j) which have the same head in the directed graph on STACK.
         If there is no such arc, save the feasible route, update Z, and go to step 5.
Step 4.  Let the travel time of the arc (i,j) which is on the top of STACK be infinite
         (i.e., c'_ij = infinity), then go to step 2.
Step 5.  (Backtrack)
         If STACK = empty, go to step 7.
Step 6.  If the travel time of the arc (i,j) which is on the top of STACK is finite,
         let the travel time of that arc (i,j) be infinite (i.e., c'_ij = infinity),
         then go to step 2.
         Otherwise, let the travel time of that arc (i,j) be its original travel time
         (i.e., c'_ij = c_ij), remove that arc (i,j) from the top of STACK, and go to step 5.
Step 7.  (Termination)
         If there is a feasible route, then the optimal travel time = Z.
         Otherwise, there is no feasible solution.

End of algorithm Branch and Bound Procedure
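The following is a simplified, recursive rendering of the depth-first scheme above, given as a sketch only.  Here relax(c_mod) stands for the state space relaxation routine of the previous section and is assumed to return the lower bound FLBD together with the arcs of the relaxed optimal tour; the function and data names are assumptions of this sketch, and the thesis programs keep the search tree in an explicit STACK rather than the call stack.

    import math
    from collections import defaultdict

    def branch_and_bound(c, relax, z_heuristic):
        best = {"Z": z_heuristic, "tour": None}

        def search(c_mod):
            flbd, arcs = relax(c_mod)
            if flbd >= best["Z"]:
                return                                  # fathom by the bound test
            by_head = defaultdict(list)
            for (i, j) in arcs:
                by_head[j].append((i, j))
            conflict = [a for a in by_head.values() if len(a) > 1]
            if not conflict:
                best["Z"], best["tour"] = flbd, arcs    # feasible: update incumbent
                return
            # branch on a set of arcs sharing a head: forbid them one at a time
            for (i, j) in conflict[0]:
                forbidden = [row[:] for row in c_mod]
                forbidden[i][j] = math.inf              # c'_ij = infinity
                search(forbidden)

        search([row[:] for row in c])
        return best["Z"], best["tour"]

Since any feasible tour can use at most one arc entering a given node, forbidding each of the conflicting arcs in turn covers every possibility, which is the same separation the STACK-based steps above perform.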

We present the results of our computational experience with the algorithms of this Chapter in Chapter V.

IV.   THE TSP WITH SOFT TIME WINDOW CONSTRAINTS

A.   INTRODUCTION
The second time-constrained TSP we consider is the case in which both late and early arrivals are allowed by paying a penalty cost.  The penalties are allowed to be different for early and late arrivals.  The penalty cost is calculated as follows:

    Upper penalty cost = upper penalty constant x max{ 0, arrival time - upper bound },
    Lower penalty cost = lower penalty constant x max{ 0, lower bound - arrival time }.

In fact, the upper penalty constant is greater than the lower penalty constant in most cases.  Figure 4.1 may be helpful in understanding this case.
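As a small sketch of the calculation just defined, the following Python function evaluates the penalty at one node.  The multiplicative penalty constants CL and CU are parameters; the default values 2 and 5 match the factors used in the computational experiments of Chapter V, and everything else in the sketch is an illustrative assumption.

    def penalty_cost(arrival, lower, upper, cl=2.0, cu=5.0):
        """Soft time window penalty for a single node."""
        early = max(0.0, lower - arrival)      # lower penalty cost
        late = max(0.0, arrival - upper)       # upper penalty cost
        return cl * early + cu * late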

This approach makes every problem feasible, no matter what the time windows are, i.e., even if it is infeasible in the hard time window case.  This reflects a practical point of view, especially when it is possible to save a great deal of mileage by allowing a small amount of time window violation.  In this Chapter, we considered one unit of cost to be the same as one unit of time.  In real world problems, it is possible to get a cost by multiplying traveling time by some constant.


We use the notation lp_k and up_k for the lower and upper penalty cost at node k.

Figure 4.1   Diagram for Soft Time Window Case.  (A lower penalty cost is incurred for an early arrival before the lower bound of the time window for city i, and an upper penalty cost is incurred for a late arrival after the upper bound.)

B.   HEURISTIC SOLUTION TECHNIQUES FOR SOFT TIME WINDOWS

1.   Nearest Neighbor

This heuristic is similar to the hard time window version, except that it takes into account any penalty cost that might be necessary.

Algorithm:  Nearest Neighbor

Input:   Number of nodes, x and y co-ordinates of all nodes, time windows for all nodes.
Output:  Ordered list of tour, total cost.

Step 1.  (Initialization)
         Start at the depot.  Let i = depot, N' = {i}, cost_i = 0.
Step 2.  Compute ARRVT_k for all nodes k in N - N':
             ARRVT_k = ARRVT_i + d_i + c_ik,
             cost_k  = cost_i + d_i + c_ik.
Step 3.  If ARRVT_k < l_k, then cost_k = cost_k + lp_k.
         If ARRVT_k > u_k, then cost_k = cost_k + up_k.
Step 4.  (Nearest Neighbor Selection)
         Select the node k in N - N' such that cost_k is a minimum, i.e., find
         k = argmin_j { cost_j : j in N - N' }.
Step 5.  (Insertion)
         Insert k after i, add k to the subtour N', and let i = k.
Step 6.  If N' = N, then go to the next step.  Otherwise, go to step 2.
Step 7.  Compute the total cost, then stop.
             Total cost = cost_k + d_k + c_{k,depot}.

End of algorithm Nearest Neighbor

This solution is constructed by starting at the depot and moving to the nearest neighboring customer that has not yet been visited.  The term "nearest" is modified in the sense that we add a penalty cost to the travel time if the time window for city i is violated.
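The following is a minimal sketch of this heuristic in Python.  The data layout (c, d, l, u with node 0 as the depot) and the penalty functions lp and up are assumptions made for illustration only.

    def nearest_neighbor_soft(c, d, l, u, lp, up):
        n = len(c)
        tour, i = [0], 0
        arrive, total = 0.0, 0.0
        unvisited = set(range(1, n))
        while unvisited:
            best_k, best_cost, best_arrive = None, None, None
            for k in unvisited:
                t = arrive + d[i] + c[i][k]              # step 2: arrival time at k
                cost = total + d[i] + c[i][k]
                if t < l[k]:
                    cost += lp(l[k] - t)                 # step 3: early penalty
                elif t > u[k]:
                    cost += up(t - u[k])                 # step 3: late penalty
                if best_cost is None or cost < best_cost:
                    best_k, best_cost, best_arrive = k, cost, t
            tour.append(best_k)                          # steps 4-5: select and insert
            unvisited.remove(best_k)
            i, arrive, total = best_k, best_arrive, best_cost
        total += d[i] + c[i][0]                          # step 7: return to the depot
        tour.append(0)
        return tour, total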


2.   SCCO

This algorithm is also designed for the case when there is a combination of tight time window nodes and time free nodes.  The strict observance of the upper bound in the hard time window case is replaced by a penalty cost.

Algorithm:  SCCO

Input:   Number of nodes, x and y co-ordinates of all nodes, time windows for all nodes.
Output:  Ordered list of tour, total cost.

Step 1.  (Initialization)
         Start at the depot.  Let i = depot, N' = {i}, cost_i = 0.
Step 2.  Set k = argmin_j { u_j : j in N - N' }.
         If k is a time free node, then set k = depot.
Step 3.  Insert node k in the subtour N'.  Compute ARRVT_k:
             ARRVT_k = ARRVT_i + d_i + c_ik.
Step 4.  Insert time free nodes j in N - N' between nodes i and k by cheapest insertion
         and cheapest selection (same as CCCO), as long as ARRVT_k does not exceed u_k.
Step 5.  Update cost_k:
             cost_k = cost_i + d_i + c_ik.
         If ARRVT_k < l_k, then cost_k = cost_k + lp_k.
         If ARRVT_k > u_k, then cost_k = cost_k + up_k.
Step 6.  Let i = k.  If N' = N, then go to the next step.  Otherwise, go to step 2.
Step 7.  Apply the Modified Or-opt procedure to the current tour.  Stop when no further
         improvements can be found.

End of algorithm SCCO

This procedure is also similar to the cheapest selection, cheapest insertion method for the unconstrained TSP, except that the nodes with time windows are treated differently from the time free nodes.  The nodes with time windows are inserted in order of increasing upper time window bounds.  The time free nodes are inserted between those nodes by cheapest selection and cheapest insertion, for as long as the upper bounds of the time windows will allow.  In the end, Modified Or-opt is used to improve the solution.  This procedure considers only those exchanges that would result in a node being inserted between two other nodes in the current tour.
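The kind of exchange examined by the Modified Or-opt step can be sketched as follows: remove one node from the tour and try re-inserting it between every other pair of consecutive nodes, keeping the move only if the tour cost decreases.  The tour_cost function (which would include the soft time window penalties) and the list representation of the tour are assumptions of this sketch.

    def modified_oropt(tour, tour_cost):
        """tour: node list starting and ending at the depot."""
        improved = True
        while improved:
            improved = False
            base = tour_cost(tour)
            for pos in range(1, len(tour) - 1):          # never move the depot
                node = tour[pos]
                rest = tour[:pos] + tour[pos + 1:]
                for ins in range(1, len(rest)):
                    cand = rest[:ins] + [node] + rest[ins:]
                    if cand != tour and tour_cost(cand) < base:
                        tour, improved = cand, True      # accept the improving move
                        break
                if improved:
                    break
        return tour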


3.   SCAO

This algorithm is also designed for a node set which is composed of some tight time window nodes and some time free nodes.


Algorithm:  SCAO

Input:   Number of nodes, x and y co-ordinates of all nodes, time windows for all nodes.
Output:  Ordered list of tour, total cost.

Step 1.  (Initialization)
         Start at the depot.  Let i = depot, N' = {i}, cost_i = 0.
Step 2.  Set k = argmin_j { u_j : j in N - N' }.
         If k is a time free node, then set k = depot.
Step 3.  Insert node k in the subtour N'.  Compute ARRVT_k:
             ARRVT_k = ARRVT_i + d_i + c_ik.
Step 4.  Insert time free nodes j in N - N' between nodes i and k by cheapest insertion
         and greatest angle selection (same as CCAO), as long as ARRVT_k does not exceed u_k.
Step 5.  Update cost_k:
             cost_k = cost_i + d_i + c_ik.
         If ARRVT_k < l_k, then cost_k = cost_k + lp_k.
         If ARRVT_k > u_k, then cost_k = cost_k + up_k.
Step 6.  Let i = k.  If N' = N, then go to the next step.  Otherwise, go to step 2.
Step 7.  Apply the Modified Or-opt procedure to the current tour.  Stop when no further
         improvements can be found.

End of algorithm SCAO

This algorithm is the same as SCCO except for a greatest angle selection instead of the cheapest selection in SCCO.
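A small sketch of a greatest-angle criterion of the kind used by CCAO and SCAO is given below: among the free nodes, pick the one that subtends the largest angle at the edge (i,k) into which it would be inserted.  Both the coordinate layout and this particular geometric reading of the criterion are assumptions made for illustration; the thesis does not restate the CCAO definition here.

    import math

    def angle_at(p, a, b):
        """Angle (radians) at point p formed by the segments p-a and p-b."""
        ax, ay = a[0] - p[0], a[1] - p[1]
        bx, by = b[0] - p[0], b[1] - p[1]
        dot = ax * bx + ay * by
        na, nb = math.hypot(ax, ay), math.hypot(bx, by)
        return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

    def greatest_angle_node(free_nodes, i, k, xy):
        """Return the free node j maximizing the angle i-j-k."""
        return max(free_nodes, key=lambda j: angle_at(xy[j], xy[i], xy[k]))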

C.   EXACT SOLUTION TECHNIQUES FOR SOFT TIME WINDOWS

1.   State-Space Relaxation Procedure


In this section we describe a state space relaxation procedure, which is adapted from Christofides et al. [Ref. 5], for soft time windows.  They only considered the TSP with hard time windows and without time windows.  The differences are as follows.  The waiting cost is replaced by a penalty cost to be paid in the early arrival case.  Late arrival is allowed, but a penalty cost has to be paid.  So we have to calculate the duration and the penalty cost on each possible path to decide the least cost path in each stage.  We denote the penalty cost on each possible path as PC in this section.

Consider the TSP defined on the graph G = {N,A} with soft time window constraints.  Let S' be the set of all nodes except the starting node.  Let S be a subset of S'.  Let f(S,j) be the cost of the least cost path starting at node 1, passing through every node of S, and finishing at node j.  Let T(S,j) be the total duration of the path corresponding to f(S,j).  Let p(S,j) be the predecessor of j on the path corresponding to f(S,j).  Let lp(t) be the early arrival penalty cost function and up(t) be the late arrival penalty cost function.  For given S and j, the total duration of a path can be calculated as

    T(S,j) = T(S-{j},i) + d_i + c_ij,                                    (4.1)

where i = p(S,j).

In equation (4.1) the total duration of the least cost path passing through the nodes in the set S and ending in node j is described as the sum of three terms: the first is the total duration of the least cost path passing through the nodes in the set S-{j} and ending in node i, the second is the time required to be spent in node i, and the third is the travel time from node i to node j.  The dynamic programming recursion to determine the least cost path may then be stated as

    f(S,j) = min { f(S-{j},i) + d_i + c_ij + PC : i in S-{j} },          (4.2)

where  T1 = T(S-{j},i) + d_i + c_ij,
       PC = 0,              if l_j <= T1 <= u_j,
          = lp(l_j - T1),   if T1 < l_j,
          = up(T1 - u_j),   if T1 > u_j,

with the initialization:

    f({j},j) = c_1j,                      if l_j <= c_1j <= u_j,
             = c_1j + lp(l_j - c_1j),     if c_1j < l_j,
             = c_1j + up(c_1j - u_j),     if c_1j > u_j.

Finally, the optimum solution can be calculated as

    min { f(S',i) + d_i + c_i1 : i in S' }.
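A minimal sketch of the exact dynamic program (4.2), enumerating subsets of the customers with bitmasks, is given below.  The data layout (c, d, l, u with node 0 as the depot) and the penalty functions lp and up are assumptions of the sketch.  Its storage grows as O(2^(n-1) * n), which is exactly why the state space relaxation discussed next is needed for anything but very small problems.

    import math
    from itertools import combinations

    def soft_window_tsp(c, d, l, u, lp, up):
        n = len(c)
        customers = list(range(1, n))

        def arrive_cost(t, j, base):
            # add the PC term for reaching node j at duration t
            if t < l[j]:
                return base + lp(l[j] - t)
            if t > u[j]:
                return base + up(t - u[j])
            return base

        f, T = {}, {}
        for j in customers:                                # initialization of (4.2)
            S = 1 << j
            T[(S, j)] = c[0][j]
            f[(S, j)] = arrive_cost(c[0][j], j, c[0][j])

        for size in range(2, n):
            for subset in combinations(customers, size):
                S = sum(1 << j for j in subset)
                for j in subset:
                    best_f, best_T = math.inf, math.inf
                    for i in subset:
                        if i == j:
                            continue
                        prev = (S ^ (1 << j), i)           # state (S-{j}, i)
                        t = T[prev] + d[i] + c[i][j]
                        val = arrive_cost(t, j, f[prev] + d[i] + c[i][j])
                        if val < best_f:
                            best_f, best_T = val, t        # T follows the cost-minimizer
                    f[(S, j)], T[(S, j)] = best_f, best_T

        full = sum(1 << j for j in customers)
        return min(f[(full, i)] + d[i] + c[i][0] for i in customers)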

Since the computer storage requirements increase exponentially with the size of the problem, this method is limited to small problems.  For relaxing this limitation, a state space relaxation procedure can be used, the same as in Chapter III.  Consider the dynamic programming formulation (4.2).  The state variable in that formulation is (S,j), and the stage is the cardinality of S.  Let g(S) be a mapping from the domain of (S,j) to some other vector space (g(S),j).  Let:

    H(g(S),j) = { (g(S-{j}),i) : i in S-{j} }.                           (4.3)

Since we are interested in lower bounds to the TSP with time constraints, H(g(S),j) may be replaced by any larger set that is easier to compute.  Thus, H(g(S),j) can be defined by the following equation:

    H(g(S),j) = { (g(S-{j}),i) : i in E(g(S),j) },                       (4.4)

where E(g(S),j) contains S-{j}.  For calculating the lower bound of the problem, recursion (4.1) can be changed to the following equation:

    T(g(S),j) = T(g(S-{j}),i) + d_i + c_ij,                              (4.5)

where i = p(g(S),j).  Recursion (4.2) may be stated as

    f(g(S),j) = min { f(g(S-{j}),i) + d_i + c_ij + PC : (g(S-{j}),i) in H(g(S),j) }    (4.6)

              = min { f(g(S-{j}),i) + d_i + c_ij + PC : i in E(g(S),j) },              (4.7)

where  T1 = T(g(S-{j}),i) + d_i + c_ij,
       PC = 0,              if l_j <= T1 <= u_j,
          = lp(l_j - T1),   if T1 < l_j,
          = up(T1 - u_j),   if T1 > u_j,

with the initialization:

    f(g({j}),j) = c_1j,                      if l_j <= c_1j <= u_j,
                = c_1j + lp(l_j - c_1j),     if c_1j < l_j,
                = c_1j + up(c_1j - u_j),     if c_1j > u_j.

Finally, the optimum solution can be calculated as

    min { f(g(S'),i) + d_i + c_i1 : i in E(g(S'),1) }.

The mapping g can be selected from any separable function.  We used the mapping function (3.7), which is proposed by Christofides et al., the same as in Chapter III.  Then equation (4.5) becomes:

    T(|S|,j) = T(|S|-1,i) + d_i + c_ij,                                  (4.8)

where i = p(|S|,j).  Recursion (4.7) may be stated as:

    f(|S|,j) = min { f(|S|-1,i) + d_i + c_ij + PC : i in E(|S|,j) },     (4.9)

where  T1 = T(|S|-1,i) + d_i + c_ij,
       PC = 0,              if l_j <= T1 <= u_j,
          = lp(l_j - T1),   if T1 < l_j,
          = up(T1 - u_j),   if T1 > u_j,

with the initialization:

    f(1,j) = c_1j,                      if l_j <= c_1j <= u_j,
           = c_1j + lp(l_j - c_1j),     if c_1j < l_j,
           = c_1j + up(c_1j - u_j),     if c_1j > u_j.

Finally, the optimum solution can be calculated as

    min { f(|S'|,i) + d_i + c_i1 : i in E(|N|,1) }.

2.   Additional Condition

In the previous section, we discussed a state space relaxation procedure which is adapted from Christofides et al. [Ref. 5].  That procedure provides a lower bound on the TSP with soft time constraints.  The additional condition to avoid loops formed by three consecutive nodes [Ref. 5] was used to get a better bound.  This can be done in the following way.

Let k = |S|.  Let f(k,j,1) be the cost of the least cost path from the initial state to state (k,j) without loops formed by three consecutive nodes.  Let f(k,j,2) be the cost of the second least cost path from the initial state to state (k,j) without loops formed by three consecutive nodes.  Let p(k,j,m) be the predecessor of j on the path corresponding to f(k,j,m).  With the above definition, equation (4.8) becomes:

    T(k,j,m') = T(k-1,i,m) + d_i + c_ij,      m' = 1, 2,                 (4.10)

where  i = p(k,j,m'),
       m = 1,  if p(k-1,i,1) is not equal to j,
         = 2,  otherwise,

with the initialization:

    T(1,j,1) = c_1j   and   T(1,j,2) = infinity.

Recursion for f(k,j,1) can be calculated in the following way.  Let:

    T'(k,j,m) = T(k-1,i,m) + d_i + c_ij,      m = 1, 2.

This gives us:

    f(k,j,1) = min { f(k-1,i,m) + d_i + c_ij + PC : i in E(k,j) },       (4.11)

where  PC = 0,                        if l_j <= T'(k,j,m) <= u_j,
          = lp(l_j - T'(k,j,m)),      if T'(k,j,m) < l_j,
          = up(T'(k,j,m) - u_j),      if T'(k,j,m) > u_j,
       m  = 1,  if p(k-1,i,1) is not equal to j,
          = 2,  otherwise,

with the initialization:

    f(1,i,1) = c_1i,                      if l_i <= c_1i <= u_i,         (4.12)
             = c_1i + lp(l_i - c_1i),     if c_1i < l_i,
             = c_1i + up(c_1i - u_i),     if c_1i > u_i.

Recursion for f(k,j,2) can be written in the following way:

    f(k,j,2) = min { f(k-1,i,m) + d_i + c_ij + PC : i in E(k,j), i not equal to p(k,j,1) },   (4.13)

where  PC = 0,                        if l_j <= T'(k,j,m) <= u_j,
          = lp(l_j - T'(k,j,m)),      if T'(k,j,m) < l_j,
          = up(T'(k,j,m) - u_j),      if T'(k,j,m) > u_j,
       m  = 1,  if p(k-1,i,1) is not equal to j,
          = 2,  otherwise,

with the initialization:

    f(1,i,2) = infinity.                                                 (4.14)

Finally, the optimum solution can be calculated as

    min { f(|S'|,i,1) + d_i + c_i1 : i in E(|N|,1) }.                    (4.15)

Since the additional condition can avoid consideration of a useful lower bound, we considered f(k-1,i,2) in recursions (4.11) and (4.13) only when the predecessor on the path corresponding to f(k-1,i,1) is j.  If we do not consider the second least cost path in case of p(k-1,i,1) = j, then f(g(S),j) does not guarantee the lower bound of f(S,j).

For this example, let us again consider the 4-node TSP with time constraints.  Node A is the starting node, and D is the time free node.  The lower bound of node B is 9, the upper bound of node B is 11, the lower bound of node C is 19, and the upper bound of node C is 21.  Suppose the service time at each node is zero, lp(t) = t, and up(t) = 5t.  Figure 3.7 shows an optimal route for this problem.


and

Frcm equation (4.10)

12)

we
f

can get:
= 10,

(1,B,1) = 10, T(1,B,1)


= =

p(1,B,1)

A;

f(1,C,1)
f (1/D,1)

19,

T(1,C,1) = 14.14, p(1,C,1)


=

= A;
=
A.

7.07, T(1,D,1)

7.07, p(1,D,1)
(4.11)

New applying
i=1,

equation (4.10)
we can get:
= min [ 94.7, i e {C , D}
=

and

recursively with

for k=
f

(2,3,1)

29-84

29.84,

T (2,3, 1)

14.14, p(2,B,1)

= D.

Similarly,
f (2,

C,1)

=
=

19,

T(2,C,1) = 14.14,
1(2, D,1)
=

p(2,C,1)

= D;

f(2,D,1)
Fcr
k

17.07,

17.07, p(2,D,1)

= 3.

= 3,
f

(3,B,1)

=
i

min [ e{c}

94.7

= 94.7,
=

T(3,B,1) = 24.14, p(3,B,1)


Similarly,
f

C.

(3,C,1)

39.84, 1(3, C,1)


.

24.14, p(3,C,1)

= D;

f (3,0, 1)

= <*>

We can
f

see

easily tnat
.

f(3,D,1)

is

not a lower

bound of

({B,C,C} ,D)

68

3.   Branch and Bound Procedure

We used the same branch and bound procedure to eliminate subtours in the solution of the state space relaxation procedure as in Chapter III.C.3.  The SCCO heuristic, which was described in section B.2, was used as an initial upper bound, and the lower bound was obtained from equation (4.15).  We present the results of our computational experience with the algorithms of this Chapter in the next Chapter.

V.   COMPUTATIONAL EXPERIENCE

A.   TEST PROBLEMS

Four sets of test data are used in this thesis.  Test problem number [1] is taken from Sedgewick [Ref. 19: p. 309].  The other problems, numbered [2], [3] and [4], are from Appendix 9.1 of Eilon et al.'s text [Ref. 21].  These test problems are shown in Appendices A, B, and C respectively.  These published problems contain node and depot locations, but they do not include time windows.  We constructed time windows for test problems [1], [2], [3] by first using the CCAO heuristic on the unconstrained TSP.  Time windows were then placed about each node such that the CCAO route was feasible.  The idea for generating time windows in this way comes from Baker [Ref. 25], who used the unconstrained Nearest Neighbor heuristic as his starting point instead of CCAO.  The time window widths were set to varying sizes of up to 14 units.  Some of the time windows were fairly tight while others overlapped.  This is in contrast to Baker's work, where all the time windows have width equal to 2 units.
while others
The last

[3],

problem number [4] is the same as test prctlea except that the time windows were constructed from a

Nearest

Neighbor solution
problem,
as

to

the
[

unconstrained
Eef .
25].

traveling
Figure
5.1

salesman

in Baker

displays the CCAO


for the test

solution for test problem

[3] and figure

5.2 illustrates the unconstrained

Nearest Neighbor solution

problem 4], We found a small error in Baker's TSP solution for the Nearest Neighbor [Ret. 25], in that the

nearest node

from node 16

is node

17,

not node

13.

The

resulting cost is actually higher, it is 312.09,

not 310.22.

70
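The construction just described can be sketched as follows: run an unconstrained heuristic (CCAO for problems [1]-[3], Nearest Neighbor for problem [4]), record the arrival time at every node on that route, and place a window of the chosen width around each arrival time so that the route itself stays feasible.  The width choice, the random generator, and the function names below are illustrative assumptions, not the exact values used to build the appendices; in the thesis, a further random draw then decides which nodes keep their windows according to the time-window percentage.

    import random

    def make_windows(route, c, d, widths, seed=0):
        """route: node sequence starting at the depot; returns {node: (l, u)}."""
        rng = random.Random(seed)
        t, windows = 0.0, {}
        for prev, node in zip(route, route[1:]):
            t += d[prev] + c[prev][node]          # arrival time along the heuristic route
            w = rng.choice(widths)                # e.g. widths of up to 14 units
            lo = max(0.0, t - rng.uniform(0.0, w))
            windows[node] = (lo, lo + w)          # the route arrival falls inside [lo, lo+w]
        return windows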

Figure 5.1   Unconstrained Solution Obtained by CCAO.

Each of the four sets of test data was used to create four test problems.  The separate instances differed in the percentages of time window constraints that were chosen to be in effect.  The four cases were 100%, 90%, 75%, and 50%.  We refer to this percentage as the "time-window percentage".  A random number generator was used to decide which nodes would have time windows.  Test problems for the time window constrained TSP are shown in Appendices G through V.

Figure 5.2   Unconstrained Solution Obtained by the Nearest Neighbor Heuristic.

The penalty cost factors can be varied depending on real world problems.  We used 2 and 5 as the lower and upper penalty cost factors.  Also, we set the service time at each node to zero to make it easy to construct the time windows.

The computational results are presented in Tables II and III.  The figures reported represent the results of our test runs for each test case.

B.   COMPUTATIONAL RESULTS

1.   Hard Time Windows


As noted in Chapter III, the Nearest Neighbor heuristic often cannot solve the problem, because the arrival time at the nearest node frequently violates the upper bound.  However, in test problem [4] the results of the Nearest Neighbor are the same as in the unconstrained problem, because this problem itself was constructed by a Nearest Neighbor heuristic.

Generally, the SCCO and SCAO heuristics can be easily applied to the hard time window TSP.  According to our experiments, if the time window width becomes large relative to the travel time between nodes on the optimal unconstrained TSP route, then the lower percentage time window problems become more difficult to satisfy.  This phenomenon can be seen in test problem [1].  That is because the other nodes in the optimal route for the unconstrained TSP problem could be inserted without causing a violation of the upper bound.

The SLACK heuristic takes slightly more time than the other heuristics.  It achieved lower accuracy in test problems [1] and [3] in the 50 percent time window case.  The exact algorithm can find the exact answer in most problems, but when there are fewer windows in effect, it takes more computation time.  It cannot solve the 50 percent time window problems [2] and [4] within 180 seconds.

Table II.   Computational results for the hard time window problems.

2.   Soft Time Windows


All of the methods tested for soft time windows were able to find some answer to every problem within reasonable computing time, except for two instances with the exact algorithm.  In general, with the Nearest Neighbor heuristic, the quality of the solution is not desirable.  As in the hard window case, the lower time window percentage problems have lower solution quality.  On test problem [4], the results of the Nearest Neighbor heuristic coincide with the unconstrained TSP heuristic, because this problem itself was constructed by a Nearest Neighbor.

As in the hard time window problem, SCCO and SCAO generally find an optimal solution, except for one problem with 50 percent time windows.  In test problem [1] with 50 percent time windows, the SCCO and SCAO values were 215.686 and 165.544 respectively.  The exact algorithm could not solve the two test problems with 50 percent time windows within 180 seconds.  The reason is that the solutions of the state space relaxations have many subtours, and it takes a long time to eliminate these subtours.

With both hard time windows and soft time windows, the results are sensitive to the percentage, width and position of the time windows.  In most problems, the fewer time windows there are, the lower the accuracy of the heuristics.

Table III.   Computational results for the soft time window problems.

VI.   CONCLUSIONS AND RECOMMENDATIONS

This thesis has presented some heuristics and exact algorithms for the solution of the traveling salesman problem with time window constraints.  We considered two different kinds of time window constraints: hard time windows and soft time windows.  Hard time windows are inviolable, whereas soft windows may be violated at a cost.

For both hard time windows and soft time windows, we developed some new heuristics, SCCO and SCAO, which are modifications of Stewart's [Ref. 6] unconstrained TSP heuristics CCCO and CCAO.  Also, for the hard time window case only, we developed the SLACK heuristic.  We also developed an exact algorithm for both hard and soft windows using state space relaxation dynamic programming and branch and bound as proposed by Christofides et al. [Ref. 5].

The procedures were shown to be effective on some moderately small sized problems.  A Nearest Neighbor heuristic was also developed, but it was often unable to solve the problem with hard time windows, and it found very low quality solutions with soft time windows.  This experience is consistent with the findings of others [Ref. 7], who determined that the Nearest Neighbor heuristic does not perform well on the unconstrained TSP.

The SCCO and SCAO heuristics are generally effective on most of the small sized problems we tested, except for the problems in which less than half the nodes have time windows.  Further research is needed in order to satisfactorily solve these problems.  Another problem that may require more research is dealing with wider time windows.  The SLACK heuristic, which is used only with hard time windows, is slightly slower than the other heuristics.  Particularly, in the lower time window percentage problems, its accuracy becomes lower.

The exact algorithm succeeded in solving 14 of the 16 test problems to optimality, but it was too slow to use in most of the lower time window percentage problems.  This algorithm's performance also depends upon the quality of the upper bound which is obtained from the heuristic.  Additional research is needed to reduce computation time, but a working program for at least some problems has resulted from this effort.

APPENDIX A
TEST PROBLEM [1]

    (Table of node x, y co-ordinates.)

Depot co-ordinates:  (12,10)
Problem source:  Sedgewick [Ref. 19].

APPENDIX B
TEST PROBLEM [2]

    (Table of node x, y co-ordinates.)

Depot co-ordinates:  (326,181)
Problem source:  Eilon et al. [Ref. 21].

APPENDIX C
TEST PROBLEM [3]

    (Table of node x, y co-ordinates.)

Depot co-ordinates:  (145,215)
Problem source:  Eilon et al. [Ref. 21].

APPENDIX D
TEST PROBLEM [5]

    (Table of node x, y co-ordinates.)

Depot co-ordinates:  (30,40)
Problem source:  Eilon et al. [Ref. 21].

APPENDIX E
TEST PROBLEM [6]

    (Table of node x, y co-ordinates.)

Depot co-ordinates:  (40,40)
Problem source:  Eilon et al. [Ref. 21].

APPENDIX F
TEST PROBLEM FOR THE SCCO

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (12,10)
Problem source:  node locations from Sedgewick [Ref. 19]; time windows: see Chapter V.

APPENDIX G
TEST PROBLEM [1-1]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (12,10);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Sedgewick [Ref. 19]; time windows: see Chapter V.

APPENDIX H
TEST PROBLEM [1-2]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (12,10);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Sedgewick [Ref. 19]; time windows: see Chapter V.

APPENDIX I
TEST PROBLEM [1-3]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (12,10);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Sedgewick [Ref. 19]; time windows: see Chapter V.

APPENDIX J
TEST PROBLEM [1-4]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (12,10);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Sedgewick [Ref. 19]; time windows: see Chapter V.

APPENDIX K
TEST PROBLEM [2-1]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (326,181);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX L
TEST PROBLEM [2-2]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (326,181);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX M
TEST PROBLEM [2-3]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (326,181);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX N
TEST PROBLEM [2-4]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (326,181);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX O
TEST PROBLEM [3-1]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX P
TEST PROBLEM [3-2]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX Q
TEST PROBLEM [3-3]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX R
TEST PROBLEM [3-4]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX S
TEST PROBLEM [4-1]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX T
TEST PROBLEM [4-2]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX U
TEST PROBLEM [4-3]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

APPENDIX V
TEST PROBLEM [4-4]

    (Table of node time windows l(i), u(i).)

Depot co-ordinates:  (145,215);  CL = 2.0,  CU = 5.0
Problem source:  node locations from Eilon et al. [Ref. 21]; time windows: see Chapter V.

LIST OF REFERENCES

1.   Garey, M. R., Graham, R. L., and Johnson, D. S., "Some NP-complete Geometric Problems," Proceedings, 8th ACM Symposium on Theory of Computing, 1976.

2.   Lenstra, J. K. and Rinnooy Kan, A. H. G., "Complexity of Vehicle Routing and Scheduling Problems," Networks, Vol. 11, pp. 221-227, 1981.

3.   Psaraftis, H. N., "A Dynamic Programming Solution to the Single Vehicle Many-to-Many Immediate Request Dial-A-Ride Problem," Transportation Science, Vol. 14, No. 2, pp. 130-154, 1980.

4.   Baker, E. K., "An Exact Algorithm for the Time-Constrained Traveling Salesman Problem," Operations Research, Vol. 31, No. 5, pp. 938-945, September-October 1983.

5.   Christofides, N., Mingozzi, A., and Toth, P., "State-Space Relaxation Procedures for the Computation of Bounds to Routing Problems," Networks, Vol. 11, No. 2, pp. 145-164, 1981.

6.   Stewart, W. R., Stochastic Vehicle Routing, Dissertation.

7.   Golden, B., Bodin, L., Doyle, T., and Stewart, W. R., "Approximate Traveling Salesman Algorithms," Operations Research, Vol. 28, pp. 694-711, 1980.

8.   Rosenkrantz, D. J., Stearns, R. E., and Lewis, P. M., "Approximate Algorithms for the Traveling Salesman Problem," SIAM Journal on Computing, Vol. 6, pp. 563-581, 1977.

9.   Clarke, G. and Wright, J. W., "Scheduling of Vehicles from a Central Depot to a Number of Delivery Points," Operations Research, Vol. 12, pp. 568-581, 1964.

10.  Wiorkowski, J. and McElvain, K., "A Rapid Heuristic Algorithm for the Approximate Solution of the Traveling Salesman Problem," Transportation Research, Vol. 9, pp. 181-185, 1975.

11.  Or, I., Traveling Salesman-Type Combinatorial Problems and Their Relation to the Logistics of Blood Banking, Ph.D. Thesis, Department of Industrial Engineering and Management Sciences, Northwestern University, 1976.

12.  Stewart, W. R., "A Computationally Efficient Heuristic for the Traveling Salesman Problem," Proceedings, Thirteenth Annual Meeting of Southeastern TIMS, Myrtle Beach, S.C., pp. 75-85, 1977.

13.  Norback, J. P. and Love, R. F., "Heuristic for the Hamiltonian Path Problem in Euclidean Two Space," Operations Research, Vol. 30, pp. 363-368, 1979.

14.  Hardgrave, W. W. and Nemhauser, G. L., "On the Relation Between the Traveling Salesman Problem and the Longest Path Problem," Operations Research, Vol. 10, pp. 647-657, 1962.

15.  Lin, S., "Computer Solutions of the Traveling Salesman Problem," Bell System Technical Journal, Vol. 44, pp. 2245-2269, 1965.

16.  Lin, S. and Kernighan, B., "An Effective Heuristic Algorithm for the Traveling Salesman Problem," Operations Research, Vol. 21, pp. 498-516, 1973.

17.  Golden, B., "A Statistical Approach to the TSP," Networks, Vol. 7, pp. 209-225, 1977.

18.  Norback, J. P. and Love, R. F., "Geometric Approaches to Solving the Traveling Salesman Problem," Management Science, Vol. 23, pp. 1208-1223, 1977.

19.  Sedgewick, R., Algorithms, Addison-Wesley, Menlo Park, California, pp. 307-332, 1983.

20.  Aho, A. V., Hopcroft, J. E., and Ullman, J. D., The Design and Analysis of Computer Algorithms, Addison-Wesley, Menlo Park, California, pp. 47-92, June 1974.

21.  Eilon, S., Watson-Gandy, C., and Christofides, N., Distribution Management, Griffin Press, London, pp. 113-119, 1971.

22.  Fisher, M. L., "The Lagrangean Relaxation Method for Solving Integer Programming Problems," Management Science, Vol. 27, No. 1, pp. 1-17, January 1981.

23.  Garfinkel, R. S. and Nemhauser, G. L., Integer Programming, John Wiley, New York, pp. 103-152, 1972.

24.  Christofides, N., Graph Theory, Academic Press, New York, 1975.

25.  Baker, E. K., "Vehicle Routing with Time Window Constraints," The Logistics and Transportation Review, Vol. 18, No. 4, pp. 383-401, 1982.

INITIAL DISTRIBUTION LIST

1.   Defense Technical Information Center (2 copies)
     Cameron Station
     Alexandria, VA 22304-6145

2.   Library, Code 0142
     Naval Postgraduate School
     Monterey, CA 93943-5100

3.   Department Chairman, Code 55
     Department of Operations Research
     Naval Postgraduate School
     Monterey, CA 93943-5100

4.   Professor Richard E. Rosenthal, Code 55Rl
     Department of Operations Research
     Naval Postgraduate School
     Monterey, CA 93943-5100

5.   Professor James K. Hartman, Code 55Hl
     Department of Operations Research
     Naval Postgraduate School
     Monterey, CA 93943-5100

6.   Library, P.O. Box 77
     Gong Neung Dong, Dobong-ku
     Seoul 130-09, Korea

7.   Library, Air Force Academy
     Dae Bang Dong, Dongjak-ku
     Seoul 151-01, Korea

8.   Air Force Library, P.O. Box 6
     Sin Dae Bang Dong, Dongjak-ku
     Seoul 151-01, Korea

9.   Major Chun, Bock Jin
     384-01 Sun Hwa 1 Dong, Chung-Ku
     Dae Jeon, Choong Nam 300-00, Korea

10.  Major Lee, Sang Heon
     248-16, 19 Tong 2 Ban, Kaneung-1 Dong
     Euijeongbu-si, Kyoungki 130-30, Korea

11.  Chow Kay Cheong
     Apt Block 291A, Jurong East Street 21
     #12-583, Singapore (0140)

12.  Major Win, Byung Ho
     1066 Hamilton Ave. #B
     Seaside, CA 93955
