
Combinatorial Optimization

Dr. R.A. Sitters (SBE) - coordinator


Prof. dr. J.A.S. Gromicho (SBE)

meetings outside the course times are by appointment only


Rene Sitters

Dr. René Sitters is the coordinator of the


Combinatorial Optimization course.

Specialized in Combinatorial Optimization


problems: algorithms and complexity.

Winner of:
• Gijs de Leve prize (for the best PhD
dissertation on the mathematics of OR)
• Veni Innovational Research NWO

Combinatorial Optimization 05-02-2018 2


Joaquim Gromicho

Professor of Applied Optimization in Operations


Research at the SBE.

Scientific & Education Officer at ORTEC, an


international company specialized in mathematical
optimization.

Editor in Chief of STAtOR, the ‘glossy’ of the


Netherlands Society of Statistics and Operations
Research.

Member of the coordinating committee of the EURO


working group on the Practice of OR.

Combinatorial Optimization 05-02-2018 3


Today’s course
Exposition of the structure of the course and the grading.
All this, and more, can be found in the course manual (on Canvas).
Description of the bibliography.
Introduction to combinatorial optimization with the focus on theorems, proofs
and algorithms.
Today’s course considers only graph problems, therefore a gentle introduction
to graphs is also given.
Important learning objectives:
• Proving optimality of algorithms and analyzing their computational effort.
• Understanding that small differences in related algorithms influence their
computational effort.
• Understanding that implementation details influence computational effort.

05-02-2018 Combinatorial Optimization 4


First part: theory

05-02-2018 Combinatorial Optimization 5


Second part: hands-on case

05-02-2018 Combinatorial Optimization 6


Course bibliography: lectures
Course notes by René Sitters. These are all relevant! They will be
made available week by week.

Book ‘Algorithms’ by Dasgupta, Papadimitriou, and Vazirani


ISBN-13: 978-0073523408, last free version can be found on
Blackboard

Book ‘Scheduling Theory, Algorithms, and Systems’ by


Pinedo, ISBN-13: 978-0130281388; instructions on how to
access it online via the VU library are on Blackboard.

Course notes on routing by Daniele Vigo and Joaquim


Gromicho. Will be available toward the end of the first block.
05-02-2018 Combinatorial Optimization 7
The main book

Algorithms

Sanjoy Dasgupta, Christos Papadimitriou &


Umesh Vazirani

McGraw-Hill Education
336 pages
September 13, 2006
ISBN: 0073523402

Price: usually less than € 50


Was free online for a long time and is still easy to find

Combinatorial Optimization 05-02-2018 8


Optional: a classical
book

Combinatorial Optimization: Algorithms and


Complexity

Christos H. Papadimitriou
&
Kenneth Steiglitz

Dover Publications
512 pages
July 7, 1998
ISBN: 0486402584

Price: usually less than € 20

Combinatorial Optimization 05-02-2018 9


Period 4
Days                      theme                                          lecture    tutorial
February 5 and 9          Introduction, graphs, paths and trees, flows   Gromicho   Sitters
February 12 and 17        Complexity classes and reductions              Gromicho   Sitters
February 19 and 24        Approximation algorithms                       Sitters    Sitters
February 26 and March 3   Scheduling                                     Sitters    Sitters
March 5 and 10            Dynamic Programming                            Gromicho   Sitters
March 12 and 17           Routing                                        Gromicho   Sitters
March 19                  Routing                                        Gromicho   (No tutorial)

The exam is at the end of period 4!


March 29, 12:00-14:45
The roster wrongly shows an exam on June 1
05-02-2018 Combinatorial Optimization 10
Block 5
Days      theme                          lecturer             type
April 3   Introducing the case           Gromicho, Sitters    Lecture
April 10  Q&A                            Gromicho             Tutorial
April 17  Advice                         Gromicho             Tutorial
April 24  Advice                         Sitters              Tutorial
May 1     Advice                         Sitters              Tutorial
May 9     Advice                         Gromicho             Tutorial
May 15    Advice                         Gromicho             Tutorial
May 22    Presentations and discussion   Gromicho, Sitters    Assessment

05-02-2018 Combinatorial Optimization 11


Grades: exam + case

Minimum 5 for each part.


The resit is only for the exam!
Case is compulsory!!!
05-02-2018 Combinatorial Optimization 12
Algorithms by Dasgupta et al
Chapter 0, Running time

Sections 4.1 - 4.4 Dijkstra's algorithm

Sections 5.1.1 - 5.1.3 Minimum Spanning Tree

Chapter 6 Dynamic Programming

Sections 7.1-7.4 Linear programming and Flows

Chapter 8 NP-completeness

Sections 9.2 - 9.3


05-02-2018 Combinatorial Optimization 13
Scheduling Theory, Algorithms, and Systems
by Pinedo
Chapters 1 to 5

05-02-2018 Combinatorial Optimization 14


Course bibliography: tutorials
The answers to all exercises will be provided.

Some extra material will be used and placed on


blackboard.

Tutorial exercises are part of the exam material.

Questions from the slides are intended to motivate


discussion and reflection.

05-02-2018 Combinatorial Optimization 15


Today’s course
Main source: notes on Graphs by René Sitters

Additionally, from the book Algorithms:


• Preliminaries from chapter 0
• Paths in graphs including Dijkstra in chapter 4

05-02-2018 Combinatorial Optimization 16


What is combinatorial optimization?
Combinatorial optimization is ‘just’ choosing the best
(optimum) from a finite number of alternatives.

This seems easier than it really is, since finite does not mean
that all alternatives can be examined in practice.

It is common to confuse combinatorial optimization with


integer programming, especially with binary variables

A stronger relation exists between combinatorial


optimization and combinatorial decision problems: does
a specific type of solution exist?
05-02-2018 Combinatorial Optimization 17
The focus of the course
Highlighting some areas such as graphs, networks,
routing, packing, scheduling, …
Identifying important problems in those areas
Distinguishing ‘easy’ from ‘difficult’ problems
Being able to prove that a ‘difficult’ problem is indeed
difficult
Presenting mathematical models for these problems
Studying the important properties of these models
Explaining solution procedures for each problem,
with emphasis on computational efficiency

05-02-2018 Combinatorial Optimization 18


How difficult is a combinatorial
optimization problem?
We will learn that combinatorial optimization problems
may differ much in the effort needed to solve them.

We will understand that there are mainly two types of


combinatorial optimization problems:
- Those that we can solve efficiently
- Those that are as hard to solve as the hardest
combinatorial optimization problems

05-02-2018 Combinatorial Optimization 19


Typical structure of a ‘topic’
A problem description
A proof of hardness in case the problem is indeed hard
Algorithms, including analysis of the effort
• Exact (optimal)
• Heuristic (sometimes approximation) if the problem
is hard

05-02-2018 Combinatorial Optimization 20


Effort of algorithms
Running time is not a property of the problem; it is a
characteristic of algorithms and their implementation
details!

Effort is measured by counting ‘elementary operations’


as a function of the ‘input size’.

These concepts will be defined in detail later; what is
important for now is understanding that differences between
computers or languages should not matter.

We start with a simple example.


05-02-2018 Combinatorial Optimization 21
Computing Prefix Averages
Let us illustrate computational effort analysis with two algorithms
for prefix averages.

The i-th prefix average of an array X is the average of the first i
elements of X:

A_i = (X_1 + X_2 + … + X_i) / i

Computing the array A of prefix averages of another array X has
applications to financial analysis.

[Figure: bar chart comparing an array X with its prefix averages A for i = 1, …, 7.]

05-02-2018 Combinatorial Optimization 22


Prefix Averages version 1
The following implementation computes prefix averages by applying the
definition

function A = PrefixAverages1( X )
% Input array X of n integers
% Output array A of prefix averages of X
% operations
n = length(X); % O(1)
A = zeros(1,n); % O(n)
for i=1:n % O(n)
s = 0; % O(n)
for j=1:i % O(1+2+3+ ... + n) <- highest
s = s + X(j); % O(1+2+3+ ... + n) <- highest
end
A(i) = s / i; % O(n)
end

05-02-2018 Combinatorial Optimization 23


Prefix Averages version 2
The following implementation computes prefix averages in less time
by keeping a running sum
function A = PrefixAverages2( X )
% Input array X of n integers
% Output array A of prefix averages of X
% operations

n = length(X); % O(1)
A = zeros(1,n); % O(n)
s = 0; % O(1)
for i=1:n % O(n) <- highest
s = s + X(i); % O(n) <- highest
A(i) = s / i; % O(n) <- highest
end

05-02-2018 Combinatorial Optimization 24
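A quick check (my own example, assuming the two functions above are saved on the MATLAB path):

X = [1 2 3 4];
A = PrefixAverages2(X)   % returns [1 1.5 2 2.5], the same values as PrefixAverages1(X)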


How does the effort change with 𝑛?
The running time of PrefixAverages1 is dominated by 1 + 2 + … + n.

The sum of the first n integers is n(n + 1)/2.

The running time of PrefixAverages2 is dominated by n.

05-02-2018 Combinatorial Optimization 25


Order of a function
We say that a function f is of the order of a
function g and write:

f(n) = O(g(n))

if there exist constants c and k such that

f(n) ≤ c·g(n)  for all n ≥ k.
05-02-2018 Combinatorial Optimization 26
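For example, the dominant term of PrefixAverages1 satisfies

1 + 2 + … + n = n(n + 1)/2 ≤ n·n = n²  for all n ≥ 1,

so with c = 1 and k = 1 it is O(n²); likewise the dominant term n of PrefixAverages2 is trivially O(n).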
Prefix averages: conclusion
Understanding that PrefixAverages1 is 𝑂 𝑛2 and
PrefixAverages2 is 𝑂(𝑛) should become more or less
‘automatic’.

Concluding that PrefixAverages2 is much more efficient


and scales much better with the growth of the input
should then be immediate.

05-02-2018 Combinatorial Optimization 27


Effort is an implementation issue!
It is common to say

this problem can be solved in 𝑂(𝑔(𝑛))

for some function 𝑔, but it is not always obvious how to


achieve the specified effort.

As we just saw, a roughly described algorithm leaves so many


details to the implementation that the actual running time may
change the nature of the function 𝑔!

We will see this again at the end of today’s course when


studying max flow algorithms.
05-02-2018 Combinatorial Optimization 28
What is this course all about?
Models
Theorems
Proofs
Methods
Algorithms
Implementations
Computational effort
Computational complexity

05-02-2018 Combinatorial Optimization 29


Start with ‘decision’ problems
Is this number prime?
Can I reach that location from this one using the
available roads?
Can I finish my tasks in less than a given amount of time?
Can I draw this graph without lifting the pen and without
retracing lines?

05-02-2018 Combinatorial Optimization 30


Let us illustrate a few things
We will now see how a theorem, and in particular its
proof, may relate to:
- Answering a decision problem: yes or no
- Understanding the effort of making the decision
- Leading to an algorithm that finds a ‘yes’ answer if it
exists
- Determining the effort of finding a solution

05-02-2018 Combinatorial Optimization 31


Do you know this puzzle?

05-02-2018 Combinatorial Optimization 32


Answer is yes

05-02-2018 Combinatorial Optimization 33


What if you get the following cases?

05-02-2018 Combinatorial Optimization 34


What if you get the following cases?

05-02-2018 Combinatorial Optimization 35


What if you get the following cases?

05-02-2018 Combinatorial Optimization 36


What if you get the following cases?

05-02-2018 Combinatorial Optimization 37


What if you get the following cases?

05-02-2018 Combinatorial Optimization 38


Let us go back in time…

05-02-2018 Combinatorial Optimization 39


The 7 bridges of Königsberg
in the 18th century

05-02-2018 Combinatorial Optimization 40


A scheme: simplifies experimentation

05-02-2018 Combinatorial Optimization 41


A scheme: simplifies experimentation

05-02-2018 Combinatorial Optimization 42


Higher abstraction: a model!

05-02-2018 Combinatorial Optimization 43


Our first model: a graph (see notes!)
Some points: nodes or vertices.
Some lines connecting pairs of points: edges, or arcs in
case they can only be traversed in one direction.
The number of edges at each node is called the degree
of that node.
In the case of arcs we distinguish the in-degree and the
out-degree.
Two nodes are connected if one can move from the first
to the second node using edges or arcs and maybe
intermediate nodes.

05-02-2018 Combinatorial Optimization 44
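As a small illustration (not part of the notes; the matrix and its values are my own example): a graph on n nodes can be stored as an n-by-n adjacency matrix, and the degrees are then simple row sums.

% A small undirected graph stored as an adjacency matrix.
% Entry M(i,j) counts the edges between nodes i and j (so multigraphs are allowed).
M = [0 1 1 0;
     1 0 1 1;
     1 1 0 1;
     0 1 1 0];
degrees = sum(M, 2)'       % degree of each node, here [2 3 3 2]
% For a directed graph D, sum(D,2)' gives the out-degrees and sum(D,1) the in-degrees.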


The first graph theorem of Euler (1736)

A connected (multi)graph has
an Eulerian path if and only
if the number of nodes with
odd degree is 0 or 2.
If that number is 0, then the
path is a cycle: it starts
and finishes at the same
node.

05-02-2018 Combinatorial Optimization 45


Remarks
The theorem gives a certificate for a graph to be
Eulerian.

This certificate is very easy (and in fact efficient) to


check.

In itself, the certificate neither discloses nor requires an
actual solution to the problem.

05-02-2018 Combinatorial Optimization 46


[Slides 47-57: the puzzle figure annotated step by step with node degrees (the labels 4, 4, 3 and 3 appear); exactly two nodes have odd degree, so by the theorem the figure can be drawn without lifting the pen.]


A theorem must be proven

05-02-2018 Combinatorial Optimization 58


[Slides 59-63: the proof of the theorem, developed step by step on figures.]
What did the proof give us?
Some magic: we can now say that it holds for all graphs
although we may not claim to have seen all graphs!

A present: an algorithm! Note that the algorithm


revealed by the (constructive) proof is not provided by
the theorem itself.

What do you think about the effort needed to apply this


algorithm?

05-02-2018 Combinatorial Optimization 64


question
Consider a connected (multi)graph with 𝑛 nodes and 𝑚 edges.
What is the effort required to check if this graph is Eulerian?
In case the graph is Eulerian: what is the effort required to find an Eulerian path or cycle?
And what about the seven bridges?

05-02-2018 Combinatorial Optimization 66
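One possible answer to the first part, as a hedged sketch (the function name and the representation are my own choices): assuming the multigraph is given as an n-by-n adjacency matrix M with M(i,j) counting the parallel edges between i and j, the check only needs the node degrees, so it costs O(n²) with this representation, or O(n + m) with adjacency lists.

function eulerian = IsEulerian(M)
% Check Euler's degree condition for a connected multigraph.
% M is the n-by-n adjacency matrix; M(i,j) counts the edges between i and j.
degrees = sum(M, 2);                 % degree of each node
nOdd = sum(mod(degrees, 2) == 1);    % number of nodes with odd degree
eulerian = (nOdd == 0) || (nOdd == 2);
end

For the seven bridges all four land masses have odd degree (5, 3, 3, 3), so no Eulerian path exists, which matches the 'Not possible!' slide that follows.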


Not possible!

05-02-2018 Combinatorial Optimization 67


Option 1: remove at least one bridge

05-02-2018 Combinatorial Optimization 68


Happened in situ

05-02-2018 Combinatorial Optimization 69


Nowadays it is possible!

05-02-2018 Combinatorial Optimization 70


Option 2: add bridges!

05-02-2018 Combinatorial Optimization 71


note
Although we are still discussing decision problems we
just met an optimization problem!

This optimization problem minimizes the effort needed


to make a given graph become Eulerian.

You will see it again later and learn its name: a matching
problem.

Although these matching problems can be efficiently
solved on general graphs, some graphs, called bipartite,
allow for an easier algorithm.
05-02-2018 Combinatorial Optimization 72
What else do we get from graphs?

05-02-2018 Combinatorial Optimization 73


Reality

05-02-2018 Combinatorial Optimization 74


Simplification

05-02-2018 Combinatorial Optimization 75


Abstraction

05-02-2018 Combinatorial Optimization 76


Roadmaps are large graphs

05-02-2018 Combinatorial Optimization 77


This is an optimization problem!
There are usually (many!) alternative ways to go from
our origin to our destination.

We want to find the ‘best’, defined as the fastest or shortest.

Hopefully not by computing and comparing all the


possibilities!

05-02-2018 Combinatorial Optimization 78


question
Can you estimate the number of alternative paths between two nodes for some types of graphs?
Shortest path from A to C
[Figure: example graph with nodes A, B, C, D, E and weighted arcs; find the shortest path from A to C.]

05-02-2018 Combinatorial Optimization 80


Edsger Wybe Dijkstra (1930–2002)

[Slides 81-91: step-by-step run of Dijkstra's algorithm on the example graph. Starting with label 0 at A, the tentative distance labels are updated iteration by iteration until all labels are definite; the final labels are A = 0, D = 5, E = 7, B = 8 and C = 9, so the shortest path from A to C has length 9.]
05-02-2018 Combinatorial Optimization 91


question
Can you list the assumptions one needs to make about the given graph in order for Dijkstra's algorithm to guarantee finding the optimal solution?
Can you estimate the effort needed to apply this algorithm on a graph with 𝑛 nodes and 𝑚 arcs?
notes
It is clearly important that on each round of the algorithm
(we will call these iterations) exactly one node becomes
definitely labeled.
Therefore the number of iterations does not exceed the
number of nodes.
We now need to estimate the effort of each iteration!
There are many answers depending on the data structures
used! Please pay attention to this!
The design of efficient algorithms is a challenge that goes
beyond the proof of correctness.

05-02-2018 Combinatorial Optimization 93
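As one concrete answer under stated assumptions (a sketch, not the only possible implementation; the function name and matrix convention are mine): if the graph is stored as an n-by-n weight matrix and the closest temporary node is found by a linear scan, each of the at most n iterations costs O(n), giving O(n²) overall; with adjacency lists and a binary heap this becomes O((n + m) log n). The sketch also assumes nonnegative arc lengths, which is one of the assumptions Dijkstra's algorithm needs.

function dist = DijkstraSketch(W, s)
% Sketch of Dijkstra's algorithm on an n-by-n weight matrix.
% W(i,j) is the length of arc (i,j), Inf if the arc is absent; lengths must be >= 0.
n = size(W,1);
dist = inf(1,n); dist(s) = 0;        % tentative distance labels
done = false(1,n);                   % nodes whose label is definite
for it = 1:n
    temp = dist; temp(done) = Inf;
    [d, u] = min(temp);              % closest node that is still temporary
    if isinf(d), break; end          % remaining nodes are unreachable
    done(u) = true;                  % its label becomes definite
    better = ~done & (dist(u) + W(u,:) < dist);
    dist(better) = dist(u) + W(u,better);   % relax the arcs leaving u
end
end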


Let us meet the Greedy paradigm
minimum spanning trees

05-02-2018 Combinatorial Optimization 94
Greedy algorithms
A greedy algorithm extends its wealth (the solution
being constructed) by taking the step with the highest
immediate gain.

Although it is well known that greedy algorithms are
often sub-optimal, today we see one case where they
are optimal!

05-02-2018 Combinatorial Optimization 95


Greedy can be optimal!
In fact, there is a whole collection of optimization
problems defined in clean mathematical terms for which
greedy is optimal: the matroids.

The most famous and easiest-to-explain example is the
so-called Minimum Spanning Tree.

05-02-2018 Combinatorial Optimization 96


Problem: laying cables

Central office

05-02-2018 Combinatorial Optimization 97


Expensive: dedicated connection

Central office

05-02-2018 Combinatorial Optimization 98


A better solution

Central office

05-02-2018 Combinatorial Optimization 99


Spanning Trees
A spanning tree of a graph is just a subgraph that
contains all the vertices and is a tree.
A tree is a connected graph without cycles.
A graph may have many spanning trees.

[Figure: Graph A and some of its spanning trees.]

05-02-2018 Combinatorial Optimization 100


[Figure: a complete graph and all 16 of its spanning trees.]

05-02-2018 Combinatorial Optimization 101


question
Can you give an upper bound on the number of spanning trees of a given graph?
Minimum Spanning Trees
The Minimum Spanning Tree for a given graph is the
Spanning Tree of minimum cost for that graph.
[Figure: a complete graph with edge weights and its minimum spanning tree.]

05-02-2018 Combinatorial Optimization 103


notes
Finding the minimum spanning tree of a connected
undirected graph can be done very efficiently.

A greedy algorithm (i.e. an algorithm expanding a partial


solution in the best way as perceived at that moment) is
optimal in this case.

Several different algorithms may have different


efficiencies though…

05-02-2018 Combinatorial Optimization 104


On the book Algorithms
First section in chapter 5 on Greedy algorithms

05-02-2018 Combinatorial Optimization 105


Algorithms for Minimum Spanning Tree

Kruskal's Algorithm

Prim's Algorithm

05-02-2018 Combinatorial Optimization 106


Kruskal's Algorithm
This algorithm creates a forest of trees. Initially the
forest consists of n single node trees (and no edges). At
each step, we add one edge (the cheapest one) so that it
joins two trees together. If it were to form a cycle, it
would simply link two nodes that were already part of a
single connected tree, so that this edge would not be
needed.

05-02-2018 Combinatorial Optimization 107


Kruskal's Algorithm
The steps are:

1. The forest is constructed - with each node in a separate tree.


2. The edges are placed in a priority queue.
3. Until we've added 𝑛 − 1 edges,
1. Extract the cheapest edge from the queue,
2. If it forms a cycle, reject it,
3. Else add it to the forest. Adding it to the forest will join two
trees together.

Every step will have joined two trees in the forest together, so that
at the end, there will only be one tree.

05-02-2018 Combinatorial Optimization 108
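A minimal MATLAB-style sketch of these steps (my own illustration, not from the course material; the function name and the edge-list convention are assumptions), for a connected graph given as an m-by-3 edge list [u v w] with nodes numbered 1..n; a simple union-find (parent array) replaces the explicit cycle test:

function [treeEdges, totalCost] = KruskalSketch(edges)
% edges: m-by-3 matrix, each row [u v w] is an edge of weight w.
n = max(max(edges(:,1:2)));
[~, order] = sort(edges(:,3));           % cheapest edges first (stand-in for the priority queue)
parent = 1:n;                            % forest: every node starts as its own tree
treeEdges = zeros(0,3); totalCost = 0;
for k = order'
    u = edges(k,1); v = edges(k,2); w = edges(k,3);
    ru = u; while parent(ru) ~= ru, ru = parent(ru); end   % root of u's tree
    rv = v; while parent(rv) ~= rv, rv = parent(rv); end   % root of v's tree
    if ru ~= rv                          % different trees: the edge cannot close a cycle
        parent(ru) = rv;                 % join the two trees
        treeEdges(end+1,:) = [u v w];    % accept the edge
        totalCost = totalCost + w;
    end                                  % same tree: the edge would form a cycle, reject it
end
end

Sorting costs O(m log m); the naive find used here can be slow in the worst case, which is exactly where the choice of data structures (asked about after the example) makes the difference.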


Given Graph
[Figure: graph on the nodes A-J with weighted edges; the full edge list is on the next slide.]

05-02-2018 Combinatorial Optimization 109


The edges of the given graph with their weights:

A-B 4    A-D 1
B-C 4    B-D 4
B-J 10   C-E 2
C-F 1    D-H 5
D-J 6    E-G 2
F-G 3    F-I 5
G-I 3    G-J 4
H-J 2    I-J 3

05-02-2018 Combinatorial Optimization 110


Sort Edges
(in reality they are placed in a priority queue - not sorted - but sorting them makes the algorithm easier to visualize)

A-D 1    C-F 1
C-E 2    E-G 2
H-J 2    F-G 3
G-I 3    I-J 3
A-B 4    B-D 4
B-C 4    G-J 4
F-I 5    D-H 5
D-J 6    B-J 10

05-02-2018 Combinatorial Optimization 111


[Slides 112-122: the sorted edges are considered one by one ('Add Edge'). An edge whose endpoints already belong to the same tree would form a cycle and is rejected ('Cycle, don't add edge', slides 117 and 121); every other edge is added to the forest.]

[Slide 123: the resulting minimum spanning tree shown next to the given graph.]

05-02-2018 Combinatorial Optimization 123


question
Can you estimate the running time of Kruskal's algorithm?
Can you also elaborate on the assumptions you make about graph representation and data structures used?
Prim's Algorithm
This algorithm starts with one node. It then adds nodes
one by one: at each step it selects, among the nodes not
yet in the new graph, the one whose connecting edge to
the new graph has the smallest weight, and adds that
node together with its connecting edge.

05-02-2018 Combinatorial Optimization 125


Prim's Algorithm
The steps are:

1. The new graph is constructed - with one node from the old graph.
2. While new graph has fewer than n nodes,
1. Find the node from the old graph with the smallest connecting edge to
the new graph,
2. Add it to the new graph

Every step adds one node, so at the end we have one graph with all the nodes,
and it is a minimum spanning tree of the original graph.

05-02-2018 Combinatorial Optimization 126
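A minimal MATLAB-style sketch of these steps (my own illustration, not from the course material; the function name is an assumption), assuming the graph is connected and given as an n-by-n symmetric weight matrix W with Inf where there is no edge:

function [treeEdges, totalCost] = PrimSketch(W)
% W: n-by-n symmetric weight matrix, W(i,j) = Inf if edge {i,j} is absent.
n = size(W,1);
inTree = false(1,n); inTree(1) = true;    % start the new graph with node 1
dist = W(1,:); pred = ones(1,n);          % cheapest known connection of each node to the tree
treeEdges = zeros(0,3); totalCost = 0;
for step = 1:n-1
    dist(inTree) = Inf;                   % nodes already in the tree are not candidates
    [w, v] = min(dist);                   % node with the cheapest connecting edge
    inTree(v) = true;
    treeEdges(end+1,:) = [pred(v) v w];   % add that node and its connecting edge
    totalCost = totalCost + w;
    shorter = ~inTree & (W(v,:) < dist);  % does the new node give cheaper connections?
    dist(shorter) = W(v,shorter); pred(shorter) = v;
end
end

With this matrix representation each of the n-1 steps costs O(n), giving O(n²); with adjacency lists and a heap the bound becomes O(m log n), again a matter of data structures rather than of the algorithm itself.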


Given Graph
[Figure: the same weighted graph on the nodes A-J as in the Kruskal example.]

05-02-2018 Combinatorial Optimization 127


[Slides 128-137: 'Old Graph / New Graph' pairs showing Prim's algorithm step by step. Starting from a single node, at each step the cheapest edge connecting a node outside the new graph to the new graph is added, together with that node.]

[Slide 138: the given graph next to the resulting minimum spanning tree.]

05-02-2018 Combinatorial Optimization 138


question
Can you estimate the running time of Prim's algorithm?
Can you also elaborate on the assumptions you make about graph representation and data structures used?
Maximum flow
Ford-Fulkerson and extensions

05-02-2018 Combinatorial Optimization 140


Flow problems
Ford-Fulkerson method for maximum flow (MaxFlow).

Shortest augmenting path algorithm for MaxFlow.

Some generalizations of the MaxFlow problem.

The minimum cost flow problem.

05-02-2018 Combinatorial Optimization 141


Max flow
Given a directed network with a source vertex, a target
vertex and a maximum capacity on each arc, determine
the highest value of a flow from the source to the target
that respects the capacities and is preserved at each
intermediate vertex.

05-02-2018 Combinatorial Optimization 142


Method of Ford and Fulkerson
Start with a flow of zero.
At each iteration consider the residual graph composed of
• The original arcs with capacity equal to the difference between the original
capacity and the flow on the arc (the residual capacity) if this value is
positive, otherwise the arc is eliminated.
• Arcs opposite to the original ones with capacity equal to the flow passing
on the corresponding arc if a (positive) flow passes on the arc.
Determine a path from source to target in the residual graph, if it exists, and
update the flow with the minimum residual capacity along the path (increase the
flow on original arcs, decrease it on opposite arcs). Update the residual graph. If
no such path exists, terminate: the flow is maximum.
Following
https://en.wikipedia.org/wiki/Ford%E2%80%93Fulkerson_algorithm
we call this a method instead of an algorithm because it leaves an important step
unspecified: how to find the augmenting path.

05-02-2018 Combinatorial Optimization 143
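In symbols (a restatement of the construction above; writing f(u, v) for the flow on arc (u, v), c(u, v) for its capacity and c_f for the residual capacity, where c_f is my notation, not the slides'), the residual graph contains

an arc (u, v) with residual capacity c_f(u, v) = c(u, v) − f(u, v), whenever this value is positive, and
an opposite arc (v, u) with residual capacity c_f(v, u) = f(u, v), whenever f(u, v) is positive.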


Ford-Fulkerson example
[Figure: example network 𝐷 = (𝑉, 𝐴) with source s, sink t and a capacity on each arc.]

05-02-2018 Combinatorial Optimization 144


Ford-Fulkerson example
[Slides 145-152: step-by-step run of the Ford-Fulkerson method on the example network. Each slide shows the current flow next to the capacities of 𝐷 = (𝑉, 𝐴) and the corresponding residual graph 𝐷𝑓 = (𝑉, 𝐴𝑓) with its residual capacities. The flow value grows from 0 to 8, 10, 16, 18 and finally 19, at which point a cut of capacity 19 certifies that the flow of value 19 is maximum.]
05-02-2018 Combinatorial Optimization 152
Ford-Fulkerson: Analysis.

Theorem: Min cut = Max flow , and FF algorithm finds them.

Proof: Clearly, min cut ≥ max flow. (*)

Other direction:
When the algorithm terminates, it gives a cut C and a flow F for which
value(C) = value(F) ⇒
min cut ≤ value(C) = value(F) ≤ max flow (**)
(*) + (**) ⇒ min cut = value(C) = value(F) = max flow

05-02-2018 Combinatorial Optimization 153


Ford-Fulkerson: Analysis.

Corollary:
If all capacities are integer then there is an integer maximum
flow.
Proof:
If all capacities are integer then the flow found by FF is
integer.

05-02-2018 Combinatorial Optimization 154


Ford-Fulkerson: Analysis.

🤔 Does the FF algorithm always terminate?

It depends on the capacities:


Integer capacities:
▪ Yes, since the flow value increases by at least 1 in every iteration.

Rational capacities:
▪ Yes, since the flow value increases by at least some small epsilon in every iteration.

Real valued capacities: (Note: not realistic in practice)


▪ See a nonterminating example in
https://en.wikipedia.org/wiki/Ford%E2%80%93Fulkerson_algorithm
▪ But it will terminate with a small modification of the algorithm. (See next slides.)

05-02-2018 Combinatorial Optimization 155


Ford-Fulkerson: Analysis.

🤔 How about the running time?


𝑂(𝑛) per iteration, but how many iterations can we have?
The number of iterations may be as large as the maximum flow value!

1000 1000

s 1 t

1000 1000

Example: 2000 iterations if the algorithm always chooses the arc in the middle.
Is there an algorithm for which the running time only depends on the size
of the graph, and not on the capacities? Yes! Next slides.

05-02-2018 Combinatorial Optimization 156


Shortest augmenting path algorithm

By Edmonds-Karp-Dinitz (1970)

Algorithm:
Apply the FF algorithm but always choose the path with the
minimum number of edges.

Let n be the number of vertices and let m be the number of edges.


Theorem
Shortest augmenting path algorithm takes 𝑂(𝑛𝑚) iterations.

05-02-2018 Combinatorial Optimization 157
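A minimal MATLAB-style sketch of this algorithm (my own illustration, not from the course material; the function name and the matrix convention are assumptions), for a network given as an n-by-n capacity matrix C with C(i,j) = 0 when arc (i,j) is absent. The breadth-first search guarantees that the augmenting path found has the minimum number of arcs:

function maxflow = ShortestAugmentingPathSketch(C, s, t)
% C: n-by-n capacity matrix; s, t: indices of the source and the sink.
n = size(C,1); F = zeros(n);           % current flow on every arc
maxflow = 0;
while true
    % breadth-first search in the residual graph (residual capacity = C - F + F')
    pred = zeros(1,n); pred(s) = s; queue = s;
    while ~isempty(queue) && pred(t) == 0
        u = queue(1); queue(1) = [];
        for v = find(C(u,:) - F(u,:) + F(:,u)' > 0)
            if pred(v) == 0
                pred(v) = u; queue(end+1) = v;
            end
        end
    end
    if pred(t) == 0, break; end        % no augmenting path left: the flow is maximum
    % bottleneck (minimum residual capacity) along the path found
    delta = Inf; v = t;
    while v ~= s
        u = pred(v);
        delta = min(delta, C(u,v) - F(u,v) + F(v,u));
        v = u;
    end
    % augment: cancel opposite flow first, then increase forward flow
    v = t;
    while v ~= s
        u = pred(v);
        back = min(delta, F(v,u));
        F(v,u) = F(v,u) - back;
        F(u,v) = F(u,v) + (delta - back);
        v = u;
    end
    maxflow = maxflow + delta;
end
end

Each iteration is one breadth-first search plus one path update, and by the theorem above there are O(nm) iterations, so the sketch runs in polynomial time regardless of the capacities.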


Shortest augmenting path, example
Iteration 1

Layer Update

[Panels: network; augmenting path found; 1st residual graph]

05-02-2018 Combinatorial Optimization 158


Shortest augmenting path
Iteration 2

Layer Update

[Panels: 1st residual graph; augmenting path found; 2nd residual graph]

05-02-2018 Combinatorial Optimization 159


Shortest augmenting path
Iteration 3

Layer Update

[Panels: 2nd residual graph; augmenting path found; 3rd residual graph]


05-02-2018 Combinatorial Optimization 160
Shortest augmenting path Iteration 4

Layer Update

[Panels: 3rd residual graph; augmenting path found; 4th residual graph]


05-02-2018 Combinatorial Optimization 161
Shortest augmenting path

There is no more path from 𝑠 to 𝑡 in the 4th residual graph.
The maximum flow has been found.

[Panels: maximum flow; 4th residual graph]
05-02-2018 Combinatorial Optimization 162
Analysis
Let d(s, u) be the depth of vertex u in the residual graph (i.e., the number of
arcs on the shortest path between s and u).

[Figure: layered residual graph before ('augment') and after ('relayer'); in the example the points f and t move down.]

Lemma 1. For any point u, the depth d(s, u) never decreases.

Proof
• Deleting an arc will not decrease d(s, u).
• Adding an arc may only decrease d(s, u) if the arc is forward (down). However, we never add a forward arc.

05-02-2018 Combinatorial Optimization 163


Analysis

Call an arc (𝑢, 𝑣) in the residual graph critical if it has the minimum capacity on
the augmenting path.

Lemma 2 𝑑(𝑠, 𝑢) increases between two critical moments of arc (𝑢, 𝑣).
Proof (Hint: check this proof for arc (𝑎, 𝑓) in the example)

If arc (𝑢, 𝑣) is critical then 𝑣 is below 𝑢 and the arc will be removed from the
residual graph. (See (𝑎, 𝑓) in iteration 1).
Arc (𝑢, 𝑣) will only return in the residual when arc (𝑣, 𝑢) is on the augmenting
path in some iteration.
But then 𝑢 must be below 𝑣. Since 𝑣 did not go up, point 𝑢 must have moved
down. (See (𝑎, 𝑓) in iteration 4).

05-02-2018 Combinatorial Optimization 164


Analysis

Theorem
Shortest augmenting path algorithm takes 𝑂(𝑛𝑚) iterations.
Proof
By Lemma 2, each arc is critical only 𝑂(𝑛) times (since the
depth of 𝑣 is less than 𝑛 and 𝑣 goes down between two critical
moments of (𝑢, 𝑣) ).
Further, there is at least one critical arc in each iteration.
⇒ There are no more than 𝑂(𝑛𝑚) iterations.
05-02-2018 Combinatorial Optimization 165
Generalizations of the maximum flow
problem
A polynomial time algorithm for the following versions of the max-flow
problem follows by reducing each version to the original form of the
max-flow problem. The concept of polynomial time reduction will be
explained next week.
a) The network has many sources and many sinks.
b) The network is undirected.
c) The nodes as well as the arcs have capacities.
d) The network is undirected and the nodes have capacities
e) There are lower bounds (and no upper bounds) on the flow through
each arc.
(Goal: find a minimum flow.)

05-02-2018 Combinatorial Optimization 166


Generalizations of max flow
a) The network has many sources and many sinks.
b) The network is undirected.

(a) Add two vertices and make these the new source s and sink t. Add an arc of infinite
capacity from s to each source s_i and an arc from each sink t_i to t.

[Figure: sources s1, ..., s4 and sinks t1, t2 attached to the new super source and super sink.]

(b) Replace each edge by two opposite arcs. Give both arcs the capacity c(e) of the edge.
05-02-2018 Combinatorial Optimization 167
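A tiny MATLAB-style sketch of reduction (a) (my own illustration; the function name is an assumption and the n-by-n capacity-matrix convention is the same as in the max-flow sketch above):

function [C2, s, t] = AddSuperSourceSink(C, sources, sinks)
% C: n-by-n capacity matrix; sources, sinks: index vectors of the s_i and t_i.
n = size(C,1);
big = sum(C(:)) + 1;               % stands in for 'infinite' capacity
C2 = zeros(n+2); C2(1:n,1:n) = C;  % two extra vertices: the new source and sink
s = n + 1; t = n + 2;
C2(s, sources) = big;              % arc from the new source to each s_i
C2(sinks, t)   = big;              % arc from each t_i to the new sink
end

A single-source, single-sink max-flow computation on C2 (for instance with the shortest-augmenting-path sketch above) then solves the many-sources, many-sinks instance.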
Generalizations of max flow
c) The nodes as well as the arcs have capacities.
d) The network is undirected and the nodes have capacities.

(c) Split each node v with capacity b(v) into two nodes and add an arc of capacity b(v) between them.

(d) Do both operations (b) and (c).

05-02-2018 Combinatorial Optimization 168


Generalizations of max flow
e) There are lower bounds (and no upper bounds) on the flow through each arc.
Goal: find a minimum flow.

Algorithm:
Step 1: First find a feasible flow (= easy).
Step 2: Iteratively reduce the flow along flow-reducing paths.

[Figures: a small network with a lower bound [·] on each arc; a feasible flow of value 6; the residual graph, in which the numbers are upper bounds and no number means no (i.e. infinite) upper bound; adding a flow of value 2 over a path in the residual graph reduces the original flow, giving the new (and minimum) flow.]
05-02-2018 Combinatorial Optimization 169
Min cost flow
Problem:
Given the network with costs and capacities and a flow value 𝑣, we need
to find an 𝑠 − 𝑡 flow of value 𝑣 of minimum cost.
[Figure: example network with nodes s, a, b, t; each arc is labeled with its capacity and its cost per unit flow.]

Example:
Sending 2 units over the path s, a, t costs 2·4 + 2·5 = 18.
Is this the cheapest flow of value 2?
05-02-2018 Combinatorial Optimization 170
Min cost flow
[Figure: the same example network, each arc labeled with its capacity and its cost per unit flow.]

Algorithm:
Step 1: Find a feasible flow of value v (for example by FF). Construct the residual graph
(reverse arcs carry the negative of the cost).
Step 2: While there is a negative-cost cycle in the residual graph:
  add flow over the cycle and update the residual graph.
How to find a negative cycle? E.g. the Bellman-Ford algorithm (not in this course).
05-02-2018 Combinatorial Optimization 171
Min cost flow
Example: Find a min cost flow of value 2.

[Figures: the given network; an initial flow of value 2 over the path s, a, t with cost 2·4 + 2·5 = 18; the 1st residual graph, in which reverse arcs carry the negative of the cost. Is there a negative cycle?]


05-02-2018 Combinatorial Optimization 172
Min cost flow
[Figures: the 1st residual graph contains a negative cycle of cost −5 + 2 + 2 = −1 with minimum residual capacity 1. Sending 1 unit along this cycle reduces the cost by 1: the updated flow has cost 17, and the 2nd residual graph is constructed. Is there a negative cycle?]
05-02-2018 Combinatorial Optimization 173
Min cost flow
[Figures: the 2nd residual graph contains a negative cycle of cost 1 − 2 − 4 = −5 with minimum residual capacity 1. Sending 1 unit along this cycle reduces the cost by 5: the updated flow has cost 12, and the 3rd residual graph is constructed. There is no negative cycle left, so the flow has minimum cost.]
05-02-2018 Combinatorial Optimization 174
Min cost flow
Theorem:

A flow is a minimum cost flow (of given value v)
⇔
the residual graph has no negative-cost cycles.

Proof (half of it):

⇒ True, since otherwise we can reduce the cost of the flow by adding flow over the cycle.

⇐ .... Omitted.

Corollary:

The algorithm always returns a minimum cost flow.

05-02-2018 Combinatorial Optimization 175
