
Weighted Matching Algorithms,
Hamiltonian Cycles and TSP


Graphs & Algorithms
Lecture 6
Weighted bipartite matching
Given: K_{n,n} (the complete bipartite graph on 2n vertices)
and an n × n weight matrix with entries w_{i,j} ≥ 0.
Want: a perfect matching M maximizing the total weight.

Weighted cover (u, v): choice of vertex labels u = (u_1, …, u_n)
and v = (v_1, …, v_n), u, v ∈ R^n, such that, for all 1 ≤ i, j ≤ n,
we have u_i + v_j ≥ w_{i,j}.
Cost c(u, v) := ∑_i (u_i + v_i).
Duality: max. weighted matching
and min. weighted vertex cover
For any matching M and weighted cover (u, v) of a
weighted bipartite graph, we have
w(M) ≤ c(u, v).
Also, w(M) = c(u, v) if and only if M consists of
edges {i, j} such that u_i + v_j = w_{i,j}. In this case, M
and (u, v) are optimal.
Equality subgraph G_{u,v} of a fixed cover (u, v):
the spanning subgraph of K_{n,n} with
{i, j} ∈ E(G_{u,v}) ⟺ u_i + v_j = w_{i,j}.
Idea: a perfect matching of G_{u,v} corresponds to a
maximum weighted matching of K_{n,n}.
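The weak-duality inequality w(M) ≤ c(u, v) is easy to check numerically. A minimal sketch in Python; the weight matrix, cover, and matching below are made-up illustrations, not taken from the lecture:

```python
# Verify w(M) <= c(u, v) for a small made-up instance of K_{3,3}.
w = [[4, 1, 3],
     [2, 0, 5],
     [3, 2, 2]]

# A feasible cover: u[i] + v[j] >= w[i][j] for all i, j.
u = [4, 5, 3]
v = [0, 0, 0]
assert all(u[i] + v[j] >= w[i][j] for i in range(3) for j in range(3))

# A perfect matching given as a permutation: row i is matched to column M[i].
M = [0, 2, 1]
weight = sum(w[i][M[i]] for i in range(3))   # w(M) = 4 + 5 + 2 = 11
cost = sum(u) + sum(v)                       # c(u, v) = 12
assert weight <= cost                        # weak duality holds
```

Here the inequality is strict (11 < 12), so by the duality statement this cover is not yet optimal.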
Hungarian Algorithm
Kuhn (1955), Munkres (1957)
WeightedBipartiteMatching(K_{n,n} = (A ∪ B, A × B), w[n, n])
 1  for each i ∈ [n]
 2      do u[i] ← max_j w[i, j]
 3         v[i] ← 0
 4  while G_{u,v} has no perfect matching
 5      do X ← minimum vertex cover of G_{u,v}
 6         ε ← min{ u[i] + v[j] − w[i, j] : {i, j} ∈ (A \ X) × (B \ X) }
 7         for each i ∈ A \ X
 8             do u[i] ← u[i] − ε
 9         for each i ∈ B ∩ X
10             do v[i] ← v[i] + ε
11  return perfect matching of G_{u,v}
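The pseudocode above relies on a minimum-vertex-cover subroutine for the equality subgraph; in practice one usually implements an equivalent potentials-based O(n³) variant of the same dual-adjustment idea. A sketch of that variant (the function name hungarian_max and the exact formulation are my own, not from the slides):

```python
def hungarian_max(w):
    """Maximum-weight perfect matching in K_{n,n} for an n x n weight matrix.

    Potentials-based O(n^3) variant of the Hungarian method, run as a
    minimisation on the negated matrix. Returns (match, total) where row i
    is matched to column match[i].
    """
    n = len(w)
    INF = float("inf")
    a = [[-w[i][j] for j in range(n)] for i in range(n)]  # maximise -> minimise
    u = [0] * (n + 1)    # row potentials (dual variables)
    v = [0] * (n + 1)    # column potentials
    p = [0] * (n + 1)    # p[j]: row matched to column j (1-based, 0 = free)
    way = [0] * (n + 1)  # predecessor columns along the augmenting path
    for i in range(1, n + 1):
        p[0] = i
        j0 = 0
        minv = [INF] * (n + 1)
        used = [False] * (n + 1)
        while True:                       # grow the alternating tree
            used[j0] = True
            i0, delta, j1 = p[j0], INF, 0
            for j in range(1, n + 1):
                if not used[j]:
                    cur = a[i0 - 1][j - 1] - u[i0] - v[j]
                    if cur < minv[j]:
                        minv[j], way[j] = cur, j0
                    if minv[j] < delta:
                        delta, j1 = minv[j], j
            for j in range(n + 1):        # dual update, keeps the cover feasible
                if used[j]:
                    u[p[j]] += delta
                    v[j] -= delta
                else:
                    minv[j] -= delta
            j0 = j1
            if p[j0] == 0:                # reached a free column
                break
        while j0:                         # augment along the found path
            j1 = way[j0]
            p[j0] = p[j1]
            j0 = j1
    match = [0] * n
    for j in range(1, n + 1):
        match[p[j] - 1] = j - 1
    return match, sum(w[i][match[i]] for i in range(n))

# Toy instance: the unique optimum picks w[0][2] + w[1][1] + w[2][0] = 24.
match, total = hungarian_max([[7, 5, 11],
                              [5, 4, 1],
                              [9, 3, 2]])
```

The dual update mirrors lines 7-10 of the pseudocode: potentials shrink on unreached rows and grow on reached columns until a new equality edge appears.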
Correctness of the Hungarian Method
Theorem
The Hungarian Algorithm finds a maximum-weight
matching and a minimum-cost cover.
Proof
The statement is true if the algorithm terminates.
Loop invariant: consider (u, v) before and (u', v') after
one iteration of the while loop:
(u, v) is a cover of G ⟹ (u', v') is a cover of G
c(u, v) ≥ c(u', v') + ε ≥ w(M)
For rational weights, ε is bounded from below by an
absolute constant, so the algorithm terminates.
In the presence of irrational weights, a more careful
selection of the minimum vertex cover is necessary.
Hamiltonian Cycles
A graph on n vertices is Hamiltonian if it contains a
simple cycle of length n.
The Hamiltonian-cycle problem is NP-complete
(by reduction from the vertex-cover problem).
The naïve algorithm has running time O(n · n!) = 2^{O(n log n)}.
What are sufficient conditions for Hamiltonian
graphs?
Theorem (Dirac 1952)
Every graph with n ≥ 3 vertices and minimum
degree at least n/2 has a Hamiltonian cycle.
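The naïve factorial-time approach mentioned above can be sketched directly; the function name and the two test graphs are illustrative, not from the slides:

```python
from itertools import permutations

def is_hamiltonian(adj):
    """Naive O(n!) test: adj is an n x n 0/1 adjacency matrix."""
    n = len(adj)
    # Fix vertex 0 as the start of the cycle to avoid counting rotations.
    for perm in permutations(range(1, n)):
        cycle = (0,) + perm
        if all(adj[cycle[k]][cycle[(k + 1) % n]] for k in range(n)):
            return True
    return False

# The 4-cycle C_4 is Hamiltonian (min degree 2 = n/2, as Dirac requires);
# the star K_{1,3} is not.
c4 = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
star = [[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]]
```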

Proof of Dirac's Theorem (Pósa)
Suppose G is a graph with δ(G) ≥ n/2 that contains no
Hamiltonian cycle.
Insert as many edges into G as possible: this embeds G
into a saturated graph G' that contains a Hamilton path
x_1, x_2, …, x_n (adding any further edge would create a
Hamiltonian cycle).
The neighbourhood Γ(x_1) yields ≥ n/2 forbidden
neighbours for x_n in {x_1, …, x_{n−2}}: whenever
x_1 x_i ∈ E(G'), the edge x_{i−1} x_n is forbidden, since
x_1 … x_{i−1} x_n x_{n−1} … x_i x_1 would be a Hamiltonian cycle.
Since x_n cannot be adjacent to itself, there is not enough
space for all of its ≥ n/2 neighbours, a contradiction.

[Figure: Hamilton path x_1, x_2, x_3, …, x_{i−1}, x_i, …, x_n
with the edge pair x_1 x_i and x_{i−1} x_n.]
Weaker degree conditions
Let G be a graph on n vertices with degrees d_1 ≤ … ≤ d_n.
(d_1, …, d_n) is called the degree sequence of G.
An integer sequence (a_1, …, a_n) is Hamiltonian if
every graph on n vertices with a pointwise greater
degree sequence (d_1, …, d_n), i.e. d_i ≥ a_i for all i,
is Hamiltonian.
Theorem (Chvátal 1972)
An integer sequence (a_1, …, a_n) such that
0 ≤ a_1 ≤ … ≤ a_n < n and n ≥ 3 is Hamiltonian iff,
for all i < n/2, we have: a_i ≤ i ⟹ a_{n−i} ≥ n − i.
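Chvátal's condition is straightforward to test for a concrete integer sequence. A sketch; the function name and the example sequences are my own:

```python
def chvatal_hamiltonian(a):
    """Chvatal's test for a sequence with 0 <= a_1 <= ... <= a_n < n, n >= 3.

    Returns True iff for every 1-based i < n/2:
    a_i <= i implies a_{n-i} >= n - i.
    """
    n = len(a)
    assert n >= 3 and a == sorted(a) and all(0 <= x < n for x in a)
    for i in range(1, (n + 1) // 2):      # all 1-based i with i < n/2
        if a[i - 1] <= i and a[n - i - 1] < n - i:
            return False
    return True

# (2,2,2,2) on 4 vertices forces min degree >= n/2, so it is Hamiltonian
# (this also follows from Dirac's theorem).
dirac_ok = chvatal_hamiltonian([2, 2, 2, 2])
# (2,2,2,2,2) is NOT Hamiltonian: the bowtie graph (two triangles sharing
# a vertex) has a pointwise greater degree sequence but no Hamiltonian cycle.
bowtie = chvatal_hamiltonian([2, 2, 2, 2, 2])
```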

Traveling Salesman Problem (TSP)
Given n cities and costs c(i, j) ≥ 0 for going from city
i to city j (and vice versa).
Find a Hamiltonian cycle H* of minimum cost
c(H*) = ∑_{e ∈ E(H*)} c(e).
Existence of a Hamiltonian cycle is a special case.
Brute force: n! = O(n^{n + 1/2} e^{−n}) time complexity
Better solutions:
polynomial-time approximation algorithm for TSP with
triangle inequality, approximation ratio 2
optimal solution with running time O(n² 2ⁿ)

Approximation algorithms
Consider a minimization problem. An algorithm ALG
achieves approximation ratio ρ(n) ≥ 1 if, for every
problem instance P of size n, we have
ALG(P) / OPT(P) ≤ ρ(n),
where OPT(P) is the optimal value of P.
An approximation scheme takes one additional
parameter ε > 0 and achieves approximation ratio
(1 + ε) on every problem instance.
A polynomial-time approximation scheme (PTAS) runs in
polynomial time for every fixed ε > 0.
The running time of a fully polynomial-time
approximation scheme (FPTAS) is also polynomial in ε⁻¹.
2-approximation algorithm for TSP
Compute a minimum spanning tree and shortcut a walk
that traverses each tree edge twice; with the triangle
inequality this tour costs at most twice the optimum.
Running time (dominated by the MST computation):
Prim's algorithm with Fibonacci heaps: O(E + V log V)
Kruskal's algorithm: O(E log V)
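Assuming the slide refers to the standard MST-doubling heuristic (the Prim/Kruskal running times suggest so), a sketch with a simple O(n²) Prim; all names here are illustrative:

```python
import math

def tsp_double_tree(c):
    """2-approximation for metric TSP: MST (Prim, O(n^2)) + preorder shortcut.

    c is a symmetric n x n cost matrix satisfying the triangle inequality.
    Returns a tour as a list of vertices (the shortcutted preorder walk).
    """
    n = len(c)
    INF = float("inf")
    # Prim's algorithm from vertex 0, storing the MST as children lists.
    parent = [0] * n
    dist = [INF] * n
    dist[0] = 0
    in_tree = [False] * n
    children = [[] for _ in range(n)]
    for _ in range(n):
        u = min((x for x in range(n) if not in_tree[x]), key=lambda x: dist[x])
        in_tree[u] = True
        if u != 0:
            children[parent[u]].append(u)
        for x in range(n):
            if not in_tree[x] and c[u][x] < dist[x]:
                dist[x], parent[x] = c[u][x], u
    # Preorder walk of the MST = doubled-tree Euler tour with shortcuts.
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    return tour

def tour_cost(c, tour):
    n = len(tour)
    return sum(c[tour[k]][tour[(k + 1) % n]] for k in range(n))

# Example: four points on the unit square (Euclidean costs, hence metric).
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
c = [[math.dist(p, q) for q in pts] for p in pts]
tour = tsp_double_tree(c)   # visits every city exactly once
```

The optimal tour here has cost 4 (the square's perimeter), so the returned tour is guaranteed to cost at most 8.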
An optimal TSP algorithm
Let V = {1, …, n} and positive costs c(i, j) be given.
For each S ⊆ {2, …, n} and k ∈ S, define
P(S, k) := "minimum cost of a Hamiltonian path
on S ∪ {1} starting in 1 and ending in k".
TSP = min{ P(V \ {1}, k) + c(k, 1) : k ∈ V \ {1} }
Recursive computation of P(S, k):
P({k}, k) = c(1, k)
P(S, k) = min{ P(S \ {k}, j) + c(j, k) : j ∈ S \ {k} }
Compute P(S, k) bottom-up (dynamic programming).
The number of distinct P(S, k) values is (n − 1) 2^{n−2},
and each value takes at most n operations to compute,
for O(n² 2ⁿ) total time.
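The recursion above translates directly into a bitmask dynamic program (this scheme is commonly known as the Held-Karp algorithm); the function name and toy instance are my own:

```python
from itertools import combinations

def held_karp(c):
    """Optimal TSP cost by dynamic programming in O(n^2 2^n) time.

    P[(S, k)] = minimum cost of a path from vertex 0 through the vertex
    set S (a bitmask over 1..n-1), ending in k, as in the recursion above.
    """
    n = len(c)
    P = {(1 << k, k): c[0][k] for k in range(1, n)}   # base: P({k}, k) = c(1, k)
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            mask = sum(1 << k for k in S)
            for k in S:
                prev = mask ^ (1 << k)                # S \ {k}
                P[(mask, k)] = min(P[(prev, j)] + c[j][k] for j in S if j != k)
    full = (1 << n) - 2                               # all vertices except 0
    return min(P[(full, k)] + c[k][0] for k in range(1, n))

# Toy instance: the optimal tour 1-2-4-3-1 costs 2 + 4 + 8 + 9 = 23.
best = held_karp([[0, 2, 9, 10],
                  [2, 0, 6, 4],
                  [9, 6, 0, 8],
                  [10, 4, 8, 0]])
```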

More positive and negative results on TSP
Slight modifications yield a 1.5-approximation
algorithm for TSP with Δ-inequality. (Exercise)
Arora (1996) gave a PTAS for Euclidean TSP with
running time n^{O(1/ε)}.
For general TSP, there exists no polynomial-time
approximation algorithm with any constant
approximation ratio ρ ≥ 1, unless P = NP. (Exercise)
