
LP Duality and Approximation

Seminar on Approximation Algorithms

Overview
- LP Duality
- Some Duality Theory
- Approximation using LP Duality
- Primal-Dual Algorithms
- Examples

Part 1 - LP Duality

Linear Programming - Example

min   7x1 + x2 + 5x3
s.t.  x1 - x2 + 3x3 ≥ 10
      5x1 + 2x2 - x3 ≥ 6
      x1, x2, x3 ≥ 0

x = (2, 1, 3) is a feasible solution.
7·2 + 1·1 + 5·3 = 30 is an upper bound on the optimum.

Lower Bound
Question: How can we find a good lower bound?

min   7x1 + x2 + 5x3
s.t.  x1 - x2 + 3x3 ≥ 10
      5x1 + 2x2 - x3 ≥ 6
      x1, x2, x3 ≥ 0

A lower bound:        7x1 + x2 + 5x3 ≥ x1 - x2 + 3x3 ≥ 10
A better lower bound: 7x1 + x2 + 5x3 ≥ (x1 - x2 + 3x3) + (5x1 + 2x2 - x3) ≥ 10 + 6 = 16

Lower Bound

min   7x1 + x2 + 5x3
s.t.  x1 - x2 + 3x3 ≥ 10      (y1)
      5x1 + 2x2 - x3 ≥ 6      (y2)
      x1, x2, x3 ≥ 0

Assign a non-negative coefficient yi to every primal inequality such that
  y1·(x1 - x2 + 3x3) + y2·(5x1 + 2x2 - x3) ≤ 7x1 + x2 + 5x3
The lower bound is then 10y1 + 6y2.

LP Duality
The problem of finding the best lower bound can itself be formulated as a linear program.

Primal:
min   7x1 + x2 + 5x3
s.t.  x1 - x2 + 3x3 ≥ 10
      5x1 + 2x2 - x3 ≥ 6
      x1, x2, x3 ≥ 0

Dual:
max   10y1 + 6y2
s.t.  y1 + 5y2 ≤ 7
      -y1 + 2y2 ≤ 1
      3y1 - y2 ≤ 5
      y1, y2 ≥ 0

LP Duality
Primal:
min   Σ_{j=1}^n c_j x_j
s.t.  Σ_{j=1}^n a_ij x_j ≥ b_i   ∀i
      x_j ≥ 0                    ∀j

Dual:
max   Σ_{i=1}^m b_i y_i
s.t.  Σ_{i=1}^m a_ij y_i ≤ c_j   ∀j
      y_i ≥ 0                    ∀i

Or, in a compact formulation:

Primal:
min   c^T x
s.t.  Ax ≥ b
      x ≥ 0

Dual:
max   b^T y
s.t.  A^T y ≤ c
      y ≥ 0

For every feasible x and y: c^T x ≥ b^T y. Thus, Opt(Primal) ≥ Opt(Dual).
The dual of the dual is the primal.

Example
Primal:
min   7x1 + x2 + 5x3
s.t.  x1 - x2 + 3x3 ≥ 10
      5x1 + 2x2 - x3 ≥ 6
      x1, x2, x3 ≥ 0

Dual:
max   10y1 + 6y2
s.t.  y1 + 5y2 ≤ 7
      -y1 + 2y2 ≤ 1
      3y1 - y2 ≤ 5
      y1, y2 ≥ 0

x = (7/4, 0, 11/4) and y = (2, 1) are feasible solutions, and
  7·(7/4) + 0 + 5·(11/4) = 26 = 10·2 + 6·1.
Since the primal and dual objective values coincide, x and y are both optimal.
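As a quick sanity check (not part of the original slides), the certificate of optimality above can be verified mechanically: both points are feasible and their objective values agree, so by weak duality both are optimal.

```python
# Running example: min c^T x s.t. Ax >= b, x >= 0, and its dual
c = [7, 1, 5]
b = [10, 6]
A = [[1, -1, 3],
     [5,  2, -1]]

x = [7/4, 0, 11/4]   # claimed optimal primal solution
y = [2, 1]           # claimed optimal dual solution

# primal feasibility: Ax >= b, x >= 0
assert all(sum(A[i][j] * x[j] for j in range(3)) >= b[i] for i in range(2))
assert all(xj >= 0 for xj in x)

# dual feasibility: A^T y <= c, y >= 0
assert all(sum(A[i][j] * y[i] for i in range(2)) <= c[j] for j in range(3))
assert all(yi >= 0 for yi in y)

# equal objective values => both optimal, by weak duality
primal_obj = sum(c[j] * x[j] for j in range(3))
dual_obj = sum(b[i] * y[i] for i in range(2))
assert primal_obj == dual_obj == 26
print(primal_obj)   # 26.0
```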

Max Flow vs. Min s,t-cut


Paths(s, t) = the set of directed paths from s to t.
f_i = flow shipped from s to t along path P_i.

Max Flow:
max   Σ_{i: P_i ∈ Paths(s,t)} f_i
s.t.  Σ_{i: e ∈ P_i} f_i ≤ c_e   ∀e ∈ E
      f_i ≥ 0                    ∀P_i ∈ Paths(s, t)

Dual:
min   Σ_{e ∈ E} c_e d_e
s.t.  Σ_{e ∈ P_i} d_e ≥ 1   ∀P_i ∈ Paths(s, t)
      d_e ≥ 0               ∀e ∈ E

Implicitly d_e ≤ 1. When d_e ∈ {0, 1} we get Min s,t-cut.
Opt(Dual) = Opt(Min s,t-cut).
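The max-flow/min-cut equality can be observed on a toy instance. The sketch below (an illustration, not from the slides) computes the max flow with the Edmonds–Karp algorithm and the min s,t-cut by brute-force enumeration of vertex sets, and the two values match:

```python
from collections import deque
from itertools import combinations

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly push flow along BFS augmenting paths."""
    n = len(cap)
    res = [row[:] for row in cap]          # residual capacities
    flow = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and res[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:                # no augmenting path left
            break
        # bottleneck along the path, then update residuals
        bottleneck, v = float('inf'), t
        while v != s:
            bottleneck = min(bottleneck, res[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:
            u = parent[v]
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
            v = u
        flow += bottleneck
    return flow

def min_cut(cap, s, t):
    """Brute force over all vertex sets S with s in S and t not in S."""
    n = len(cap)
    others = [v for v in range(n) if v not in (s, t)]
    best = float('inf')
    for k in range(len(others) + 1):
        for extra in combinations(others, k):
            S = {s} | set(extra)
            best = min(best, sum(cap[u][v] for u in S
                                 for v in range(n) if v not in S))
    return best

# toy graph: vertex 0 = s, vertex 3 = t
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
print(max_flow(cap, 0, 3), min_cut(cap, 0, 3))   # both equal 4
```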

LP-duality Theorem
Theorem:
Opt(primal) is finite ⟺ Opt(dual) is finite.
If x and y are optimal, then
  Σ_{j=1}^n c_j x_j = Σ_{i=1}^m b_i y_i

Weak Duality Theorem


A weaker result suffices for our purposes.

Theorem: If x and y are feasible, then
  Σ_{j=1}^n c_j x_j ≥ Σ_{i=1}^m b_i y_i

Proof:
  Σ_{j=1}^n c_j x_j ≥ Σ_{j=1}^n (Σ_{i=1}^m a_ij y_i) x_j
                    = Σ_{i=1}^m (Σ_{j=1}^n a_ij x_j) y_i
                    ≥ Σ_{i=1}^m b_i y_i

The first inequality uses dual feasibility of y (and x ≥ 0); the last uses primal feasibility of x (and y ≥ 0).
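The inequality chain above can be exercised numerically. The sketch below (an illustration, not from the slides) builds a random feasible primal/dual pair by choosing b below Ax and c above A^T y, and checks that c^T x ≥ b^T y:

```python
import random

random.seed(0)
n, m = 4, 3

A = [[random.randint(0, 5) for _ in range(n)] for _ in range(m)]
x = [random.random() for _ in range(n)]   # nonnegative primal point
y = [random.random() for _ in range(m)]   # nonnegative dual point

# make x primal-feasible: pick b_i <= (Ax)_i
b = [0.9 * sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
# make y dual-feasible: pick c_j >= (A^T y)_j
c = [sum(A[i][j] * y[i] for i in range(m)) + 0.1 for j in range(n)]

primal_obj = sum(c[j] * x[j] for j in range(n))
dual_obj = sum(b[i] * y[i] for i in range(m))
assert primal_obj >= dual_obj   # weak duality: c^T x >= b^T y
print(primal_obj, dual_obj)
```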

Complementary Slackness Conditions

x and y are optimal ⟺
  Primal: ∀j, x_j > 0 ⟹ Σ_{i=1}^m a_ij y_i = c_j
  Dual:   ∀i, y_i > 0 ⟹ Σ_{j=1}^n a_ij x_j = b_i
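On the running example, the conditions can be checked directly: at the optimal pair, every positive primal variable has a tight dual constraint and vice versa. A small sketch (not from the slides):

```python
# primal/dual data from the running example
c = [7, 1, 5]
b = [10, 6]
A = [[1, -1, 3],
     [5,  2, -1]]
x = [7/4, 0, 11/4]   # optimal primal solution
y = [2, 1]           # optimal dual solution

# Primal conditions: x_j > 0  =>  the j-th dual constraint is tight
for j in range(3):
    if x[j] > 0:
        assert sum(A[i][j] * y[i] for i in range(2)) == c[j]

# Dual conditions: y_i > 0  =>  the i-th primal constraint is tight
for i in range(2):
    if y[i] > 0:
        assert sum(A[i][j] * x[j] for j in range(3)) == b[i]

print("complementary slackness holds")
```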

Complementary Slackness Conditions


Proof: By the LP Duality Theorem,
  Σ_{j=1}^n c_j x_j = Σ_{j=1}^n (Σ_{i=1}^m a_ij y_i) x_j
                    = Σ_{i=1}^m (Σ_{j=1}^n a_ij x_j) y_i
                    = Σ_{i=1}^m b_i y_i

Thus, Σ_{j=1}^n (c_j - Σ_{i=1}^m a_ij y_i) x_j = 0.

For any j, x_j ≥ 0 and c_j - Σ_{i=1}^m a_ij y_i ≥ 0, so every term in the sum is non-negative and therefore must be zero.
Thus, ∀j, x_j > 0 ⟹ Σ_{i=1}^m a_ij y_i = c_j.

Remarks:
- Similar arguments work for the dual conditions.
- For the other direction, read the proof upwards.

LP Duality - Summary
(P)  min   Σ_{j=1}^n c_j x_j
     s.t.  Σ_{j=1}^n a_ij x_j ≥ b_i   ∀i
           x_j ≥ 0                    ∀j

(D)  max   Σ_{i=1}^m b_i y_i
     s.t.  Σ_{i=1}^m a_ij y_i ≤ c_j   ∀j
           y_i ≥ 0                    ∀i

Weak Duality Theorem: b^T y ≤ c^T x.

Complementary Slackness Conditions: x and y are optimal ⟺
  Primal: ∀j, x_j > 0 ⟹ Σ_{i=1}^m a_ij y_i = c_j
  Dual:   ∀i, y_i > 0 ⟹ Σ_{j=1}^n a_ij x_j = b_i

Part 2 - Approximation Using LP Duality

Integer Programming
NP-hard. Solved exactly by, e.g., Branch and Bound.

(IP)  min   c^T x
      s.t.  Ax ≥ b
            x ∈ N^n

The LP relaxation replaces x ∈ N^n with x ≥ 0.

Observation: Opt(LP) ≤ Opt(IP).
Definition: Integrality Gap = Opt(IP) / Opt(LP).
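A concrete illustration (not from the slides): for unit-weight vertex cover on a triangle, every integral cover needs 2 vertices, while x = (1/2, 1/2, 1/2) is feasible for the LP relaxation with cost 3/2, so the integrality gap is at least 4/3.

```python
from itertools import product

edges = [(0, 1), (1, 2), (0, 2)]   # triangle, unit weights

# brute-force the IP: smallest integral vertex cover
ip_opt = min(sum(x) for x in product([0, 1], repeat=3)
             if all(x[u] + x[v] >= 1 for u, v in edges))

# a feasible fractional point, so Opt(LP) <= 1.5
x_frac = (0.5, 0.5, 0.5)
assert all(x_frac[u] + x_frac[v] >= 1 for u, v in edges)

print(ip_opt, sum(x_frac))   # gap >= 2 / 1.5 = 4/3
```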

Using LP for Approximation


Finding a good lower bound: Opt(LP) ≤ Opt(IP).
An approximation ratio equal to the integrality gap is the best we can hope for.

Techniques:
- Rounding: solve the LP relaxation, then round the solution.
- Primal-Dual: find a feasible dual solution y and a feasible integral primal solution x such that c^T x ≤ r·b^T y ≤ r·Opt(LP).

The Primal Dual Approach


Problem:
(IP)  min   Σ_{j=1}^n c_j x_j
      s.t.  Σ_{j=1}^n a_ij x_j ≥ b_i   ∀i
            x_j ∈ {0, 1}               ∀j

Idea: Find an integral primal solution x and a feasible dual solution y such that:
  c^T x ≤ r·b^T y
By the Weak Duality Theorem:
  c^T x ≤ r·b^T y ≤ r·Opt(LP) ≤ r·Opt(IP)

Question: How do we find such solutions?

The Primal Dual Approach


Idea: Find an integral primal solution x and a dual solution y that satisfy the following:

  Primal Cond.:       ∀j, x_j > 0 ⟹ Σ_{i=1}^m a_ij y_i = c_j
  Relaxed Dual Cond.: ∀i, y_i > 0 ⟹ b_i ≤ Σ_{j=1}^n a_ij x_j ≤ r·b_i

In this case:
  Σ_{j=1}^n c_j x_j = Σ_{j=1}^n (Σ_{i=1}^m a_ij y_i) x_j
                    = Σ_{i=1}^m (Σ_{j=1}^n a_ij x_j) y_i
                    ≤(RD) r·Σ_{i=1}^m b_i y_i

Question: How do we find such solutions?

Primal Dual Schema


- Dual: a packing of the primal constraints (where y_i is the coefficient of the i-th constraint) used to obtain a good lower bound.
- We construct y so that in each advancement step the change in y keeps the relaxed dual conditions satisfied.
- We raise y_i only on "good" constraints Σ_{j=1}^n a_ij x_j ≥ b_i, i.e., those for which Σ_{j=1}^n a_ij x_j ≤ r·b_i for any minimal solution x.
- The elements contained in the integral (minimal) primal solution obey the primal conditions: x contains only elements that are paid for, i.e., x_j = 1 ⟹ Σ_{i=1}^m a_ij y_i = c_j.

Vertex Cover
Definition:
  Instance: An undirected graph G = (V, E) and a weight function c : V → R+.
  Solution: U ⊆ V such that every edge has at least one endpoint in U.
  Measure:  Σ_{u ∈ U} c(u)

Vertex Cover
x_u = 1 iff u is taken into the cover.

IP formulation:
(VC)  min   Σ_{u ∈ V} c(u) x_u
      s.t.  x_u + x_v ≥ 1   ∀(u, v) ∈ E
            x_u ∈ {0, 1}    ∀u ∈ V

(The LP relaxation replaces x_u ∈ {0, 1} with x_u ≥ 0.)

The Dual:
      max   Σ_{e ∈ E} y_e
      s.t.  Σ_{e: u ∈ e} y_e ≤ c(u)   ∀u ∈ V
            y_e ≥ 0                   ∀e ∈ E

2-approximation Algorithm
1. U ← ∅; y ← 0
2. For each edge e = (u, v) ∈ E:
3.   y_e ← min{ c(u) - Σ_{e': u ∈ e'} y_e', c(v) - Σ_{e': v ∈ e'} y_e' }
4.   U ← U ∪ argmin{ c(u) - Σ_{e': u ∈ e'} y_e', c(v) - Σ_{e': v ∈ e'} y_e' }
5. Output U

Remark: We can construct U at the end.
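The steps above can be sketched in Python. This is a minimal illustration (function and variable names are mine, not from the slides): for each edge we raise its dual variable until some endpoint's constraint becomes tight, and tight vertices enter the cover.

```python
def vc_primal_dual(vertices, edges, c):
    """Primal-dual 2-approximation for weighted vertex cover:
    raise y_e until an endpoint's dual constraint becomes tight;
    tight ("paid for") vertices join the cover."""
    y = {}
    paid = {u: 0 for u in vertices}   # sum of y_e over edges incident to u
    cover = set()
    for (u, v) in edges:
        delta = min(c[u] - paid[u], c[v] - paid[v])   # 0 if already covered
        y[(u, v)] = delta
        paid[u] += delta
        paid[v] += delta
        for w in (u, v):
            if paid[w] == c[w]:       # dual constraint tight
                cover.add(w)
    return cover, y

# path a - b - c: picking b alone covers both edges
c = {'a': 3, 'b': 2, 'c': 4}
edges = [('a', 'b'), ('b', 'c')]
cover, y = vc_primal_dual(['a', 'b', 'c'], edges, c)
assert all(u in cover or v in cover for (u, v) in edges)   # feasible
assert sum(c[u] for u in cover) <= 2 * sum(y.values())     # 2-approximation
print(cover)   # {'b'}
```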

2-approximation Algorithm - Analysis


There are no uncovered edges and no overpacked vertices. Thus, y and x = x(U) are feasible.

  Σ_{u ∈ V} c(u) x_u = Σ_{u ∈ U} c(u)
                     = Σ_{u ∈ U} Σ_{e: u ∈ e} y_e
                     = Σ_{e ∈ E} |e ∩ U| · y_e
                     ≤(RD) 2·Σ_{e ∈ E} y_e

Thus, x is a 2-approximation.

x and y satisfy the relaxed complementary slackness conditions with r = 2:
  Primal:       x_u = 1 ⟹ Σ_{e: u ∈ e} y_e = c(u)
  Relaxed Dual: y_e > 0 ⟹ 1 ≤ x_u + x_v ≤ 2

Generalized Vertex Cover


Definition:
  Instance: An undirected graph G = (V, E) and a weight function c : V ∪ E → R+.
  Solution: U ⊆ V.
  Measure:  Σ_{u ∈ U} c(u) + Σ_{e: e ∩ U = ∅} c(e)

GVC is a version of VC in which you also pay for uncovered edges.
When c(e) = ∞ for every e ∈ E we get VC.

Generalized Vertex Cover


IP formulation:
(GVC)  min   Σ_{u ∈ V} c(u) x_u + Σ_{e ∈ E} c(e) x_e
       s.t.  x_u + x_v + x_e ≥ 1   ∀e = (u, v) ∈ E
             x_u ∈ {0, 1}          ∀u ∈ V
             x_e ∈ {0, 1}          ∀e ∈ E

(The LP relaxation replaces the integrality constraints with x_u ≥ 0, x_e ≥ 0.)

The Dual:
       max   Σ_{e ∈ E} y_e
       s.t.  Σ_{e: u ∈ e} y_e ≤ c(u)   ∀u ∈ V
             y_e ≤ c(e)                ∀e ∈ E
             y_e ≥ 0                   ∀e ∈ E

2-approximation Algorithm
Observation: For any minimal solution x: 1 ≤ x_u + x_v + x_e ≤ 2.

Algorithm:
1. U ← ∅; y ← 0
2. For each edge e = (u, v) ∈ E:
3.   y_e ← min{ c(u) - Σ_{e': u ∈ e'} y_e', c(v) - Σ_{e': v ∈ e'} y_e', c(e) }
4. Output U = { u : c(u) = Σ_{e: u ∈ e} y_e }

Remarks:
- U need not be minimal, but the Observation still holds.
- We implicitly pay for uncovered edges.
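A sketch of this variant (names are mine): the only change from plain vertex cover is that the dual increase is also capped by the edge's own price c(e), so a cheap edge may simply be left uncovered and paid for.

```python
def gvc_primal_dual(vertices, edges, c_v, c_e):
    """Primal-dual 2-approximation for generalized vertex cover:
    raise y_e until a vertex constraint or the edge's own
    constraint y_e <= c(e) becomes tight."""
    y = {}
    paid = {u: 0 for u in vertices}
    for e in edges:
        u, v = e
        y[e] = min(c_v[u] - paid[u], c_v[v] - paid[v], c_e[e])
        paid[u] += y[e]
        paid[v] += y[e]
    cover = {u for u in vertices if paid[u] == c_v[u]}
    cost = (sum(c_v[u] for u in cover)
            + sum(c_e[e] for e in edges if not (set(e) & cover)))
    return cover, y, cost

# toy instance: the edge penalty is so cheap that covering is not worth it
c_v = {'a': 5, 'b': 5}
c_e = {('a', 'b'): 1}
cover, y, cost = gvc_primal_dual(['a', 'b'], [('a', 'b')], c_v, c_e)
assert cost <= 2 * sum(y.values())   # 2-approximation guarantee
print(cover, cost)   # set() 1 -- pay for the uncovered edge
```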

2-approximation Algorithm - Analysis


y and x = x(U) are feasible by construction.

x is a 2-approximation (for an uncovered edge e, y_e = c(e)):
  Σ_{t ∈ V ∪ E} c(t) x_t = Σ_{u ∈ U} c(u) + Σ_{e: e ∩ U = ∅} c(e)
                         = Σ_{u ∈ U} Σ_{e: u ∈ e} y_e + Σ_{e: e ∩ U = ∅} y_e
                         = Σ_{e ∈ E} |e ∩ U| · y_e + Σ_{e: e ∩ U = ∅} y_e
                         ≤(RD) 2·Σ_{e ∈ E} y_e

2-approximation Algorithm - Analysis


x and y satisfy the relaxed complementary slackness conditions:
  Primal:       x_u = 1 ⟹ Σ_{e: u ∈ e} y_e = c(u)
                x_e = 1 ⟹ y_e = c(e)
  Relaxed Dual: y_e > 0 ⟹ 1 ≤ x_u + x_v + x_e ≤ 2

       min   Σ_{t ∈ V ∪ E} c(t) x_t
       s.t.  x_u + x_v + x_e ≥ 1   ∀e = (u, v) ∈ E
             x_u ≥ 0               ∀u ∈ V
             x_e ≥ 0               ∀e ∈ E

       max   Σ_{e ∈ E} y_e
       s.t.  Σ_{e: u ∈ e} y_e ≤ c(u)   ∀u ∈ V
             y_e ≤ c(e)                ∀e ∈ E
             y_e ≥ 0                   ∀e ∈ E

Generalized Hitting Set


Definition:
  Instance: A collection S of subsets of a ground set U, and a weight function c : U ∪ S → R+.
  Solution: U' ⊆ U.
  Measure:  Σ_{u ∈ U'} c(u) + Σ_{s: s ∩ U' = ∅} c(s)

Remark: We must pay for sets that are not hit by U'.

Example: U = {1, 2, 3, 4, 5}, S = {{1, 2, 3}, {2, 4}, {3, 5}},
         c(u) = 3 for all u ∈ U, c(s) = 2 for all s ∈ S.
A solution: U' = {2, 3}, with cost 6.
The optimal solution: U' = {2}, with cost 3 + 2 = 5 (paying for the unhit set {3, 5}).

Generalized Hitting Set


IP formulation:
(GHS)  min   Σ_{u ∈ U} c(u) x_u + Σ_{s ∈ S} c(s) x_s
       s.t.  Σ_{u ∈ s} x_u + x_s ≥ 1   ∀s ∈ S
             x_u, x_s ∈ {0, 1}         ∀u, s

(The LP relaxation replaces the integrality constraints with x_u, x_s ≥ 0.)

The Dual:
       max   Σ_{s ∈ S} y_s
       s.t.  Σ_{s: u ∈ s} y_s ≤ c(u)   ∀u ∈ U
             y_s ≤ c(s)                ∀s ∈ S
             y_s ≥ 0                   ∀s ∈ S

Δ-approximation Algorithm
Observation: For any minimal solution x:
  1 ≤ Σ_{u ∈ s} x_u + x_s ≤ Δ = max_{s ∈ S} |s|

Algorithm:
1. U' ← ∅; y ← 0
2. For each set s ∈ S:
3.   y_s ← min( { c(u) - Σ_{s': u ∈ s'} y_s' : u ∈ s } ∪ { c(s) } )
4. Return U' = { u : Σ_{s: u ∈ s} y_s = c(u) }
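Running a sketch of this algorithm (names are mine) on the example from the earlier slide reproduces its solution U' = {2, 3}: within the Δ-factor of the optimum {2}, though not optimal itself.

```python
def ghs_primal_dual(ground, sets, c_u, c_s):
    """Primal-dual approximation for generalized hitting set:
    raise y_s until some element's constraint or the set's own
    constraint y_s <= c(s) becomes tight."""
    paid = {u: 0 for u in ground}
    y = {}
    for i, s in enumerate(sets):
        y[i] = min([c_u[u] - paid[u] for u in s] + [c_s[i]])
        for u in s:
            paid[u] += y[i]
    hitting = {u for u in ground if paid[u] == c_u[u]}
    cost = (sum(c_u[u] for u in hitting)
            + sum(c_s[i] for i, s in enumerate(sets)
                  if not (set(s) & hitting)))
    return hitting, y, cost

# the example from the slides: c(u) = 3 for elements, c(s) = 2 for sets
ground = [1, 2, 3, 4, 5]
sets = [{1, 2, 3}, {2, 4}, {3, 5}]
c_u = {u: 3 for u in ground}
c_s = {i: 2 for i in range(len(sets))}
hitting, y, cost = ghs_primal_dual(ground, sets, c_u, c_s)
delta = max(len(s) for s in sets)           # Δ = 3 here
assert cost <= delta * sum(y.values())      # Δ-approximation guarantee
print(hitting, cost)   # {2, 3} 6
```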

Δ-approximation Algorithm - Analysis

y and x = x(U') are feasible by construction.

x is a Δ-approximation:
  Σ_{t ∈ U ∪ S} c(t) x_t = Σ_{u ∈ U'} c(u) + Σ_{s: s ∩ U' = ∅} c(s)
                         = Σ_{u ∈ U'} Σ_{s: u ∈ s} y_s + Σ_{s: s ∩ U' = ∅} y_s
                         = Σ_{s ∈ S} |s ∩ U'| · y_s + Σ_{s: s ∩ U' = ∅} y_s
                         ≤(RD) Δ·Σ_{s ∈ S} y_s

Δ-approximation Algorithm - Analysis

x and y satisfy the relaxed complementary slackness conditions:
  Primal:       x_u = 1 ⟹ Σ_{s: u ∈ s} y_s = c(u)
                x_s = 1 ⟹ y_s = c(s)
  Relaxed Dual: y_s > 0 ⟹ 1 ≤ Σ_{u ∈ s} x_u + x_s ≤ Δ

       min   Σ_{t ∈ U ∪ S} c(t) x_t
       s.t.  Σ_{u ∈ s} x_u + x_s ≥ 1   ∀s ∈ S
             x_u, x_s ≥ 0              ∀u, s

       max   Σ_{s ∈ S} y_s
       s.t.  Σ_{s: u ∈ s} y_s ≤ c(u)   ∀u ∈ U
             y_s ≤ c(s)                ∀s ∈ S
             y_s ≥ 0                   ∀s ∈ S

Generalized Hitting Set


Let α ∈ [1, Δ].

       min   Σ_{t ∈ U ∪ S} c(t) x_t
       s.t.  Σ_{u ∈ s} x_u + x_s ≥ 1   ∀s ∈ S
             x_u, x_s ≥ 0              ∀u, s

       max   Σ_{s ∈ S} y_s
       s.t.  Σ_{s: u ∈ s} y_s ≤ c(u)   ∀u ∈ U
             y_s ≤ c(s)                ∀s ∈ S
             y_s ≥ 0                   ∀s ∈ S

Observation: If for any minimal solution x we have Σ_{u ∈ s} x_u + x_s ≤ α for every s ∈ S, then the same argument yields an α-approximation.

Primal Dual Schema - Intuition


Primal:
min   Σ_{j=1}^n c_j x_j
s.t.  Σ_{j=1}^n a_ij x_j ≥ b_i   ∀i
      x_j ≥ 0                    ∀j

Dual:
max   Σ_{i=1}^m b_i y_i
s.t.  Σ_{i=1}^m a_ij y_i ≤ c_j   ∀j
      y_i ≥ 0                    ∀i

An advancement step: find a vector ε = (ε_1, ..., ε_m) such that:
- y + ε is feasible.
- y + ε is no worse than y, i.e., b^T(y + ε) - b^T y = b^T ε ≥ 0.
- For any i, if ε_i > 0, then for any minimal solution x, b_i ≤ Σ_{j=1}^n a_ij x_j ≤ r·b_i.
