Professor Dr. Sebastian Engell
Process Dynamics and Operations Group

4.2 Problems with Equality Constraints


General form:

    \min_x f(x)

    \text{subject to:}\quad h(x) = \begin{pmatrix} h_1(x) \\ \vdots \\ h_m(x) \end{pmatrix} = 0

Simplest case: the m equality constraints can be solved for m variables contained in x:

    h(x) = 0 \;\Rightarrow\; x_1 = \tilde h(x_2), \quad x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \quad x_1 \in \mathbb{R}^m

New problem of dimension (n - m):

    \min_{x_2} f(x_2, \tilde h(x_2))
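
To make the elimination step concrete, here is a minimal numerical sketch (not part of the slides; it anticipates the example of the next slide and uses scipy). The helper names f, h_tilde and f_reduced are ours.

```python
# Minimal sketch: eliminate x1 via the equality constraint and minimize
# the reduced problem of dimension n - m (here n = 2, m = 1).
# Example data from the slides: f(x) = 4 x1^2 + 5 x2^2,
# h(x) = 2 x1 + 3 x2 - 6 = 0, hence x1 = h~(x2) = 3 - 1.5 x2.
from scipy.optimize import minimize_scalar

def f(x1, x2):
    return 4.0 * x1**2 + 5.0 * x2**2

def h_tilde(x2):
    # the constraint solved for x1
    return 3.0 - 1.5 * x2

def f_reduced(x2):
    # f~(x2) = f(h~(x2), x2)
    return f(h_tilde(x2), x2)

res = minimize_scalar(f_reduced)   # unconstrained problem in x2 only
x2_star = res.x
x1_star = h_tilde(x2_star)
print(x1_star, x2_star)            # approx. 1.07, 1.29
```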


Reduced Gradient Method


Reduced cost function:

    f(x_2, \tilde h(x_2)) =: \tilde f(x_2)

Idea: apply the necessary optimality condition

    \nabla_{x_2} \tilde f(x_2^*) = 0

and solve it for x_2^*.

Reduced gradient:

    \nabla_{x_2} \tilde f(x_2) = \nabla_{x_2} f + \left( \frac{\partial \tilde h}{\partial x_2} \right)^T \nabla_{x_1} f
                               = \nabla_{x_2} f - \left( \left( \frac{\partial h}{\partial x_1} \right)^{-1} \frac{\partial h}{\partial x_2} \right)^T \nabla_{x_1} f

Example:

    \min_x f(x) \quad\text{with}\quad f(x) = 4x_1^2 + 5x_2^2, \quad h(x) = 2x_1 + 3x_2 - 6 = 0

    x_1 = 3 - 1.5\,x_2 =: \tilde h(x_2) \quad\Rightarrow\quad \tilde f(x_2) = 14x_2^2 - 36x_2 + 36

    \nabla_{x_2} \tilde f(x_2) = 10x_2 - 12x_1 = 0, \quad \text{solve for } x_1, x_2:

    x^* \approx \begin{pmatrix} 1.07 \\ 1.29 \end{pmatrix}
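
As a sanity check of this example, the sketch below (ours, not from the slides) evaluates the reduced gradient from the general formula using the Jacobian blocks dh/dx1 and dh/dx2; the function names are illustrative.

```python
# Sketch: reduced gradient from the Jacobian blocks for the example above
# (m = 1, n = 2, linear constraint 2 x1 + 3 x2 - 6 = 0).
import numpy as np

def grad_f(x):
    # gradient of f(x) = 4 x1^2 + 5 x2^2
    return np.array([8.0 * x[0], 10.0 * x[1]])

dh_dx1 = np.array([[2.0]])   # dh/dx1, m x m, assumed invertible
dh_dx2 = np.array([[3.0]])   # dh/dx2, m x (n - m)

def reduced_gradient(x):
    g1, g2 = grad_f(x)[:1], grad_f(x)[1:]   # split into grad_x1 f, grad_x2 f
    # grad_x2 f~ = grad_x2 f - ((dh/dx1)^{-1} dh/dx2)^T grad_x1 f
    return g2 - np.linalg.solve(dh_dx1, dh_dx2).T @ g1

x_star = np.array([15.0 / 14.0, 9.0 / 7.0])   # approx. (1.07, 1.29)
print(reduced_gradient(x_star))               # approx. [0.] -> stationary
```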

Reduced gradient + reduced cost function: unconstrained minimization, can be solved numerically.

Problems:
- the nonlinear constraints can often not be solved analytically
- several solutions may exist (all must be considered)

Alternative approach: Lagrange multipliers

Extend the cost function by the equality constraints.

Lagrangian:

    L(x, \lambda) = f(x) + \lambda^T h(x)

with the Lagrange multipliers \lambda^T = (\lambda_1, \lambda_2, \dots, \lambda_m).
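
In practice, when the constraints cannot be eliminated analytically, a general NLP solver handles h(x) = 0 directly. The sketch below (ours, not from the slides) uses scipy's SLSQP, an SQP-type method built around the Lagrangian of the problem, on the running example.

```python
# Sketch: solve the running example with an equality-constrained solver
# instead of eliminating variables by hand.
import numpy as np
from scipy.optimize import minimize

f = lambda x: 4.0 * x[0]**2 + 5.0 * x[1]**2
h = lambda x: 2.0 * x[0] + 3.0 * x[1] - 6.0   # equality constraint h(x) = 0

res = minimize(f, x0=np.array([0.0, 0.0]),
               constraints=[{"type": "eq", "fun": h}],
               method="SLSQP")
print(res.x)   # approx. [1.07, 1.29]
```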


Necessary Conditions for a Minimum (x^*, \lambda^*)

    \nabla_x L(x^*, \lambda^*) = \nabla_x f(x^*) + \frac{\partial h}{\partial x}(x^*)\, \lambda^* = 0

    \nabla_\lambda L(x^*, \lambda^*) = h(x^*) = 0

[Figure: constraint curve h(x) = 0 with the gradient \nabla_x f, the constraint normal \partial h / \partial x and a feasible variation dx at the minimum x^*]

Example:

    \min_x f(x) \quad\text{with}\quad f(x) = 4x_1^2 + 5x_2^2, \quad h(x) = 2x_1 + 3x_2 - 6 = 0

    L(x, \lambda) = 4x_1^2 + 5x_2^2 + \lambda (2x_1 + 3x_2 - 6)

    \nabla_x L(x^*, \lambda^*) = \begin{pmatrix} 8x_1^* + 2\lambda^* \\ 10x_2^* + 3\lambda^* \end{pmatrix} = 0, \quad \nabla_\lambda L(x^*, \lambda^*) = 2x_1^* + 3x_2^* - 6 = 0

Solve for x_1^*, x_2^*, and \lambda^*:

    \lambda^* = -\frac{30}{7}, \quad x^* \approx \begin{pmatrix} 1.07 \\ 1.29 \end{pmatrix}
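
Because the objective is quadratic and the constraint is linear, the necessary conditions above are a linear system in (x_1^*, x_2^*, \lambda^*); a minimal sketch of solving it (ours, not from the slides):

```python
# Sketch: the necessary conditions of the example as a linear system
#   8 x1         + 2 lambda = 0
#          10 x2 + 3 lambda = 0
#   2 x1 + 3 x2             = 6
import numpy as np

A = np.array([[8.0,  0.0, 2.0],
              [0.0, 10.0, 3.0],
              [2.0,  3.0, 0.0]])
b = np.array([0.0, 0.0, 6.0])

x1, x2, lam = np.linalg.solve(A, b)
print(x1, x2, lam)   # approx. 1.07, 1.29, -4.29 (lambda* = -30/7)
```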

Sufficient Condition
Assume (x^*, \lambda^*) fulfills the necessary conditions.

Let v = x - x^* be a feasible variation, i.e.

    \left( \frac{\partial h}{\partial x}(x^*) \right)^T v = 0

Then

    v^T \nabla_{xx}^2 L(x^*, \lambda^*)\, v > 0

is a sufficient condition for local optimality.

Use of the Lagrange multipliers for sensitivity analysis:

Assume a small variation of h(x):  h(x) = \varepsilon

    L(x, \lambda, \varepsilon) = f(x) + \lambda^T (h(x) - \varepsilon)

The minimum is a function of \varepsilon:  x^*(\varepsilon), \lambda^*(\varepsilon)

    \frac{\partial f(x^*)}{\partial \varepsilon_i} = -\lambda_i^*

\lambda_i^* is a measure of the sensitivity of f(x^*) with respect to a variation \varepsilon_i of h_i(x).
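
The sketch below (ours, not from the slides) checks both statements for the running example: the Hessian of the Lagrangian is positive definite on the feasible subspace, and a finite-difference perturbation of the constraint reproduces -\lambda^* as the sensitivity of f(x^*).

```python
# Sketch: second-order check and sensitivity for f = 4 x1^2 + 5 x2^2,
# h = 2 x1 + 3 x2 - 6 = 0, lambda* = -30/7.
import numpy as np
from scipy.linalg import null_space

H = np.array([[8.0, 0.0],        # Hessian of L w.r.t. x (h is linear,
              [0.0, 10.0]])      # so it equals the Hessian of f)
dh_dx = np.array([[2.0, 3.0]])   # Jacobian of h, 1 x 2

# feasible variations v satisfy (dh/dx) v = 0
V = null_space(dh_dx)                                 # basis of the feasible subspace
print(np.all(np.linalg.eigvalsh(V.T @ H @ V) > 0))    # True -> local minimum

# sensitivity: perturb the constraint to h(x) = eps, i.e. 2 x1 + 3 x2 = 6 + eps
def f_star(rhs):
    # optimum from the linear necessary conditions for the perturbed constraint
    A = np.array([[8.0, 0.0, 2.0], [0.0, 10.0, 3.0], [2.0, 3.0, 0.0]])
    x1, x2, _ = np.linalg.solve(A, np.array([0.0, 0.0, rhs]))
    return 4.0 * x1**2 + 5.0 * x2**2

eps = 1e-3
lam_star = -30.0 / 7.0
print((f_star(6.0 + eps) - f_star(6.0)) / eps, -lam_star)   # both approx. 30/7
```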