Optimization Methods
One-Dimensional Unconstrained Optimization
Golden-Section Search
Quadratic Interpolation
Newton's Method
Multi-Dimensional Optimization

Root finding solves f(x) = 0; optimization solves f'(x) = 0. Applying Newton's root-finding iteration to f' gives, in one dimension,

    x_{i+1} = x_i − f'(x_i) / f''(x_i)

and in n dimensions

    x_{i+1} = x_i − H_i⁻¹ ∇f(x_i)

where H_i is the Hessian matrix (the matrix of 2nd partial derivatives) of f evaluated at x_i.
Newton's Method

    x_{i+1} = x_i − H_i⁻¹ ∇f(x_i)

Converges quadratically.
May diverge if the starting point is not close enough to the optimum point.
Evaluating H⁻¹ (or solving the Newton system) at every iteration is very costly.
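As a minimal sketch, the multi-dimensional Newton iteration in NumPy, solving the linear system H_i s = ∇f(x_i) rather than forming H⁻¹ explicitly (the test function, tolerances, and names are illustrative, not from the slides):

```python
import numpy as np

def newton_optimize(grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton's method for optimization: x_{i+1} = x_i - H_i^{-1} grad f(x_i)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H_i s = grad f(x_i) instead of inverting H_i
        x = x - np.linalg.solve(hess(x), g)
    return x

# Illustrative quadratic: f(x, y) = x^2 - 2x + 100 + y^2, minimum at (1, 0)
grad = lambda v: np.array([2*v[0] - 2, 2*v[1]])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 2.0]])
x_star = newton_optimize(grad, hess, [5.0, 5.0])
```

For a quadratic the Hessian is constant, so a single Newton step lands exactly on the optimum; the quadratic convergence claim above is about general smooth f near the optimum.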
Marquardt Method
Idea
When the current guess is far from the optimum point, use the Steepest Ascent (Cauchy's) method.
As the guess gets closer and closer to the optimum point, gradually switch to Newton's method.
In a given problem it is not known in advance whether the chosen initial point is far from or close to the optimum.
So we need a method that takes advantage of both.
The Marquardt method achieves this objective by modifying the Hessian matrix H in Newton's method in the following way:

    x_{i+1} = x_i − H̃_i⁻¹ ∇f(x_i)

where

    H̃_i = H_i + α_i I
When α_i is large,

    H̃_i = H_i + α_i I ≈ α_i I

so the iteration reduces to a small steepest-descent step:

    x_{i+1} = x_i − H̃_i⁻¹ ∇f(x_i) ≈ x_i − (1/α_i) ∇f(x_i)
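The full iteration can be sketched as follows. The slides do not say how α_i is reduced, so the geometric shrink schedule, the stopping test, and all names here are my assumptions:

```python
import numpy as np

def marquardt(grad_f, hess_f, x0, alpha0=1e4, shrink=0.5, tol=1e-8, max_iter=200):
    """Marquardt-modified Newton: x+ = x - (H + alpha*I)^{-1} grad f(x).

    Large alpha gives a small steepest-descent step; as alpha -> 0
    the update turns into the pure Newton step.
    """
    x = np.asarray(x0, dtype=float)
    alpha = alpha0
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        H_tilde = hess_f(x) + alpha * np.eye(len(x))   # H~ = H + alpha*I
        x = x - np.linalg.solve(H_tilde, g)
        alpha *= shrink                                # drift toward Newton
    return x

# Illustrative convex test function: f(x, y) = (x-1)^2 + 2*(y+2)^2
g = lambda v: np.array([2*(v[0] - 1), 4*(v[1] + 2)])
h = lambda v: np.array([[2.0, 0.0], [0.0, 4.0]])
x_star = marquardt(g, h, [10.0, 10.0])
```

Practical implementations (e.g. Levenberg–Marquardt in least squares) adapt α up or down depending on whether the step reduced f; the fixed shrink factor here is only the simplest possible schedule.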
Example: find the minimum point of

    f(x, y) = (x² + y − 11)² + (x + y² − 7)²

Partial derivatives:

    ∂f/∂x = 4x(x² + y − 11) + 2(x + y² − 7) = 4x³ + 4xy − 42x + 2y² − 14
    ∂f/∂y = 2(x² + y − 11) + 4y(x + y² − 7) = 2x² − 22 + 4xy + 4y³ − 26y

Second partial derivatives:

    ∂²f/∂x² = 12x² + 4y − 42;   ∂²f/∂y² = 4x + 12y² − 26;   ∂²f/∂x∂y = 4x + 4y

    H = [ 12x² + 4y − 42    4x + 4y        ]
        [ 4x + 4y           4x + 12y² − 26 ]

At (0, 0), ∇f = [ −14  −22 ]^T and

    H = [ −42    0 ]
        [   0  −26 ]
With α₀ = 100:

    H̃₀ = [ −42    0 ] + 100 [ 1  0 ] = [ 58   0 ]
          [   0  −26 ]       [ 0  1 ]   [  0  74 ]

    x₁ = [ 0 ] − (1/4292) [ 74   0 ] [ −14 ] = [ 0.241 ]
         [ 0 ]            [  0  58 ] [ −22 ]   [ 0.297 ]
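This first iteration's arithmetic can be checked numerically; the snippet below just redoes the slide's matrix algebra with NumPy:

```python
import numpy as np

# Gradient and Hessian of f(x,y) = (x^2+y-11)^2 + (x+y^2-7)^2 at (0, 0)
grad0 = np.array([-14.0, -22.0])
H0 = np.array([[-42.0, 0.0],
               [  0.0, -26.0]])

alpha = 100.0
H_tilde = H0 + alpha * np.eye(2)              # [[58, 0], [0, 74]]
x1 = np.zeros(2) - np.linalg.solve(H_tilde, grad0)
# x1 = [14/58, 22/74], i.e. approximately [0.241, 0.297]
```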
At x₁ = (0.241, 0.297), ∇f = [ −23.64  −29.21 ]^T and

    H = [ −40.115    2.152  ]
        [   2.152  −23.754  ]

With α₁ = 50:

    H̃₁ = [ −40.115    2.152 ] + 50 [ 1  0 ] = [ 9.885   2.152  ]
          [   2.152  −23.754 ]      [ 0  1 ]   [ 2.152  26.246  ]

    x₂ = [ 0.241 ] − H̃₁⁻¹ [ −23.64 ] = [ 2.738 ]
         [ 0.297 ]         [ −29.21 ]   [ 1.749 ]
Example

(i) f(x) = x² − 2x + 100

Solution:

    f'(x) = 2x − 2 = 0  ⇒  x = 1
    f''(1) = 2 (+ve)  ⇒  x = 1 is a local minimum
(ii) f(x, y) = 2xy + 1.5y − 1.25x² − 2y²

    ∂f/∂x = 2y − 2.5x,   ∂f/∂y = 2x + 1.5 − 4y

Setting ∇f = 0, we have

    2y − 2.5x = 0
    2x + 1.5 − 4y = 0

Solving the system yields x = 0.5 and y = 0.625.
We still have to test whether the point is a local maximum, a local minimum, or a saddle point.
    H = [ ∂²f/∂x²    ∂²f/∂x∂y ] = [ −2.5    2 ]
        [ ∂²f/∂y∂x   ∂²f/∂y²  ]   [  2     −4 ]

    |H| = (−2.5)(−4) − (2)(2) = 6

Since |H| > 0 and ∂²f/∂x² < 0, the point (0.5, 0.625) is a local maximum.
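The two-variable second-derivative test used here can be wrapped in a small helper (a sketch; the function name and return labels are mine, not from the slides):

```python
def classify_2d(fxx, fyy, fxy):
    """Classify a stationary point of f(x, y) from its Hessian entries."""
    det_H = fxx * fyy - fxy * fxy
    if det_H < 0:
        return "saddle point"        # H indefinite
    if det_H > 0:
        return "local minimum" if fxx > 0 else "local maximum"
    return "inconclusive"            # |H| = 0: test says nothing

# Example (ii): H = [[-2.5, 2], [2, -4]] at (0.5, 0.625)
kind = classify_2d(-2.5, -4.0, 2.0)
```

Applied to example (ii) this returns "local maximum", matching the conclusion above.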
(iii) f(x, y) = (x − 2)² − (y − 3)²

    ∂f/∂x = 2(x − 2) = 2x − 4,   ∂f/∂y = −2(y − 3) = −2y + 6

    H = [ ∂²f/∂x²    ∂²f/∂x∂y ] = [ 2    0 ]
        [ ∂²f/∂y∂x   ∂²f/∂y²  ]   [ 0   −2 ]

    |H| = −4

Since |H| < 0 (i.e., H is indefinite), the stationary point (2, 3) is a saddle point, even though ∂²f/∂x² > 0.
(iv) f(x, y, z) = x² + y² + z² − 2xz + xy − 3yz + 10

    ∂f/∂x = 2x − 2z + y,   ∂f/∂y = 2y + x − 3z,   ∂f/∂z = 2z − 2x − 3y

    H = [  2   1  −2 ]
        [  1   2  −3 ]
        [ −2  −3   2 ]

Test the leading principal minors:

    H₁₁ = 2 > 0

    H₂₂ = | 2  1 | = 4 − 1 = 3 > 0
          | 1  2 |

Forward elimination on H gives

    [ 2   1     −2     ]
    [ 0   1.5   −2     ]
    [ 0   0   −4/1.5   ]

so the determinant is the product of the pivots:

    H₃₃ = |H| = (2)(1.5)(−4/1.5) = −8 < 0

Since the leading principal minors have mixed signs, H is indefinite and the stationary point is a saddle point.
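The leading-principal-minor test can be checked directly; here each minor is computed as a determinant of the top-left k×k block rather than by forward elimination:

```python
import numpy as np

# Hessian of f(x,y,z) = x^2 + y^2 + z^2 - 2xz + xy - 3yz + 10
H = np.array([[ 2.0,  1.0, -2.0],
              [ 1.0,  2.0, -3.0],
              [-2.0, -3.0,  2.0]])

# Leading principal minors H_11, H_22, H_33 = |H|
minors = [np.linalg.det(H[:k, :k]) for k in (1, 2, 3)]
# Mixed signs (2, 3, -8) confirm H is indefinite
```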
Exercise: which of the following are stationary points of f(x, y) = x³ + y³ − 3xy, and of what kind?

1. (0, 0)
2. (1, 0)
3. (−1, −1)
4. (1, 1)
Exercise solution

f(x, y) = x³ + y³ − 3xy

    ∂f/∂x = 3x² − 3y,   ∂f/∂y = 3y² − 3x

    H = [ 6x   −3 ]
        [ −3   6y ]

At (0, 0), ∇f = [ 0  0 ]^T and

    H = [  0  −3 ]
        [ −3   0 ]

Since |H| = 0 − 9 = −9 < 0, H is indefinite and (0, 0) is a saddle point.

At (1, 1), ∇f = [ 0  0 ]^T and

    H = [  6  −3 ]
        [ −3   6 ]

Since ∇f = 0, h₁₁ = 6 > 0 and |H| = 36 − 9 = 27 > 0, (1, 1) is a local minimum.
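A quick numeric check of this solution, hard-coding the gradient and Hessian derived above:

```python
import numpy as np

def grad(x, y):
    """Gradient of f(x, y) = x^3 + y^3 - 3xy."""
    return np.array([3*x**2 - 3*y, 3*y**2 - 3*x])

def hess(x, y):
    """Hessian of f(x, y) = x^3 + y^3 - 3xy."""
    return np.array([[6.0*x, -3.0], [-3.0, 6.0*y]])

# Both candidate points are stationary: the gradient vanishes
g00, g11 = grad(0, 0), grad(1, 1)

# Second-derivative test via the Hessian determinant
det00 = np.linalg.det(hess(0, 0))   # -9: indefinite, saddle point
det11 = np.linalg.det(hess(1, 1))   # 27: with h11 = 6 > 0, local minimum
```

The other two candidates, (1, 0) and (−1, −1), fail ∇f = 0 and are not stationary at all.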
Summary

Gradient: what it is and how to derive it
Hessian matrix: what it is and how to derive it
How to test whether a point is a maximum, a minimum, or a saddle point
Steepest Ascent method vs. Conjugate-Gradient approach vs. Newton's method