for each x ∈ S(x_0) and each λ, 0 < λ ≤ 1/4K, so that S*(x, λ) is a nonempty subset of S(x_0). If {x_k}_{k=0}^∞ is any sequence for which x_{k+1} ∈ S*(x_k, λ), k = 0, 1, 2, ⋯, then (1) implies that the sequence {f(x_k)}_{k=0}^∞, which is bounded below, is monotone nonincreasing and hence that |∇f(x_k)| → 0 as k → ∞. The remainder of the theorem follows from Condition IV.
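Spelled out, the monotonicity step runs as follows (our gloss, not the paper's text; it assumes, as the corollary proofs below suggest, that x̄ ∈ S*(x, λ) means f(x̄) − f(x) ≤ −λ|∇f(x)|²):

```latex
% From x_{k+1} \in S^*(x_k,\lambda):
f(x_{k+1}) - f(x_k) \le -\lambda\,\lvert\nabla f(x_k)\rvert^2 \le 0 ,
% so \{f(x_k)\} is nonincreasing; summing from k = 0 to N and telescoping,
\lambda \sum_{k=0}^{N} \lvert\nabla f(x_k)\rvert^2
    \le f(x_0) - f(x_{N+1}) \le f(x_0) - \inf f < \infty ,
% hence \lvert\nabla f(x_k)\rvert \to 0 as k \to \infty.
```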
COROLLARY 1. (The Steepest Descent Algorithm) If

x_{k+1} = x_k − (1/2K)∇f(x_k), k = 0, 1, 2, ⋯,

then the sequence {x_k}_{k=0}^∞ converges to the point x* which minimizes f.
Proof. It follows from the proof of the convergence theorem that the sequence {x_k}_{k=0}^∞ defined in the statement of Corollary 1 is such that x_{k+1} ∈ S*(x_k, 1/4K), k = 0, 1, 2, ⋯.
COROLLARY 2. (The Modified Steepest Descent Algorithm) If α is an arbitrarily assigned positive number, α_m = α/2^(m−1), m = 1, 2, ⋯, and x_{k+1} = x_k − α_{m_k}∇f(x_k) where m_k is the smallest positive integer for which

(2)  f(x_k − α_{m_k}∇f(x_k)) − f(x_k) ≤ −(1/2)α_{m_k}|∇f(x_k)|²,

k = 0, 1, 2, ⋯, then the sequence {x_k}_{k=0}^∞ converges to the point x* which minimizes f.
Proof. It follows from the proof of the convergence theorem that if x ∈ S(x_0) and x̄ = x − η∇f(x), then f(x̄) − f(x) ≤ −(1/2)η|∇f(x)|² for 0 ≤ η ≤ 1/2K. If α ≤ 1/2K, then for the sequence {x_k}_{k=0}^∞ in the statement of Corollary 2, m_k = 1 and x_{k+1} ∈ S*(x_k, (1/2)α), k = 0, 1, 2, ⋯. If α > 1/2K, then the integers m_k exist and α_{m_k} > 1/4K, so that x_{k+1} ∈ S*(x_k, 1/8K), k = 0, 1, 2, ⋯.
MINIMIZATION OF FUNCTIONS 3
3. Discussion. The convergence theorem proves convergence
under hypotheses which are more restrictive than those imposed by
Curry [1] but less restrictive than those imposed by Goldstein [2].
However, both the algorithms which we have considered would be
considerably easier to apply than the algorithm proposed by Curry
since his algorithm requires the minimization of a function of one
variable at each step. The method of Goldstein requires the assumption
that f ∈ C² on S(x_0) and that S(x_0) be bounded. It also requires knowledge of a bound for the norm of the Hessian matrix of f on S(x_0), but yields an estimate for the ultimate rate of convergence of the
gradient method. It should be pointed out that the modified steepest
descent algorithm of Corollary 2 allows for the possibility of variable
stepsize and does not require knowledge of the value of the Lipschitz
constant K.
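The modified rule of Corollary 2 is easy to sketch in code (again our own illustration, with assumed names and an assumed quadratic test function; condition (2) is tested directly at each trial step, so no knowledge of the Lipschitz constant K is required):

```python
import numpy as np

def modified_steepest_descent(f, grad_f, x0, alpha=1.0, steps=50):
    # Corollary 2: try alpha_m = alpha / 2**(m-1), m = 1, 2, ..., and take
    # the smallest m satisfying condition (2):
    #   f(x - a*g) - f(x) <= -(1/2) * a * |g|^2.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad_f(x)
        a = alpha
        while f(x - a * g) - f(x) > -0.5 * a * np.dot(g, g):
            a *= 0.5  # m -> m + 1: halve the trial stepsize
        x = x - a * g
    return x

# Example: f(x) = |x|^2 on R^2; the minimizer x* is the origin.
f = lambda x: float(np.dot(x, x))
x_star = modified_steepest_descent(f, lambda x: 2.0 * x, [4.0, -3.0])
```

When ∇f(x_k) = 0 the test in the while loop fails immediately (0 > 0 is false), so the backtracking loop always terminates and the iterate simply stays put at a stationary point.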
The author is indebted to the referee for his comments and
suggestions.
REFERENCES
1. H. B. Curry, The method of steepest descent for nonlinear minimization problems,
Quart. Appl. Math. 2 (1944), 258-263.
2. A. A. Goldstein, Cauchy's method of minimization, Numer. Math. 4 (1962), 146-150.