
Mathematical Foundations -1- Optimization

John Riley September 16, 2013


Unconstrained Optimization

Key Ideas: first order conditions (FOC), second order conditions (SOC), necessary and sufficient conditions

Necessary condition for a maximum, $x \in \mathbb{R}$ .... 3
Sufficient conditions .... 5
Necessary condition for a maximum, $x \in \mathbb{R}^n$ .... 6
Sufficient conditions .... 7
Differentiable and continuously differentiable functions .... 8

Lemma 1: If $\frac{df}{dx}(x^0) > 0$ then for some $\delta$-neighborhood $N(x^0, \delta)$: if $x > x^0$ then $f(x) > f(x^0)$, and if $x < x^0$ then $f(x) < f(x^0)$.
Proof: If $\frac{df}{dx}(x^0) > 0$ then for any $\varepsilon > 0$ there exists a deleted $\delta$-neighborhood $N_D(x^0, \delta)$ such that for all $x$ in this deleted neighborhood

$$\frac{df}{dx}(x^0) - \varepsilon < \frac{f(x) - f(x^0)}{x - x^0} < \frac{df}{dx}(x^0) + \varepsilon .$$

Choose $\varepsilon = \frac{1}{2}\frac{df}{dx}(x^0)$. Then

$$0 < \frac{1}{2}\frac{df}{dx}(x^0) < \frac{f(x) - f(x^0)}{x - x^0}$$

for any $x$ in $N_D(x^0, \delta)$. Multiplying through by $x - x^0$ (positive for $x > x^0$, negative for $x < x^0$) yields the two stated inequalities.
Q.E.D.
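The lemma can also be illustrated numerically. The sketch below is not part of the original notes; it uses the hypothetical sample function $f(x) = x^3$, whose derivative at $x^0 = 1$ is $3 > 0$, and checks the two stated inequalities on a small neighborhood:

```python
# Numerical illustration of Lemma 1 (added example, not from the notes):
# if f'(x0) > 0, then f(x) > f(x0) for x slightly above x0,
# and f(x) < f(x0) for x slightly below x0.

def f(x):
    return x ** 3          # sample function with f'(1) = 3 > 0

x0 = 1.0
delta = 0.1                # a small neighborhood radius

above = [x0 + delta * k / 10 for k in range(1, 11)]
below = [x0 - delta * k / 10 for k in range(1, 11)]

assert all(f(x) > f(x0) for x in above)   # f(x) > f(x0) when x > x0
assert all(f(x) < f(x0) for x in below)   # f(x) < f(x0) when x < x0
print("Lemma 1 holds on the sampled neighborhood")
```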
Lemma 2: If $\frac{df}{dx}(x^0) < 0$ then for some $\delta$-neighborhood $N(x^0, \delta)$: if $x > x^0$ then $f(x) < f(x^0)$, and if $x < x^0$ then $f(x) > f(x^0)$.

(The proof is almost identical.)
Necessary conditions

$x \in \mathbb{R}$: $\max_{x \in X} f(x)$ where $X = [\alpha, \beta]$.

For an interior max ($x^0 \in \operatorname{int} X = (\alpha, \beta)$),

FOC: $\frac{df}{dx}(x^0) = 0$.
Proof by contradiction:

Suppose $\frac{df}{dx}(x^0) > 0$. By Lemma 1, for some $\delta$-neighborhood $N(x^0, \delta)$, if $x > x^0$ then $f(x) > f(x^0)$. But then $f(x)$ cannot take on its maximum at $x^0$ after all. Similarly, if $\frac{df}{dx}(x^0) < 0$, then by Lemma 2, if $x < x^0$ then $f(x) > f(x^0)$, again contradicting a maximum at $x^0$.
Q.E.D.



SOC: $\frac{d^2 f}{dx^2}(x^0) \le 0$.

Proof by contradiction:

Suppose $\frac{d}{dx}\frac{df}{dx}(x^0) > 0$. By Lemma 1 (applied to $\frac{df}{dx}$),

$$\frac{\frac{df}{dx}(x) - \frac{df}{dx}(x^0)}{x - x^0} > 0$$

in some deleted $\delta$-neighborhood of $x^0$. For any $x \in (x^0, x^0 + \delta)$ it follows that $\frac{df}{dx}(x) - \frac{df}{dx}(x^0) > 0$. Appealing to the FOC ($\frac{df}{dx}(x^0) = 0$), it follows that $\frac{df}{dx}(x) > 0$. Then, by Lemma 1, $f(x) > f(x^0)$ for $x > x^0$ in some $\delta$-neighborhood of $x^0$, so $f$ cannot take on its maximum at $x^0$ after all.

Class exercise:
What is the necessary condition for a maximum on the boundary?
Sufficient conditions for a local maximum: $\frac{df}{dx}(x^0) = 0$, $\frac{d}{dx}\frac{df}{dx}(x^0) < 0$.

By Lemma 2, in some deleted $\delta$-neighborhood of $x^0$,

$$\frac{\frac{df}{dx}(x) - \frac{df}{dx}(x^0)}{x - x^0} = \frac{\frac{df}{dx}(x)}{x - x^0} < 0 .$$

Therefore $\frac{df}{dx}(x) < 0$ on $(x^0, x^0 + \delta)$, and so by Lemma 2, $f(x)$ is strictly decreasing on $(x^0, x^0 + \delta)$. By an almost identical argument, $f(x)$ is strictly increasing on $(x^0 - \delta, x^0)$. Hence $f$ has a local maximum at $x^0$.
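As a numerical sanity check (an added sketch, with $f(x) = -(x-1)^2$ as an assumed example), the two sufficient conditions can be verified by finite differences, and nearby function values confirm a local maximum:

```python
# Check the sufficient conditions f'(x0) = 0 and f''(x0) < 0 numerically
# for the assumed example f(x) = -(x - 1)**2 at x0 = 1.

def f(x):
    return -(x - 1) ** 2

x0, h = 1.0, 1e-5

fprime = (f(x0 + h) - f(x0 - h)) / (2 * h)             # central difference ~ f'(x0)
fsecond = (f(x0 + h) - 2 * f(x0) + f(x0 - h)) / h**2   # second difference ~ f''(x0)

assert abs(fprime) < 1e-8      # FOC: f'(x0) = 0
assert fsecond < 0             # SOC: f''(x0) < 0

# f is strictly decreasing to the right of x0 and increasing to the left,
# so x0 is a local maximum:
assert all(f(x0) > f(x0 + t) for t in (0.01, 0.05, 0.1))
assert all(f(x0) > f(x0 - t) for t in (0.01, 0.05, 0.1))
print("local maximum confirmed at x0 = 1")
```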
$x \in \mathbb{R}^n$: $\max_{x \in X} f(x)$ where $X$ is a convex subset of $\mathbb{R}^n$.

Necessary conditions (FOC), SOC: $f: \mathbb{R}^n \to \mathbb{R}$. Define $g(\lambda) = f(x(\lambda))$ where $x(\lambda) = x^0 + \lambda(x^1 - x^0)$. Then

$$\frac{dg}{d\lambda}(\lambda) = \sum_{i=1}^n \frac{\partial f}{\partial x_i}(x(\lambda))\,(x_i^1 - x_i^0) = (x^1 - x^0)'\,\frac{\partial f}{\partial x}(x(\lambda)) .$$

FOC (interior max): $\frac{\partial f}{\partial x}(x^0) = 0$.

Differentiating again,

$$\frac{d}{d\lambda}\!\left(\frac{dg}{d\lambda}\right) = (x^1 - x^0)'\left[\frac{\partial^2 f}{\partial x_i \partial x_j}(x(\lambda))\right](x^1 - x^0) .$$

SOC: The Hessian matrix of second partial derivatives must be negative semi-definite.
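For a concrete check (an added sketch, not from the notes), take the assumed example $f(x,y) = -x^2 + xy - y^2$, whose Hessian is the constant matrix with rows $(-2, 1)$ and $(1, -2)$. For a symmetric $2 \times 2$ matrix, negative definiteness is equivalent to a negative top-left entry and a positive determinant:

```python
# Assumed example: f(x, y) = -x**2 + x*y - y**2 has constant Hessian
# H = [[-2, 1], [1, -2]]. For a symmetric 2x2 matrix, negative definiteness
# is equivalent to H[0][0] < 0 and det(H) > 0 (leading principal minors).

H = [[-2.0, 1.0],
     [1.0, -2.0]]

det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
negative_definite = H[0][0] < 0 and det > 0
assert negative_definite

def f(x, y):
    return -x**2 + x*y - y**2

# (0, 0) is a critical point; a negative definite Hessian there makes it a
# strict local maximum, so nearby sample points all give lower values.
pts = [(0.1, 0.0), (-0.1, 0.1), (0.05, -0.05), (0.0, 0.1)]
assert all(f(x, y) < f(0.0, 0.0) for (x, y) in pts)
print("Hessian negative definite; (0, 0) is a strict local maximum")
```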
Sufficient Conditions

If the Hessian matrix is negative definite then $f$ has a local maximum at $x^0$.

Strictly quasi-concave functions

If the FOC and SOC both hold at $x^0$, does this imply that the function takes on its maximum at $x^0$? Consider $f(x) = 4x^3 - x^4$.
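Taking the example to be $f(x) = 4x^3 - x^4$, the following added sketch checks numerically that the FOC and the (weak) SOC hold at $x = 0$, even though the maximum is attained at $x = 3$:

```python
# Numerical check for f(x) = 4*x**3 - x**4: the FOC and weak SOC hold at
# x = 0, yet the maximum is at x = 3 (f(3) = 27 > f(0) = 0).

def f(x):
    return 4 * x**3 - x**4

h = 1e-4
fprime0 = (f(h) - f(-h)) / (2 * h)              # ~ f'(0)
fsecond0 = (f(h) - 2 * f(0) + f(-h)) / h**2     # ~ f''(0)

assert abs(fprime0) < 1e-6    # FOC holds at 0
assert fsecond0 <= 1e-3       # SOC (weak inequality) holds at 0

# But x = 0 is not a maximum:
assert f(3) > f(0)
print("FOC and SOC hold at 0, yet f(3) =", f(3), "> f(0) =", f(0))
```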
Concave functions

Proposition: If $f: X \subset \mathbb{R}^n \to \mathbb{R}$ is concave and differentiable and the FOC holds at $x^0$, then $f$ takes on its maximum at $x^0$.
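A minimal numerical sketch of the proposition, using the assumed concave example $f(x) = -(x-2)^2$: the FOC holds at $x^0 = 2$, and $f$ attains its global maximum there:

```python
# Assumed example: f(x) = -(x - 2)**2 is concave, the FOC holds at x0 = 2,
# and f attains its global maximum there (concavity makes the FOC sufficient).

def f(x):
    return -(x - 2) ** 2

x0 = 2.0
h = 1e-5
assert abs((f(x0 + h) - f(x0 - h)) / (2 * h)) < 1e-8   # FOC at x0

# f(x0) dominates every sampled point over a wide range (global max).
grid = [x / 10 for x in range(-100, 101)]
assert all(f(x) <= f(x0) for x in grid)
print("concave + FOC => global maximum at x0 = 2")
```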
Puzzle: If a function is everywhere differentiable and has a maximum at $x^0$, it is tempting to think that the slope of the function must be decreasing at $x^0$. This intuition comes from the thought that a differentiable function must have a slope that varies continuously. However, as the following example illustrates, this is not the case.

$$f(x) = \begin{cases} 0, & x = 0 \\ x^2 (2 + \sin(1/x)), & x \neq 0 \end{cases}$$

Differentiating, for $x \neq 0$,

$$\frac{df}{dx}(x) = 2x(2 + \sin(1/x)) - \cos(1/x) .$$

The cosine varies from $-1$ to $+1$ with every $2\pi$ radians. Thus as $x$ approaches zero, $1/x$ changes ever more rapidly, so the second term fluctuates between $-1$ and $+1$ ever more rapidly, while the first term approaches zero. Thus the derivative does not converge as $x \to 0$.
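This non-convergence can be seen numerically (an added sketch, not from the notes). Along the sequence $x_n = 1/(n\pi)$ we have $\cos(1/x_n) = (-1)^n$, so the derivative alternates between values near $+1$ and $-1$ as $x_n \to 0$:

```python
import math

# Evaluate f'(x) = 2*x*(2 + sin(1/x)) - cos(1/x) along x_n = 1/(n*pi).
# There cos(1/x_n) = (-1)**n, so f'(x_n) alternates near -1 and +1 as
# x_n -> 0, showing the derivative has no limit at 0.

def fprime(x):
    return 2 * x * (2 + math.sin(1 / x)) - math.cos(1 / x)

vals = [fprime(1 / (n * math.pi)) for n in range(100, 106)]

# Alternating signs, magnitudes near 1:
assert all(vals[i] * vals[i + 1] < 0 for i in range(len(vals) - 1))
assert all(0.9 < abs(v) < 1.1 for v in vals)
print([round(v, 3) for v in vals])
```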
Note next that $x^2 \le f(x) \le 3x^2$, since $1 \le 2 + \sin(1/x) \le 3$. Thus $f(x)$ has its minimum at $x = 0$ (equivalently, $-f$ has its maximum there). Moreover $x^2 \le f(x) - f(0) \le 3x^2$. Therefore, for $x > 0$,

$$\frac{x^2}{x} \le \frac{f(x) - f(0)}{x} \le \frac{3x^2}{x} .$$

Since the upper and lower bounds approach zero, it follows that the ratio approaches zero in the limit; the same holds for $x < 0$, where the inequalities are reversed. Therefore $\frac{df}{dx}(0) = 0$.
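The squeeze argument can be checked numerically (an added sketch): the difference quotient $(f(x) - f(0))/x$ stays between $x$ and $3x$ for $x > 0$, and so tends to $0$:

```python
import math

# The difference quotient (f(x) - f(0))/x for f(x) = x**2 * (2 + sin(1/x))
# is squeezed between x and 3x (for x > 0), so it tends to 0 and f'(0) = 0
# even though f' oscillates without a limit near 0.

def f(x):
    return 0.0 if x == 0 else x * x * (2 + math.sin(1 / x))

for x in (1e-2, 1e-4, 1e-6):
    q = (f(x) - f(0)) / x
    assert x <= q <= 3 * x          # the squeeze bounds

# The quotient is already tiny at x = 1e-8:
assert abs((f(1e-8) - f(0)) / 1e-8) < 1e-7
print("difference quotient -> 0, so f'(0) = 0")
```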
The graph of the function is as depicted. Also shown are the upper and lower bounds $3x^2$ and $x^2$. While the slope of the function oscillates ever more rapidly between $-1$ and $+1$ as $x \to 0$, the derivative is well defined at zero.
Exercise (for those who like mathematical technicalities):

Consider the function $f(x) = x(2 + \sin(1/x))$ for $x \neq 0$, with $f(0) = 0$.

(a) Show that the function is continuous.

(b) Show that the function is strictly increasing at $x = 0$.

(c) Show that the slope is unbounded from below and above in any $\delta$-neighborhood of $x = 0$.