
Nonlinear Programming

Contents

Terminologies
Types of functions
  Concave
  Convex
Types of nonlinear programming problems
  Unconstrained
  Constrained
Quadratic programming with WinQSB
Nonlinear programming with WinQSB

Terminologies

Feasible region
Maxima
  Local
  Global
Minima
  Local
  Global
Critical Point

Feasible Region
Set of all solutions that satisfy
all of the constraints.

Maxima
A maximum is a point at which the value of the function is greater than the function value at all other points:
f(d) > f(x) for all other x
d is the maximum point

Minima
A minimum is a point at which the value of the function is less than the function value at all other points:
f(c) < f(x) for all other x
c is the minimum point

Critical Point
Those points at which maxima or
minima can occur.
[Figure: a curve f(x) with its global maximum, local maximum, local minimum, and global minimum labelled.]

Optimization

n decision variables

feasible region

minimize or maximize
objective function

Definition
A general optimization problem is to
select n decision variables x1,
x2, . . . , xn from a given feasible
region in such a way as to optimize
(minimize or maximize) a given
objective function f (x1, x2, . . . , xn)
of the decision variables.

NonLinear Programming

Objective Function
(Always Nonlinear)
Constraints
(May Be Nonlinear / Linear)

Example
Max./Min. 2x1² + 3x2²
S.T.
x1 + x2 ≤ 3
2x1² + x2² ≥ 5

NLP Problem
The problem is called a nonlinear programming problem (NLP) if the objective function is nonlinear and/or the feasible region is determined by nonlinear constraints.

Mathematically
Maximize f(x1, x2, . . . , xn)   (nonlinear)
subject to:
g1(x1, x2, . . . , xn) ≤ b1   (linear or nonlinear)
. . .
gm(x1, x2, . . . , xn) ≤ bm

Applications
Routing in data networks
Production planning
Resource allocation
Computer-aided design
Solution of equilibrium models
Data analysis and least squares formulations
Modeling human or organizational behavior

Functions

Functions
Types related to nonlinear programming:

Convex

Concave

Convex Function

Straight Line

Above or on the function

Definition
When a straight line
is drawn between
any two points on
the graph of the
function, the line lies
on or above the
function.

Graphically

Concave Function

Straight Line

Below or on the function

Definition
When a straight line
is drawn between
any two points on
the graph of the
function, the line lies
below or on the
function.

Graphically

Convex Set

All the points are in the set

A set S is convex if any point on the line segment connecting any two points in the set is also in S.

Graphically
The figure shows
examples of convex
sets in two
dimensions.

Non-Convex Set

If a set does not satisfy the requirements of a convex set, it is a nonconvex set.

Graphically
The figure shows
examples of
nonconvex sets in
two dimensions.

Note

An important issue in nonlinear programming is whether the feasible region is convex or not.

Feasible Region Issues


When all the constraints of a
problem are linear or convex, the
feasible region is a convex set.
If the objective function is convex
and the feasible region defines a
convex set, every local minimum is a
global minimum.

Feasible Region Issues

If the feasible region of a problem is a nonconvex set, a local minimum may or may not be a global minimum.

Test For Concavity

Suppose that f is twice differentiable on the open interval I.
If f″(x) > 0 for all x in I, then f is concave upward (convex) at each point of I.
If f″(x) < 0 for all x in I, then f is concave downward (concave) at each point of I.
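
To illustrate the test (this example is not from the original slides), the sketch below uses SymPy to compute the second derivative of a hypothetical function f(x) = x³ − 3x and check its sign at sample points; SymPy is assumed to be available.

```python
# A minimal sketch, assuming SymPy is installed; f(x) = x**3 - 3*x is a
# hypothetical example, not one taken from the slides.
import sympy as sp

x = sp.symbols('x', real=True)
f = x**3 - 3*x
f2 = sp.diff(f, x, 2)        # second derivative: 6*x

print(f2)                    # 6*x
print(f2.subs(x, 2))         # 12  (> 0 -> concave upward at x = 2)
print(f2.subs(x, -2))        # -12 (< 0 -> concave downward at x = -2)
```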

Point of Inflection
A point at which f″(x) = 0 may or may not be a point where the function changes from concave upward on one side to concave downward on the other side. If the concavity does change in this manner, such a point is called a point of inflection.

Properties of Functions
f(x) is convex if f″(x) ≥ 0.
f(x) is concave if f″(x) ≤ 0.
For several variables, f(x) is convex if all principal minors of its Hessian are non-negative.
f(x) is concave if the k-th leading principal minor of its Hessian has the sign of (−1)^k, k = 1, 2, . . .
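
For the several-variable criterion, a quick check of the leading principal minors can be scripted as below. This is a sketch with a hypothetical Hessian, assuming NumPy is available; the alternating-sign test on leading minors establishes negative definiteness (and hence strict concavity).

```python
# A minimal sketch, assuming NumPy; the Hessian H below is a hypothetical example.
import numpy as np

def leading_principal_minors(H):
    """Determinants of the top-left 1x1, 2x2, ..., nxn submatrices of H."""
    n = H.shape[0]
    return [np.linalg.det(H[:k, :k]) for k in range(1, n + 1)]

H = np.array([[-2.0, 1.0],
              [ 1.0, -4.0]])
print(leading_principal_minors(H))   # ~[-2.0, 7.0]
# Signs alternate as (-1)^k, so H is negative definite and f is (strictly) concave.
```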

Properties of Functions
For a maximization problem, a local maximum is a global maximum when the feasible region is convex and the objective function is concave.
For a minimization problem, a local minimum is a global minimum when the feasible region is convex and the objective function is convex.

Unconstrained Problems

Types of NLP Problems

Unconstrained problems
Constrained problems

Unconstrained Problems

Unconstrained extreme points
Newton-Raphson method
Gradient method
  Steepest ascent
  Steepest descent

GENERIC UNCONSTRAINED PROBLEM

Minimize/Maximize f(x);

there are no constraints on the decision variables x.
The decision variables x are continuous.
Non-negativity restrictions do not count as constraints.

Example
A retailer is planning her yearly inventory strategy for a commodity which sells at a steady rate throughout the year and for which she estimates a yearly demand of D units. The storage cost is $S per unit and the cost of ordering is $C per order. Assuming that there is no lead time, how many orders, and of what size, must be placed to minimize the total inventory cost?

Model
Suppose the retailer places orders of size x. Then she must place D/x orders per year, at an ordering cost of C*D/x.
The average inventory is x/2, so the holding cost is S/2 * x.
Optimization problem:
Minimize S/2 * x + C*D/x
subject to the non-negativity restriction x ≥ 0.
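
Setting the derivative of the total cost to zero, S/2 − C·D/x² = 0, gives the classical economic order quantity x* = √(2CD/S). The sketch below (not part of the original slides) checks this numerically with made-up values of D, S and C; SciPy is assumed to be available.

```python
# A minimal sketch, assuming SciPy; D, S and C are hypothetical numbers.
import math
from scipy.optimize import minimize_scalar

D, S, C = 1200.0, 2.0, 75.0                 # yearly demand, storage cost/unit, cost/order
total_cost = lambda x: S / 2.0 * x + C * D / x

x_star = math.sqrt(2.0 * C * D / S)         # closed form from S/2 - C*D/x**2 = 0
print(x_star)                               # 300.0

res = minimize_scalar(total_cost, bounds=(1.0, D), method='bounded')
print(res.x, total_cost(res.x))             # order size ~300, yearly inventory cost ~600
```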

Unconstrained Extreme Points

Necessary condition: if X0 is an extreme point, then ∇f(X0) = 0.
Unconstrained Extreme Points

Sufficient condition: a stationary point X0 is an extreme point if the Hessian matrix at X0 is
  positive definite, in which case X0 is a minimum point, or
  negative definite, in which case X0 is a maximum point.

Unconstrained Extreme Points

The necessary condition for X0 to be an extreme point of f(X) is that ∇f(X0) = 0.
The sufficient condition for a stationary point X0 to be an extreme point is that the Hessian matrix evaluated at X0 is
(a) positive definite when X0 is a minimum point,
(b) negative definite when X0 is a maximum point.

Example
Consider the function
f(x1, x2, x3) = x1 + x1x2 + 2x2 + 3x3 − x1² − 2x2² − x3²
The necessary condition ∇f(X0) = 0 gives
∂f/∂x1 = 1 + x2 − 2x1 = 0
∂f/∂x2 = x1 + 2 − 4x2 = 0
∂f/∂x3 = 3 − 2x3 = 0
The solution is X0 = (6/7, 5/7, 3/2).

...Contd.
For sufficiency, we evaluate the Hessian

H = | ∂²f/∂x1²     ∂²f/∂x1∂x2   ∂²f/∂x1∂x3 |     | −2   1   0 |
    | ∂²f/∂x2∂x1   ∂²f/∂x2²     ∂²f/∂x2∂x3 |  =  |  1  −4   0 |
    | ∂²f/∂x3∂x1   ∂²f/∂x3∂x2   ∂²f/∂x3²   |     |  0   0  −2 |

The leading principal minor determinants of H are −2, 7 and −14. Their signs alternate as (−1)^k, so H is negative definite and X0 = (6/7, 5/7, 3/2) is a maximum point.
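
As a cross-check (not part of the original slides), the sketch below verifies numerically that the gradient vanishes at (6/7, 5/7, 3/2) and that the Hessian eigenvalues are all negative, confirming a maximum; NumPy is assumed to be available.

```python
# A minimal sketch, assuming NumPy, that re-checks the example above.
import numpy as np

def grad(v):
    x1, x2, x3 = v
    return np.array([1 + x2 - 2*x1, x1 + 2 - 4*x2, 3 - 2*x3])

H = np.array([[-2.0,  1.0,  0.0],
              [ 1.0, -4.0,  0.0],
              [ 0.0,  0.0, -2.0]])

x0 = np.array([6/7, 5/7, 3/2])
print(grad(x0))                  # ~[0, 0, 0]: X0 is a stationary point
print(np.linalg.eigvalsh(H))     # all negative -> negative definite -> maximum
```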

Newton-Raphson Method
∇f(x) = 0 may be difficult to solve
Iterative procedure:

Initial point X0
Next point Xk+1
Xk+1 = Xk − f(Xk) / f′(Xk)

Newton-Raphson Method
The necessary condition, ∇f(x) = 0, may be difficult to solve numerically. The Newton-Raphson method is an iterative procedure for solving simultaneous nonlinear equations.
The idea of the method is to start from an initial point X0. Using the relationship below, a new point Xk+1 is determined from Xk. The procedure ends with Xm as the solution when Xm ≈ Xm+1.
The relationship between Xk and Xk+1 is
Xk+1 = Xk − f(Xk) / f′(Xk)
where k = 0, 1, 2, . . .
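
A minimal one-variable sketch of the iteration (not from the slides): Newton-Raphson is applied to g(x) = f′(x) for a hypothetical f(x) = x³ − 6x² + 4, so the limit of the iterates is a stationary point of f.

```python
# Newton-Raphson iteration x_{k+1} = x_k - g(x_k) / g'(x_k), applied to g = f'.
# f(x) = x**3 - 6*x**2 + 4 is a hypothetical example.
def g(x):            # f'(x)  = 3x^2 - 12x
    return 3*x**2 - 12*x

def g_prime(x):      # f''(x) = 6x - 12
    return 6*x - 12

x = 5.0              # initial point X0
for k in range(100):
    x_next = x - g(x) / g_prime(x)
    if abs(x_next - x) < 1e-10:      # stop when X_{m+1} ~= X_m
        break
    x = x_next

print(x_next)        # ~4.0: f'(4) = 0 and f''(4) > 0, so x = 4 is a local minimum of f
```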

Steepest (Ascent/Descent)
The basic paradigm of the steepest ascent/descent method is as follows:

1. Start with an initial point x0.
2. Choose a direction d0 = ∇f(x0) (ascent) or d0 = −∇f(x0) (descent).
3. Choose a step size α0.
4. Update the solution: x1 = x0 + α0 d0.
5. If the stopping criterion is met, stop; else repeat steps 2-4 with the new point x1.
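
The paradigm above translates directly into code. The sketch below (not from the slides) runs steepest descent with a fixed step size on the hypothetical function f(x1, x2) = (x1 − 3)² + 2(x2 + 1)²; NumPy is assumed to be available.

```python
# A minimal steepest-descent sketch with a constant step size alpha.
import numpy as np

def grad_f(x):
    # gradient of f(x1, x2) = (x1 - 3)**2 + 2*(x2 + 1)**2
    return np.array([2*(x[0] - 3), 4*(x[1] + 1)])

x = np.array([0.0, 0.0])            # step 1: initial point x0
alpha = 0.1                         # step 3: step size (held fixed for simplicity)

for k in range(1000):
    d = -grad_f(x)                  # step 2: descent direction d_k = -grad f(x_k)
    x_new = x + alpha * d           # step 4: update x_{k+1} = x_k + alpha * d_k
    if np.linalg.norm(x_new - x) < 1e-8:   # stopping criterion
        break
    x = x_new

print(x_new)                        # ~[3, -1], the unconstrained minimizer
```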

Constrained Problems

Constrained Problems
What are Constrained Problems
KKT Conditions
Separable programming
Quadratic programming

Constrained Problems
Constraints on the decision variables.
The constraints gi(x) can be linear or nonlinear.

Mathematically
Minimize/Maximize f(x);
Subject to
gi(x) = bi, gi(x) ≤ bi, or gi(x) ≥ bi
where i = 1, . . . , m

Real World Problems

Water Resource Planning


In regional water planning, sources emitting pollutants might be required to remove waste from the water system. Let xj be the pounds of Biological Oxygen Demand (an often-used measure of pollution) to be removed at source j.
One model might be to minimize the total cost to the region of meeting specified pollution standards:

Mathematical Form
Minimize f1(x1) + f2(x2) + . . . + fn(xn)

Subject to:
ai1 x1 + ai2 x2 + . . . + ain xn ≥ bi   (i = 1, 2, . . . , m)
0 ≤ xj ≤ uj   (j = 1, 2, . . . , n)

. . . where
fj(xj) = cost of removing xj pounds of Biological Oxygen Demand at source j,
bi = minimum desired improvement in water quality at point i in the system,
aij = quality response at point i in the water system caused by removing one pound of Biological Oxygen Demand at source j,
uj = maximum pounds of Biological Oxygen Demand that can be removed at source j.
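
To make the model concrete (not part of the original slides), the sketch below solves a tiny instance with made-up data: two sources, one quality point, and an assumed quadratic removal cost fj(xj) = cj·xj², using SciPy's general-purpose NLP solver.

```python
# A minimal sketch, assuming SciPy; all data below are hypothetical.
import numpy as np
from scipy.optimize import minimize

c = np.array([2.0, 3.0])            # assumed cost coefficients: f_j(x_j) = c_j * x_j**2
A = np.array([[1.0, 1.5]])          # a_ij: quality response per pound removed
b = np.array([10.0])                # b_i: minimum desired improvement
u = np.array([8.0, 8.0])            # u_j: maximum pounds removable at source j

cost = lambda x: float(np.sum(c * np.asarray(x)**2))
cons = {'type': 'ineq', 'fun': lambda x: A @ x - b}     # A x >= b

res = minimize(cost, x0=np.array([1.0, 1.0]),
               bounds=[(0.0, u[0]), (0.0, u[1])],
               constraints=[cons])
print(res.x, res.fun)               # ~[4, 4] with cost ~80 for this made-up data
```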

Inequality Constrained Problems

The necessary conditions for the inequality constrained problem were first published in William Karush's master's thesis, although they became renowned after a seminal conference paper by Harold W. Kuhn and Albert W. Tucker.

KARUSH-KUHN-TUCKER Conditions

Known as the KKT conditions

A set of equations/inequalities that the solution must satisfy

KARUSH-KUHN-TUCKER Conditions
A system of equations and inequalities which the solution of an NLP problem must satisfy when the objective function and the constraint functions are differentiable.
It is a generalization of the method of Lagrange multipliers.

Example
Minimize f(x)
subject to gi(x) ≤ 0 (i = 1, . . . , m) and hj(x) = 0 (j = 1, . . . , l),
where f(x) is the function to be minimized, gi(x) ≤ 0 are the inequality constraints, hj(x) = 0 are the equality constraints, and m and l are the number of inequality and equality constraints, respectively.

KKT Conditions
For the problem Maximize f(X) subject to g(X) ≤ 0:
λ ≥ 0
∇f(X) − λ∇g(X) = 0
λi gi(X) = 0, i = 1, 2, . . . , m
g(X) ≤ 0
(The same conditions apply to minimization as well, with the additional restriction that λ must be non-positive.)
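
As a small numerical illustration (not from the slides), the check below verifies the KKT conditions at the candidate x = 2 for the hypothetical problem: maximize f(x) = −(x − 3)² subject to g(x) = x − 2 ≤ 0.

```python
# A minimal KKT check for a one-variable, one-constraint example (maximization).
x = 2.0
lam = 2.0           # from stationarity  f'(x) - lam * g'(x) = 0  ->  -2*(x - 3) = lam

stationarity    = abs(-2*(x - 3) - lam * 1.0) < 1e-12   # grad f - lam * grad g = 0
complementarity = abs(lam * (x - 2.0)) < 1e-12          # lam * g(x) = 0
primal_feasible = (x - 2.0) <= 0.0                      # g(x) <= 0
dual_feasible   = lam >= 0.0                            # lam >= 0 for maximization

print(stationarity, complementarity, primal_feasible, dual_feasible)   # all True
```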

Separable Function
A function f(x) is separable if it can be expressed as the sum
f(x) = f1(x1) + f2(x2) + . . . + fn(xn)
of single-variable functions fi(xi).

Mathematically
A constrained nonlinear problem of the form
Min/Max f(x)
Subject to
gi(x) = bi, gi(x) ≤ bi, or gi(x) ≥ bi, where i = 1, . . . , m
is a separable problem if f and every gi are separable.

Example
Max f(x) = 20x1 + 16x2 − 2x1² − x2² − (x1 + x2)²
subject to:
x1 + x2 ≤ 5,
x1 ≥ 0, x2 ≥ 0.
As stated, the problem is not separable, because of the term (x1 + x2)² in the objective function. Letting x3 = x1 + x2, though, we can re-express it in separable form as
Contd...

Max f(x) = 20x1 + 16x2 − 2x1² − x2² − x3²

The objective function is now written as
f(x) = f1(x1) + f2(x2) + f3(x3),
subject to
x1 + x2 ≤ 5,
x1 + x2 − x3 = 0, x1 ≥ 0, x2 ≥ 0, x3 ≥ 0,
where
f1(x1) = 20x1 − 2x1²
f2(x2) = 16x2 − x2²
and
f3(x3) = −x3²
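
A quick sanity check (not part of the original slides): with x3 = x1 + x2, the separable form reproduces the original objective at arbitrary points.

```python
# A minimal sketch: the separable re-expression agrees with the original objective.
import random

f_orig = lambda x1, x2: 20*x1 + 16*x2 - 2*x1**2 - x2**2 - (x1 + x2)**2
f1 = lambda x1: 20*x1 - 2*x1**2
f2 = lambda x2: 16*x2 - x2**2
f3 = lambda x3: -x3**2

for _ in range(5):
    x1, x2 = random.uniform(0, 5), random.uniform(0, 5)
    x3 = x1 + x2                                   # the substitution x3 = x1 + x2
    assert abs(f_orig(x1, x2) - (f1(x1) + f2(x2) + f3(x3))) < 1e-9
print("separable form matches the original objective")
```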

Quadratic Programming

Objective Function Maximization

Constraints - linear

Quadratic Programming
Quadratic programming concerns the maximization of a quadratic objective function subject to linear constraints.

Example
Max f(x) = 20x1 + 16x2 − 2x1² − x2² − (x1 + x2)²
subject to:
x1 + x2 ≤ 5
x1 ≥ 0, x2 ≥ 0
Expanding (x1 + x2)² as x1² + 2x1x2 + x2² and incorporating the factor 1/2, we rewrite the objective function as:
Contd...

Max f(x) = 20x1 + 16x2 + 1/2(−6x1x1 − 4x2x2 − 2x1x2 − 2x2x1)

so that
q11 = 6
q12 = 2
q21 = 2
q22 = 4
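
WinQSB solves this model through its Quadratic Programming module (next slides). As an alternative check (not from the slides), the sketch below solves the same problem with SciPy's general NLP solver, using the Q and c identified above.

```python
# A minimal sketch, assuming SciPy: maximize c'x - (1/2) x'Qx  s.t.  x1 + x2 <= 5, x >= 0.
import numpy as np
from scipy.optimize import minimize

Q = np.array([[6.0, 2.0],
              [2.0, 4.0]])
c = np.array([20.0, 16.0])

obj = lambda x: -(c @ x - 0.5 * x @ Q @ x)                      # minimize the negative
cons = {'type': 'ineq', 'fun': lambda x: 5.0 - (x[0] + x[1])}   # x1 + x2 <= 5

res = minimize(obj, x0=np.zeros(2),
               bounds=[(0.0, None), (0.0, None)],
               constraints=[cons])
print(res.x, -res.fun)      # ~[2.33, 2.67] with objective value ~46.3
```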

Quadratic Programming
using WinQSB

[Steps 1-20: WinQSB screenshots - select the Quadratic Programming module, enter the problem specifications, run the solver (RUN), and view the solution and graph.]

Non-Linear Programming
Using WinQSB

[Screenshots: create a new problem, enter the model, choose minimize, and solve the problem.]

Thank You!
