
Exam Optimisation

April 2018

Duration: 3h - Course documents are allowed - Carefully verify the hypotheses each time a result from the course is used.

Exercise 1.
We are dealing with the following problem on R^3:

(P)   min_{x+y+z ≤ 3}  x^2 + 2y^2 + 3z^2 − 4x − 4y.
1. Is the set K = {(x, y, z) / x + y + z ≤ 3} closed in R^3? Open? Convex? (explain in
a few words for each question). What would you call this set?
2. Is the function f(x, y, z) = x^2 + 2y^2 + 3z^2 − 4x − 4y convex? Strictly convex?
α-convex? (if so, give the value of α).
3. Justify the existence and uniqueness of the solution of (P).
4. Apply the Kuhn-Tucker theorem: are the hypotheses satisfied? Write the Kuhn-Tucker relations.
5. Solve (P) and give the point where the minimum is reached as well as the minimal
value of the function f on K.
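As a numerical sanity check (not part of the exam), the solution of (P) can be approximated by projected gradient descent: since K is a half-space, the Euclidean projection onto K has a simple closed form. The step size and iteration count below are arbitrary illustrative choices.

```python
# Sanity check for (P) by projected gradient descent.
# K = {x + y + z <= 3} is a half-space, so projection onto K is
# P_K(v) = v - max(0, (a.v - b)/||a||^2) * a  with a = (1,1,1), b = 3.

def f(p):
    x, y, z = p
    return x*x + 2*y*y + 3*z*z - 4*x - 4*y

def grad_f(p):
    x, y, z = p
    return (2*x - 4, 4*y - 4, 6*z)

def project(p):
    excess = sum(p) - 3.0          # a.v - b with a = (1,1,1), b = 3
    if excess <= 0:
        return p                   # already feasible
    return tuple(c - excess/3.0 for c in p)   # ||a||^2 = 3

def projected_gradient(p, rho=0.1, iters=2000):
    for _ in range(iters):
        step = tuple(c - rho*gi for c, gi in zip(p, grad_f(p)))
        p = project(step)
    return p

p_star = projected_gradient((0.0, 0.0, 0.0))
print(p_star, f(p_star))   # converges to the minimiser of (P)
```

The fixed step ρ = 0.1 satisfies the usual convergence condition ρ < 2/L for this function (the Hessian diag(2, 4, 6) gives L = 6), so the iterates converge geometrically.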

Exercise 2.
One considers on R^2 the following minimisation problem: min f(x, y) under the constraint
g(x, y) = 0, with f(x, y) = x^3 + y^2 and g(x, y) = x^2 + y^2 − 9.
1. Study the regularity of the constraint.
2. Solve the problem with the help of a Lagrange multiplier: verify the hypotheses
of the theorem, and write the equations satisfied by the optimum point and the
Lagrange multiplier.
3. Then determine the candidate points for the minimum and deduce the solution
of the problem (giving the point where the minimum is reached and the minimal
value of the cost function).
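As a check on the multiplier computation (not part of the exam), the constraint set is the circle of radius 3, so the problem can be verified by sampling the parametrisation (x, y) = (3 cos t, 3 sin t) on a fine grid; the grid size below is an arbitrary choice.

```python
import math

# Brute-force check of min x^3 + y^2 on the circle x^2 + y^2 = 9,
# using the parametrisation (x, y) = (3 cos t, 3 sin t).
def f(x, y):
    return x**3 + y**2

n = 100000
best = min(range(n), key=lambda k: f(3*math.cos(2*math.pi*k/n),
                                     3*math.sin(2*math.pi*k/n)))
t = 2*math.pi*best/n
x, y = 3*math.cos(t), 3*math.sin(t)
fv = f(x, y)
print(x, y, fv)   # the sampled minimum of the cost on the circle
```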

Exercise 3.
One considers the minimisation on R^2 (without constraints) of the function:

g(x, y) = x^2 + 2x + 2y^2 − 4y + 3.

1. Prove that the problem has a unique solution (existence and uniqueness).
2. Find the minimum X* = (x*, y*) of g (justify the characterization of the
minimum with the help of propositions from the course), and give the minimal value of
g.
3. Apply the fixed step gradient algorithm: one chooses the step ρ = 1/2 and the
first guess X0 = (0, 0). Calculate the iterates X1, X2 and X3. Are the hypotheses
of the convergence theorem for the fixed step algorithm satisfied? Conclude.
4. Apply the optimal step algorithm: one chooses the initial guess X0 = (0, 0). Give
in detail the first iteration of this algorithm: which optimisation problem must be
solved to find the optimal step? Determine this first optimal step and then the
point X1.
5. Let us now apply the conjugate gradient algorithm, with initial guess X0 =
(0, 0). What is X2?
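The iterations asked for in questions 3 and 5 can be checked numerically (not part of the exam). Writing g as a quadratic with ∇g(X) = AX − b, where A = diag(2, 4) and b = (−2, 4), the sketch below runs three fixed-step iterations with ρ = 1/2 and two conjugate gradient steps from X0 = (0, 0).

```python
# Fixed-step gradient and conjugate gradient on the quadratic
# g(x, y) = x^2 + 2x + 2y^2 - 4y + 3, with gradient A X - b,
# A = diag(2, 4), b = (-2, 4).

A = (2.0, 4.0)          # diagonal of the Hessian
b = (-2.0, 4.0)

def grad(p):
    return tuple(a*c - bi for a, c, bi in zip(A, p, b))

def dot(u, v):
    return sum(ui*vi for ui, vi in zip(u, v))

# Fixed step rho = 1/2 from X0 = (0, 0): the y-component has
# contraction factor |1 - rho*4| = 1, so it oscillates.
X = (0.0, 0.0)
iters = [X]
for _ in range(3):
    X = tuple(c - 0.5*gi for c, gi in zip(X, grad(X)))
    iters.append(X)
print(iters[1], iters[2], iters[3])   # X1, X2, X3

# Conjugate gradient: on a quadratic in R^2 it reaches the exact
# minimiser in at most 2 iterations.
X = (0.0, 0.0)
r = tuple(-gi for gi in grad(X))      # residual = -gradient
d = r
for _ in range(2):
    Ad = tuple(a*di for a, di in zip(A, d))
    alpha = dot(r, r) / dot(d, Ad)    # exact line search along d
    X = tuple(c + alpha*di for c, di in zip(X, d))
    r_new = tuple(ri - alpha*adi for ri, adi in zip(r, Ad))
    beta = dot(r_new, r_new) / dot(r, r)
    d = tuple(rn + beta*di for rn, di in zip(r_new, d))
    r = r_new
print(X)    # X2 from the conjugate gradient algorithm
```

The first print shows the fixed-step iterates cycling between (−1, 2) and (−1, 0), illustrating why the convergence hypotheses fail for ρ = 1/2, while the conjugate gradient iterate X2 lands on the exact minimiser.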
