Overview
• Last time:
– Further details on min/max decomposition
– Benders decomposition
• Today:
– Intro to unconstrained optimization
• Reading: Bertsekas pp. 2-15 (minus
“Sensitivity,” pp. 12-13).
• Upcoming:
– Quiz on Practice Problem Sets 1 & 2 tomorrow.
– Lab tomorrow (2nd part of class).
1/22/2013
Unconstrained Optimization
• Necessary conditions:
If A is a necessary condition for B, then:
– If B is true, A is definitely true.
– If A is true, B may or may not be true.
• Sufficient conditions:
If A is a sufficient condition for B, then:
– If A is true, B is definitely true.
– If A is not true, B may or may not be true.
Optimality Conditions
• We will learn various necessary and sufficient
conditions for optimality of solutions.
• Ex: consider minimizing f(x) = x^2:
Optimality Conditions
• Suppose x* is a local minimum of f(x), i.e.,
f(x) ≥ f(x*) for all x “close” to x*:
Optimality Condition
• We have derived a condition: If x* is an
unconstrained local minimum of f(x), then ∇f(x*) = 0.
• Does this give us a necessary condition for local
optimality? A sufficient condition?
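As a quick numeric illustration (a sketch only; the central-difference helper `deriv` and the step size `h` are illustrative choices, not from the reading): f'(0) = 0 holds for f(x) = x^2, where 0 is a minimum, but it also holds for f(x) = x^3, where 0 is not a minimum — so the condition is necessary but not sufficient.

```python
# Central-difference derivative; the helper name and step size
# are illustrative choices, not from the lecture.
def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

f_sq = lambda x: x**2    # x* = 0 is a minimum
f_cube = lambda x: x**3  # x* = 0 is NOT a minimum

# Both derivatives vanish at 0, so f'(x*) = 0 alone cannot
# certify a minimum: necessary, not sufficient.
print(abs(deriv(f_sq, 0.0)))
print(abs(deriv(f_cube, 0.0)))
```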
Second-Order Conditions
• Can we use second-order information
(i.e., second derivatives and Hessians) to obtain
another condition?
Example
• Example: consider minimizing f(x) = x^4.
Does x* = 0 satisfy the first- and second-order
necessary conditions?
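One way to answer numerically (a sketch; the finite-difference helpers `d1`, `d2` and step size are assumptions, not from the lecture): at x* = 0 the first derivative of x^4 is 0 and the second derivative is 0 ≥ 0, so both necessary conditions hold, even though f''(0) > 0 fails, so the second-order sufficient condition does not apply.

```python
# Central-difference first and second derivatives of f(x) = x**4 at x* = 0.
# Helper names and the step size h are illustrative choices.
def d1(f, x, h=1e-4):
    return (f(x + h) - f(x - h)) / (2 * h)

def d2(f, x, h=1e-4):
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

f = lambda x: x**4
print(d1(f, 0.0))  # first-order condition f'(0) = 0 holds
print(d2(f, 0.0))  # >= 0, so the second-order necessary condition holds,
                   # but f''(0) > 0 fails: the sufficient test is inconclusive
```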
Example
• Example: consider minimizing f(x) = (x1-1)^2 - (x2-2)^2.
Does x* = [1 2]’ satisfy the first- and second-order
necessary conditions?
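A sketch of the check for this example (the gradient and Hessian are written out by hand; numpy is used only to inspect eigenvalues): the gradient vanishes at x* = [1 2]’, but the Hessian has one negative eigenvalue, so the second-order necessary condition for a minimum fails and x* is a saddle point.

```python
import numpy as np

# Hand-computed gradient and Hessian of f(x) = (x1-1)^2 - (x2-2)^2
# at x* = [1, 2]; numpy only inspects the eigenvalues.
x1, x2 = 1.0, 2.0
grad = np.array([2 * (x1 - 1), -2 * (x2 - 2)])  # zero vector: first-order
                                                # necessary condition holds
hess = np.array([[2.0, 0.0],
                 [0.0, -2.0]])

eigs = np.linalg.eigvalsh(hess)
print(grad, eigs)  # one negative eigenvalue: Hessian is indefinite,
                   # so x* is a saddle point, not a local minimum
```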
Example
• Consider f(x) = x^2. What can we say about x* = 0?
Example
• Consider f(x) = (x1-3)^2 + (x2+1)^2.
What can we say about x* = [3 -1]’?
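For this example the check is clean (derivatives hand-computed; numpy used only for eigenvalues): the gradient vanishes at x* = [3 -1]’ and the Hessian 2I is positive definite, so the second-order sufficient condition certifies a strict local minimum (here also the global one).

```python
import numpy as np

# Hand-computed gradient and Hessian of f(x) = (x1-3)^2 + (x2+1)^2
# at x* = [3, -1]; numpy only inspects the eigenvalues.
x1, x2 = 3.0, -1.0
grad = np.array([2 * (x1 - 3), 2 * (x2 + 1)])  # zero vector
hess = 2 * np.eye(2)                           # Hessian is 2*I everywhere

eigs = np.linalg.eigvalsh(hess)
print(grad, eigs)  # all eigenvalues positive: positive definite Hessian,
                   # so the sufficient condition gives a strict local minimum
```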
Example
• Consider f(x) = x^3. What can we say about x* = 0?
Example
• Consider f(x) = x^4. What can we say about x* = 0?
Convex Functions
• A function f(x) is convex if:
f(αx + (1-α)y) ≤ αf(x) + (1-α)f(y) for all x, y and all α ∈ [0,1].
Concave Functions
• A function f(x) is concave if:
f(αx + (1-α)y) ≥ αf(x) + (1-α)f(y) for all x, y and all α ∈ [0,1].
Convex Sets
• A set S is convex if
αx + (1-α)y ∈ S for all x, y ∈ S and all α ∈ [0,1].
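A randomized spot-check of this definition for the unit ball S = {x : ||x|| ≤ 1} (the sampling scheme and tolerance are illustrative choices; convexity of S follows from the triangle inequality):

```python
import numpy as np

# Spot-check of the convex-set definition on the unit ball:
# every convex combination of two members should stay in the ball.
rng = np.random.default_rng(0)

def in_ball(x):
    return np.linalg.norm(x) <= 1 + 1e-12  # small tolerance for rounding

for _ in range(1000):
    x = rng.uniform(-1, 1, size=2)
    y = rng.uniform(-1, 1, size=2)
    if not (in_ball(x) and in_ball(y)):
        continue  # only test pairs that are actually in S
    a = rng.uniform(0, 1)
    assert in_ball(a * x + (1 - a) * y)
```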
Convexity-Preserving Operations
• Some operations preserve convexity, for example:
• Nonnegative weighted sum
• Pointwise maximum
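A quick randomized check of the pointwise-maximum claim (the particular functions f1, f2, the sampling, and the tolerance are illustrative choices): the maximum of two convex functions satisfies the convexity inequality at every sampled pair of points.

```python
import random

f1 = lambda x: (x - 1) ** 2      # convex
f2 = lambda x: abs(x)            # convex
h = lambda x: max(f1(x), f2(x))  # pointwise maximum of the two

# h should satisfy f(a*x + (1-a)*y) <= a*f(x) + (1-a)*f(y).
random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    a = random.uniform(0, 1)
    assert h(a * x + (1 - a) * y) <= a * h(x) + (1 - a) * h(y) + 1e-9
```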
Gradient Methods
• To solve a minimization problem:
1. Start at some initial solution (a guess).
2. Find a feasible descent direction.
• I.e., a direction in which we can move and still
stay within the feasible region, and in which
our objective value improves.
3. Move some distance in that direction; arrive at
new solution.
4. Evaluate new solution; stop if it is “satisfactory.”
Otherwise, go back to Step 2.
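The four steps above can be sketched as plain gradient descent on a simple quadratic (the fixed step size and stopping tolerance are assumptions of this sketch, not prescribed in the lecture; the problem is unconstrained, so every direction is feasible):

```python
import numpy as np

# Gradient of f(x) = (x1-3)^2 + (x2+1)^2, written out by hand.
def grad(x):
    return np.array([2 * (x[0] - 3), 2 * (x[1] + 1)])

x = np.array([0.0, 0.0])   # Step 1: initial guess
step = 0.1                 # fixed step size (an assumption of this sketch)
for _ in range(200):
    d = -grad(x)           # Step 2: descent direction (negative gradient)
    x = x + step * d       # Step 3: move some distance in that direction
    if np.linalg.norm(grad(x)) < 1e-8:  # Step 4: stop if "satisfactory"
        break

print(x)  # converges toward the minimizer [3, -1]
```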
• If f is convex,
every local minimum is a global minimum.
• If f is convex and x* is a stationary point,
then x* is a global minimum.
• If f is strictly convex,
then there exists at most one global minimum.
(Convexity: f(αx + (1-α)y) ≤ αf(x) + (1-α)f(y) for all x, y and all α ∈ [0,1].)