Viewed as a function of $\theta$, this quantity is called the likelihood of $\theta$. It is often more convenient
to work with the log-likelihood,
\[
\sum_{i=1}^{n} \log f(X_i; \theta).
\]
The maximum likelihood estimator is then any
\[
\hat{\theta} \in \arg\max_{\theta} \sum_{i=1}^{n} \log f(X_i; \theta),
\]
where arg max denotes the set of all values achieving the maximum. If there is a unique
maximizer, it is called the maximum likelihood estimate.
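For concreteness (a restatement, with the product form assumed rather than quoted from the text above): for i.i.d. observations the likelihood is the product of the individual densities or mass functions, so taking logs gives the sum displayed above,
\[
L(\theta) = \prod_{i=1}^{n} f(X_i; \theta), \qquad \log L(\theta) = \sum_{i=1}^{n} \log f(X_i; \theta).
\]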
(a) Let $X_1, \ldots, X_n \stackrel{\text{iid}}{\sim} \mathrm{Poi}(\lambda)$, i.e., Poisson random variables with intensity parameter $\lambda$. Determine the maximum likelihood estimator of $\lambda$. Use the properties from the notes on unconstrained optimization to verify that the estimate you obtain is the MLE.
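The following is a minimal numerical sketch (not part of the assignment) that can be used to sanity-check the closed-form answer to part (a). It assumes numpy and scipy are available; the seed, sample size, and the simulated intensity 3.5 are arbitrary illustrative choices.

    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(0)
    x = rng.poisson(lam=3.5, size=1000)   # simulated data with a known intensity

    def neg_log_likelihood(lam):
        # Negative Poisson log-likelihood: -sum_i log f(x_i; lam), negated so we can minimize
        return -np.sum(stats.poisson.logpmf(x, lam))

    # Numerically minimize the negative log-likelihood over a bounded interval
    res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 50.0), method="bounded")
    print("numerical maximizer:", res.x)
    # Compare res.x with the closed-form estimator derived in part (a)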
3. Unconstrained Optimization (5 points each)
In this problem you will prove some of the properties of unconstrained optimization problems
discussed in class. It will be helpful to understand the proofs presented in the notes.
(a) Show that if f is strictly convex, then f has at most one global minimizer.
For the next two parts, the following fact will be helpful. A twice continuously differentiable
function admits the quadratic expansion
\[
f(x) = f(y) + \langle \nabla f(y),\, x - y \rangle + \frac{1}{2} \left\langle x - y,\, \nabla^2 f\big(y + t(x - y)\big)(x - y) \right\rangle
\]
for some $t \in (0, 1)$.
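For orientation (a restatement, not part of the original problem text): in one dimension ($d = 1$) this is the familiar Taylor expansion with Lagrange-form remainder,
\[
f(x) = f(y) + f'(y)(x - y) + \tfrac{1}{2} f''\big(y + t(x - y)\big)(x - y)^2, \qquad t \in (0, 1).
\]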
(b) Show that if $f$ is twice continuously differentiable and $x^*$ is a local minimizer, then
$\nabla^2 f(x^*) \succeq 0$, i.e., the Hessian of $f$ is positive semi-definite at the local minimizer $x^*$.
(c) Show that if $f$ is twice continuously differentiable, then $f$ is convex if and only if the
Hessian $\nabla^2 f(x)$ is positive semi-definite for all $x \in \mathbb{R}^d$.
(d) Consider the function $f(x) = \frac{1}{2} x^T A x + b^T x + c$, where $A$ is a symmetric $d \times d$ matrix.
Derive the Hessian of $f$. Under what conditions on $A$ is $f$ convex? Strictly convex?
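The following is a small numerical sketch (not part of the assignment, assumes numpy; the dimension, seed, and coefficients are arbitrary illustrative choices) that estimates the Hessian of such a quadratic by finite differences and inspects its eigenvalues, which can be used to check the answer to part (d).

    import numpy as np

    d = 4
    rng = np.random.default_rng(1)
    M = rng.standard_normal((d, d))
    A = M + M.T                        # a symmetric (not necessarily PSD) matrix
    b = rng.standard_normal(d)
    c = 0.7

    def f(x):
        return 0.5 * x @ A @ x + b @ x + c

    def numerical_hessian(func, x, h=1e-4):
        # Central finite differences for the Hessian of func at x
        n = x.size
        H = np.zeros((n, n))
        I = np.eye(n)
        for i in range(n):
            for j in range(n):
                e_i, e_j = I[i], I[j]
                H[i, j] = (func(x + h*e_i + h*e_j) - func(x + h*e_i - h*e_j)
                           - func(x - h*e_i + h*e_j) + func(x - h*e_i - h*e_j)) / (4*h*h)
        return H

    H = numerical_hessian(f, rng.standard_normal(d))
    print("numerical Hessian:\n", np.round(H, 3))        # compare with your answer to part (d)
    print("eigenvalues:", np.round(np.linalg.eigvalsh(H), 3))  # all nonnegative iff PSD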
(e) Optional: In class, we assumed the domain of the objective function was $\mathbb{R}^d$. Suppose
the domain is instead some subset $S \subseteq \mathbb{R}^d$, and we seek to solve
\[
\min_{x \in S} f(x).
\]
Do the various properties still hold, or do they need to be modified in some way? What
conditions on $S$ are needed for the (modified) properties to hold?
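As a small example to have in mind (an illustration, not part of the original problem): on $S = [0, 1]$ the function $f(x) = x$ attains its minimum at the boundary point $x = 0$ even though $f'(0) = 1 \neq 0$, so first-order conditions stated for $\mathbb{R}^d$ do not transfer verbatim to restricted domains.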