
STAT 410
Fall 2016

Maximum Likelihood Estimation, Bias, Mean Squared Error

Definition: Maximum Likelihood Estimator (MLE)

Let X₁, …, X_n be i.i.d. with p.m.f. or p.d.f. f(x; θ), θ ∈ Ω, where Ω is the parameter space.

The likelihood function for a sample of i.i.d. X₁, …, X_n is

   L(θ; x) = L(θ; x₁, …, x_n) = f(x₁, …, x_n; θ) = ∏_{i=1}^n f(x_i; θ),

where x = (x₁, …, x_n)′ is the vector of sample observations.

It is often easier to consider the log-likelihood,

   ℓ(θ; x) = ln[L(θ; x₁, …, x_n)] = Σ_{i=1}^n ln[f(x_i; θ)].

Assumptions (Regularity Conditions):

(R0) The pdfs are distinct; i.e., θ ≠ θ′ implies f(x_i; θ) ≠ f(x_i; θ′).

(R1) The pdfs have common support for all θ.

(R2) The true unknown parameter θ₀ is an interior point of Ω.


Theorem 6.1.1.

Let θ₀ be the true parameter. Under assumptions (R0) and (R1),

   lim_{n→∞} P_{θ₀}[ L(θ₀; X) > L(θ; X) ] = 1  for all θ ≠ θ₀.

Asymptotically the likelihood function is maximized at the true value θ₀.

Let θ̂ be the maximum likelihood estimate (m.l.e.) of θ,

   θ̂ = argmax_θ L(θ; x).

Example 1. Let X₁, …, X_n be a random sample of size n from the distribution with probability density function

   f(x; θ) = (1/θ) x^{1/θ − 1},  0 < x < 1,
           = 0                   otherwise,

where 0 < θ < ∞.

a) Obtain the method of moments estimator of θ, θ̃.

   E(X) = ∫₀¹ x f(x; θ) dx = ∫₀¹ (1/θ) x^{1/θ} dx
        = (1/θ) · x^{1/θ + 1} / (1/θ + 1) |₀¹
        = (1/θ) · 1/(1/θ + 1)
        = 1/(1 + θ).

   Setting X̄ = 1/(1 + θ̃) gives θ̃ = 1/X̄ − 1.

b) Obtain the maximum likelihood estimator of θ, θ̂.

   Likelihood function:

   L(θ; x) = ∏_{i=1}^n f(x_i; θ) = ∏_{i=1}^n (1/θ) x_i^{1/θ − 1},

   ℓ(θ; x) = Σ_{i=1}^n [ ln(1/θ) + (1/θ − 1) ln x_i ] = −n ln θ + (1/θ − 1) Σ_{i=1}^n ln x_i.


   ∂ℓ(θ; x)/∂θ = −n/θ − (1/θ²) Σ_{i=1}^n ln x_i = 0

   ⟹ θ̂ = −(1/n) Σ_{i=1}^n ln x_i.

c) Suppose n = 3, and x₁ = 0.2, x₂ = 0.3, x₃ = 0.5. Compute the values of the method of moments estimate and the maximum likelihood estimate of θ.

   X̄ = (0.2 + 0.3 + 0.5)/3 = 1/3,

   θ̃ = 1/X̄ − 1 = 3 − 1 = 2,

   θ̂ = −(1/3) Σ_{i=1}^3 ln x_i = −(1/3)(ln 0.2 + ln 0.3 + ln 0.5) = −(1/3) ln 0.03 ≈ 1.16885.
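As a quick numerical check, the two estimates can be recomputed directly. A small Python sketch (the variable names are our own) mirroring θ̃ = 1/X̄ − 1 and θ̂ = −(1/n) Σ ln x_i:

```python
import math

# Worked data from part c): n = 3 observations from f(x; θ) = (1/θ) x^(1/θ - 1)
x = [0.2, 0.3, 0.5]
n = len(x)

xbar = sum(x) / n                                  # sample mean = 1/3
theta_mom = 1 / xbar - 1                           # method of moments: 1/x̄ - 1
theta_mle = -sum(math.log(xi) for xi in x) / n     # MLE: -(1/n) Σ ln x_i

print(theta_mom)   # ≈ 2
print(theta_mle)   # ≈ 1.16885
```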

Def: An estimator θ̂ is said to be unbiased for θ if E(θ̂) = θ.

Example 2. Reconsider the pdf of Example 1. Are θ̂ and θ̃ unbiased estimators?

a) Is θ̂ unbiased for θ? That is, does E(θ̂) = θ?

   E[ln X] = ∫₀¹ (ln x)(1/θ) x^{1/θ − 1} dx.

   Integration by parts, ∫ u dv = uv − ∫ v du, with

   u = ln x,  du = (1/x) dx,  dv = (1/θ) x^{1/θ − 1} dx,  v = x^{1/θ}:

   E[ln X] = (ln x) x^{1/θ} |₀¹ − ∫₀¹ x^{1/θ − 1} dx = 0 − θ x^{1/θ} |₀¹ = −θ.

   Therefore,

   E(θ̂) = E[ −(1/n) Σ_{i=1}^n ln X_i ] = −(1/n) Σ_{i=1}^n E[ln X_i] = −(1/n)(n)(−θ) = θ;

   that is, θ̂ is an unbiased estimator of θ.


   OR

   F_X(x) = x^{1/θ}, 0 < x < 1.

   Let Y_i = −ln(X_i), i = 1, …, n. Then, for y > 0,

   F_Y(y) = P(−ln X ≤ y) = P(X ≥ e^{−y}) = 1 − F_X(e^{−y}) = 1 − e^{−y/θ},

   so Y₁, …, Y_n are i.i.d. Exponential with mean θ.

   Then θ̂ = Ȳ, so E(θ̂) = E(Ȳ) = E(Y) = θ; that is, θ̂ is an unbiased estimator of θ.



b) Is θ̃ unbiased for θ? That is, does E(θ̃) = θ?

   Since g(t) = 1/t − 1, 0 < t < 1, is strictly convex, and X̄ is not a constant random variable, by Jensen's Inequality (Theorem 1.10.5),

   E(θ̃) = E[g(X̄)] > g(E[X̄]) = g(1/(1 + θ)) = θ.

   θ̃ is NOT an unbiased estimator of θ.
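Both conclusions are easy to see in a small simulation. This sketch is our own; the choices θ = 0.5, n = 3, and the replication count are arbitrary, and X is drawn as U^θ, which follows from inverting F(x) = x^{1/θ}:

```python
import math
import random

random.seed(0)
theta, n, reps = 0.5, 3, 20000

mle_vals, mom_vals = [], []
for _ in range(reps):
    # Inverse-CDF draw: F(x) = x^(1/θ) on (0,1) gives X = U^θ
    xs = [random.random() ** theta for _ in range(n)]
    mle_vals.append(-sum(math.log(xi) for xi in xs) / n)   # MLE for this sample
    xbar = sum(xs) / n
    mom_vals.append(1 / xbar - 1)                          # method of moments

print(sum(mle_vals) / reps)   # ≈ 0.5: the MLE is unbiased, even for n = 3
print(sum(mom_vals) / reps)   # noticeably above 0.5, as Jensen's inequality predicts
```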


Def: For an estimator θ̂ of θ, define the Mean Squared Error of θ̂ by

   MSE(θ̂) = E[(θ̂ − θ)²] = Var(θ̂) + [E(θ̂) − θ]² = Var(θ̂) + [bias(θ̂)]².

   [Figure: three target diagrams contrasting "unbiased, large variance," "biased, small variance," and "unbiased, small variance."]

Note: Chebyshev's inequality implies

   P(|θ̂ − θ| ≥ ε) ≤ E[(θ̂ − θ)²]/ε² = MSE(θ̂)/ε².


Example 3. What is the Mean Squared Error of θ̂? That is, find MSE(θ̂).

   Let Y_i = −ln X_i, i = 1, …, n. Then E(Y_i) = θ and Var(Y_i) = θ².

   θ̂ = Ȳ, so E(θ̂) = E(Ȳ) = θ and Var(θ̂) = Var(Y)/n = θ²/n.

   MSE(θ̂) = Var(θ̂) + [bias(θ̂)]² = θ²/n + 0 = θ²/n.
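The exact value θ²/n can be confirmed by simulation (our own sketch; θ = 2, n = 5, and the replication count are arbitrary choices):

```python
import math
import random

random.seed(1)
theta, n, reps = 2.0, 5, 40000

sq_err = 0.0
for _ in range(reps):
    xs = [random.random() ** theta for _ in range(n)]     # X = U^θ by inverse CDF
    theta_hat = -sum(math.log(xi) for xi in xs) / n       # MLE
    sq_err += (theta_hat - theta) ** 2

mse_mc = sq_err / reps
print(mse_mc, theta**2 / n)   # both ≈ 0.8
```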

Def: Let θ̂₁ and θ̂₂ be two unbiased estimators of θ. θ̂₁ is said to be more efficient than θ̂₂ if Var(θ̂₁) < Var(θ̂₂).

The relative efficiency of θ̂₁ with respect to θ̂₂ is Var(θ̂₂)/Var(θ̂₁).

Example 4. What is the asymptotic relative efficiency of θ̃ with respect to θ̂?

   Recall, from the convergence (part 2) notes, that the asymptotic variance of θ̃ is

   θ²(1 + θ)² / (n(1 + 2θ)).

   The asymptotic relative efficiency is

   [θ²(1 + θ)² / (n(1 + 2θ))] / [θ²/n] = (1 + θ)²/(1 + 2θ) > 1 for θ > 0.

   θ̃ is asymptotically less efficient than θ̂.
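A few values of the ratio (1 + θ)²/(1 + 2θ) make the efficiency loss concrete (a small sketch; the θ values are arbitrary):

```python
# Asymptotic relative efficiency var(θ~)/var(θ^) = (1+θ)²/(1+2θ);
# it exceeds 1 for every θ > 0 and grows roughly like θ/2 for large θ.
for theta in (0.5, 1.0, 2.0, 5.0):
    are = (1 + theta) ** 2 / (1 + 2 * theta)
    print(f"theta = {theta}: ARE = {are:.3f}")   # 1.125, 1.333, 1.800, 3.273
```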


Example 5. Let θ > 0 and let X₁, …, X_n be a random sample from the distribution with the probability density function

   f(x; θ) = 2θ² x³ e^{−θx²},  x > 0.

a) Find E(X^k), k > −4.

   E(X^k) = ∫₀^∞ x^k · 2θ² x³ e^{−θx²} dx.

   Hint 1: Consider the substitution u = x² or u = θx².
   Hint 2: Γ(α) = ∫₀^∞ u^{α−1} e^{−u} du, α > 0.

   With u = θx², x = (u/θ)^{1/2}, du = 2θx dx:

   E(X^k) = ∫₀^∞ θ (u/θ)^{(k+2)/2} e^{−u} du = θ^{−k/2} ∫₀^∞ u^{k/2 + 1} e^{−u} du,

   so

   E(X^k) = Γ(k/2 + 2) / θ^{k/2}.
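The moment formula can be sanity-checked by numerical integration. This sketch is our own; the grid size, the upper cutoff, and the (θ, k) pair are arbitrary choices:

```python
import math

# Check E(X^k) = Γ(k/2 + 2) / θ^(k/2) for one (θ, k) pair
theta, k = 1.0, 1

def pdf(x):
    # f(x; θ) = 2θ² x³ e^{-θx²}, x > 0
    return 2 * theta**2 * x**3 * math.exp(-theta * x * x)

N, b = 100_000, 10.0      # midpoint rule on (0, b]; the tail beyond b is negligible
h = b / N
numeric = sum(((i + 0.5) * h) ** k * pdf((i + 0.5) * h) for i in range(N)) * h
exact = math.gamma(k / 2 + 2) / theta ** (k / 2)

print(numeric, exact)     # both ≈ Γ(5/2) = (3/4)√π ≈ 1.32934
```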
2

b) Obtain a method of moments estimator of θ, θ̃.

   E(X) = Γ(1/2 + 2)/θ^{1/2} = Γ(5/2)/√θ = (3/2)(1/2)Γ(1/2)/√θ = (3/4)√(π/θ).

   Setting X̄ = (3/4)√(π/θ̃₁) gives √(θ̃₁) = 3√π/(4X̄), so

   θ̃₁ = 9π / (16 (X̄)²).

   OR

   E(X²) = Γ(2/2 + 2)/θ^{2/2} = Γ(3)/θ = 2/θ.

   Setting (1/n) Σ_{i=1}^n x_i² = 2/θ̃₂ gives

   θ̃₂ = 2n / Σ_{i=1}^n x_i².
=1

c) Obtain the maximum likelihood estimator of θ, θ̂.

   L(θ; x) = ∏_{i=1}^n 2θ² x_i³ e^{−θx_i²} = 2ⁿ θ^{2n} ( ∏_{i=1}^n x_i³ ) exp( −θ Σ_{i=1}^n x_i² ).

   ℓ(θ; x) = ln[L(θ; x)] = n ln 2 + 2n ln θ − θ Σ_{i=1}^n x_i² + 3 Σ_{i=1}^n ln x_i.

   ∂ℓ(θ; x)/∂θ = 2n/θ − Σ_{i=1}^n x_i² = 0  ⟹  θ̂ = 2n / Σ_{i=1}^n x_i².

d) Suppose x₁ = 0.6, x₂ = 1.1, x₃ = 2.7, x₄ = 3.3, x₅ = 4.5. Compute θ̂, θ̃₁, and θ̃₂.

   Note that

   X̄ = 12.2/5 = 2.44  and  Σ_{i=1}^5 x_i² = 40.

   θ̂ = 2n/Σ x_i² = 10/40 = 0.25,

   θ̃₁ = 9π/(16 (X̄)²) = 9π/(16 (2.44)²) ≈ 0.29682,

   θ̃₂ = 2n/Σ x_i² = 10/40 = 0.25.
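The three estimates above can be recomputed directly (our own sketch; variable names are ours):

```python
import math

# Data from part d)
x = [0.6, 1.1, 2.7, 3.3, 4.5]
n = len(x)

xbar = sum(x) / n                             # 12.2 / 5 = 2.44
s2 = sum(xi * xi for xi in x)                 # Σ x_i² = 40

theta_hat = 2 * n / s2                        # MLE (and θ~₂): 10/40
theta_mom1 = 9 * math.pi / (16 * xbar**2)     # θ~₁ = 9π/(16 x̄²)

print(theta_hat)     # ≈ 0.25
print(theta_mom1)    # ≈ 0.29682
```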

e) What is the probability distribution of Y = X²?

   y = x² ⟹ x = g⁻¹(y) = √y and d g⁻¹(y)/dy = 1/(2√y). By the change-of-variable formula,

   f_Y(y) = f_X(g⁻¹(y)) | d g⁻¹(y)/dy | = 2θ² y^{3/2} e^{−θy} · 1/(2√y) = θ² y e^{−θy},  y > 0.

   Y ~ Gamma(α = 2, β = 1/θ).

f) What is the probability distribution of W = Σ_{i=1}^n X_i²?

   The i.i.d. assumption implies

   M_W(t) = E[ exp( t Σ_{i=1}^n X_i² ) ] = ∏_{i=1}^n M_{X_i²}(t) = [ (1 − t/θ)^{−2} ]ⁿ = (1 − t/θ)^{−2n},  t < θ,

   so W = Σ_{i=1}^n X_i² ~ Gamma(α = 2n, β = 1/θ).
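Since each X_i² ~ Gamma(2, 1/θ) is a sum of two Exponential(rate θ) variables, W can be simulated that way and compared with the Gamma(2n, 1/θ) moments E(W) = 2n/θ and Var(W) = 2n/θ² (our own sketch; θ, n, and the replication count are arbitrary):

```python
import random

random.seed(2)
theta, n, reps = 1.5, 4, 20000

# W = Σ X_i² simulated as a sum of 2n Exponential(rate θ) variables
ws = [sum(random.expovariate(theta) for _ in range(2 * n)) for _ in range(reps)]

mean_w = sum(ws) / reps
var_w = sum((w - mean_w) ** 2 for w in ws) / reps

print(mean_w, 2 * n / theta)       # both ≈ 5.333
print(var_w, 2 * n / theta**2)     # both ≈ 3.556
```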

g) Let W = Σ_{i=1}^n X_i². Find E(1/W).

   Recall W ~ Gamma(α = 2n, β = 1/θ):

   f_W(w) = θ^{2n} w^{2n−1} e^{−θw} / Γ(2n),  w > 0.

   E(1/W) = ∫₀^∞ (1/w) θ^{2n} w^{2n−1} e^{−θw} / Γ(2n) dw
          = [ θ Γ(2n − 1)/Γ(2n) ] ∫₀^∞ θ^{2n−1} w^{2n−2} e^{−θw} / Γ(2n − 1) dw
          = θ Γ(2n − 1)/Γ(2n)
          = θ / (2n − 1).

h) Is the maximum likelihood estimator of θ, θ̂, an unbiased estimator of θ? If not, construct an unbiased estimator of θ based on θ̂.

   E(θ̂) = E(2n/W) = 2n E(1/W) = 2nθ/(2n − 1) ≠ θ,

   so θ̂ is not an unbiased estimator of θ.

   Consider

   θ̂′ = [(2n − 1)/(2n)] θ̂ = (2n − 1) / Σ_{i=1}^n x_i².

   Then

   E(θ̂′) = (2n − 1) E(1/W) = (2n − 1) · θ/(2n − 1) = θ.
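The bias correction can be illustrated by simulating W directly as a sum of 2n Exponential(rate θ) variables (our own sketch; θ = 2, n = 5, and the replication count are arbitrary):

```python
import random

random.seed(3)
theta, n, reps = 2.0, 5, 30000

# W ~ Gamma(2n, 1/θ); compare the raw MLE 2n/W with the corrected (2n-1)/W
mle_sum = corrected_sum = 0.0
for _ in range(reps):
    w = sum(random.expovariate(theta) for _ in range(2 * n))
    mle_sum += 2 * n / w
    corrected_sum += (2 * n - 1) / w

print(mle_sum / reps)         # ≈ 2nθ/(2n-1) = 20/9 ≈ 2.22: biased upward
print(corrected_sum / reps)   # ≈ θ = 2.0: unbiased
```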

i) Show that θ̂ and θ̃₁ are consistent estimators of θ.

   θ̂ = 2n/Σ_{i=1}^n X_i² = 2 / [ (1/n) Σ_{i=1}^n X_i² ]. By the Law of Large Numbers, (1/n) Σ_{i=1}^n X_i² → E(X²) = 2/θ in probability, so θ̂ → 2/(2/θ) = θ in probability.

   θ̃₁ = 9π/(16 (X̄)²). Since X̄ → E(X) = (3/4)√(π/θ) in probability, 16 (X̄)² → 16 (3/4)² (π/θ) = 9π/θ in probability, so θ̃₁ → 9π/(9π/θ) = θ in probability.

j) Find MSE(θ̂) = E[(θ̂ − θ)²] = Var(θ̂) + [bias(θ̂)]² (assume n ≥ 2, so that E(1/W²) is finite).

   E(1/W²) = ∫₀^∞ (1/w²) θ^{2n} w^{2n−1} e^{−θw} / Γ(2n) dw = θ² Γ(2n − 2)/Γ(2n) = θ² / ((2n − 1)(2n − 2)),

   so E(θ̂²) = 4n² E(1/W²) = 4n²θ² / ((2n − 1)(2n − 2)).

   Var(θ̂) = E(θ̂²) − [E(θ̂)]² = 4n²θ²/((2n − 1)(2n − 2)) − 4n²θ²/(2n − 1)² = 4n²θ² / ((2n − 1)²(2n − 2)).

   [bias(θ̂)]² = [ 2nθ/(2n − 1) − θ ]² = θ²/(2n − 1)².

   MSE(θ̂) = 4n²θ²/((2n − 1)²(2n − 2)) + θ²/(2n − 1)² = 2(n + 1)θ² / ((2n − 1)(2n − 2)).
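The algebra in the last step can be double-checked numerically (our own sketch; the θ value and the sample sizes are arbitrary):

```python
# Verify that Var(θ^) + bias² matches the simplified closed form
# MSE(θ^) = 2(n+1)θ² / ((2n-1)(2n-2)) for several n (n ≥ 2)
theta = 1.3
for n in (2, 5, 20, 100):
    var = 4 * n**2 * theta**2 / ((2 * n - 1) ** 2 * (2 * n - 2))
    bias_sq = (theta / (2 * n - 1)) ** 2
    closed = 2 * (n + 1) * theta**2 / ((2 * n - 1) * (2 * n - 2))
    print(n, abs(var + bias_sq - closed) < 1e-12)   # True for each n
```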