
Econometrics 1

Lecture 3
BLUE Properties of OLS Estimators

Unbiasedness of Estimates-1
Unbiasedness of estimates

$$E(\hat\beta_1) = \beta_1$$

Proof:

For the model $y_i = \beta_1 + \beta_2 x_i + e_i$, the OLS estimators are

$$\hat\beta_2 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2} = \sum_i w_i y_i \,, \qquad \hat\beta_1 = \bar{y} - \hat\beta_2 \bar{x} \,,$$

where $w_i = \dfrac{x_i - \bar{x}}{\sum_i (x_i - \bar{x})^2}$.

Writing $\hat\beta_1$ as a linear function of the $y_i$,

$$\hat\beta_1 = \frac{1}{n}\sum_i y_i - \bar{x} \sum_i w_i y_i = \sum_i \left(\frac{1}{n} - \bar{x} w_i\right) y_i \,,$$

so that, with $E(y_i) = \beta_1 + \beta_2 x_i$,

$$E(\hat\beta_1) = \sum_i \left(\frac{1}{n} - \bar{x} w_i\right)(\beta_1 + \beta_2 x_i) = \beta_1 + \beta_2 \bar{x} - \bar{x}\left(\beta_1 \sum_i w_i + \beta_2 \sum_i w_i x_i\right).$$

Since $\sum_i w_i = 0$ and $\sum_i w_i x_i = 1$,

$$E(\hat\beta_1) = \beta_1 + \beta_2 \bar{x} - \beta_2 \bar{x} = \beta_1 \,.$$
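The two weight identities used in the last step are easy to check numerically. A minimal Python/NumPy sketch (not part of the lecture), with the $x$ values taken from the Shazam example later in this lecture:

```python
import numpy as np

# x values from the Shazam example below (any sample would do)
x = np.array([5, 8, 10, 12, 14, 17, 20, 25], dtype=float)

# OLS weights w_i = (x_i - xbar) / sum_j (x_j - xbar)^2
w = (x - x.mean()) / np.sum((x - x.mean()) ** 2)

print(np.isclose(w.sum(), 0.0))   # sum_i w_i     = 0
print(np.isclose(w @ x, 1.0))     # sum_i w_i x_i = 1
```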

Unbiasedness of Estimates-2
Proof:

$$\hat\beta_2 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2} = \sum_i w_i (y_i - \bar{y}) = \sum_i w_i y_i \,,$$

the last step using $\sum_i w_i = 0$. Substituting $y_i = \beta_1 + \beta_2 x_i + e_i$,

$$E(\hat\beta_2) = E\Big(\sum_i w_i y_i\Big) = E\Big[\sum_i w_i (\beta_1 + \beta_2 x_i + e_i)\Big] = \beta_1 \sum_i w_i + \beta_2 \sum_i w_i x_i + \sum_i w_i E(e_i) = \beta_2 \,.$$
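Both unbiasedness results can be illustrated by simulation. A sketch in Python/NumPy (the "true" values $\beta_1 = -2$, $\beta_2 = 1$, $\sigma = 1$ are arbitrary choices for the illustration, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
beta1, beta2, sigma = -2.0, 1.0, 1.0   # arbitrary "true" parameters
x = np.array([5, 8, 10, 12, 14, 17, 20, 25], dtype=float)

b1_draws, b2_draws = [], []
for _ in range(20_000):
    y = beta1 + beta2 * x + rng.normal(0.0, sigma, size=x.size)
    b2 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b1 = y.mean() - b2 * x.mean()
    b1_draws.append(b1)
    b2_draws.append(b2)

# Averages over many samples should be close to the true parameters
print(np.mean(b1_draws), np.mean(b2_draws))
```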

Variance of Estimator-1
Variance of estimated parameters and the dependent variable

$$\operatorname{var}(\hat\beta_1) = \sigma^2\left[\frac{1}{n} + \frac{\bar{x}^2}{\sum_i (x_i - \bar{x})^2}\right] = \frac{\sigma^2 \sum_i x_i^2}{n \sum_i (x_i - \bar{x})^2}$$

Proof: since $\hat\beta_1 = \sum_i \left(\frac{1}{n} - \bar{x} w_i\right) y_i$ and the $y_i$ are independent with $\operatorname{var}(y_i) = \sigma^2$,

$$\operatorname{var}(\hat\beta_1) = \sum_i \left(\frac{1}{n} - \bar{x} w_i\right)^2 \operatorname{var}(y_i) = \sigma^2 \sum_i \left(\frac{1}{n^2} - \frac{2\bar{x} w_i}{n} + \bar{x}^2 w_i^2\right).$$

Using $\sum_i w_i = 0$ and $\sum_i w_i^2 = 1/\sum_i (x_i - \bar{x})^2$,

$$\operatorname{var}(\hat\beta_1) = \sigma^2\left[\frac{1}{n} + \frac{\bar{x}^2}{\sum_i (x_i - \bar{x})^2}\right].$$

For the slope,

$$\operatorname{var}(\hat\beta_2) = \frac{\sigma^2}{\sum_i (x_i - \bar{x})^2} \,.$$

Variance-2

$$\operatorname{var}(\hat\beta_2) = \frac{\sigma^2}{\sum_i (x_i - \bar{x})^2}$$

It was proved above that

$$\hat\beta_2 = \sum_i w_i y_i = \sum_i w_i (\beta_1 + \beta_2 x_i + e_i) = \beta_2 + \sum_i w_i e_i \,,$$

so that

$$\operatorname{var}(\hat\beta_2) = E\big[\hat\beta_2 - E(\hat\beta_2)\big]^2 = E\Big(\sum_i w_i e_i\Big)^2 = \sum_i w_i^2 E(e_i^2) = \frac{\sigma^2}{\sum_i (x_i - \bar{x})^2} \,.$$

For the covariance of the two estimators,

$$\operatorname{cov}(\hat\beta_1, \hat\beta_2) = E\big[\big(\hat\beta_1 - E(\hat\beta_1)\big)\big(\hat\beta_2 - E(\hat\beta_2)\big)\big] = -\bar{x}\operatorname{var}(\hat\beta_2) \,,$$

where use is made of the fact that

$$\hat\beta_1 - E(\hat\beta_1) = (\bar{y} - \hat\beta_2 \bar{x}) - \beta_1 = \bar{e} - \bar{x}\big(\hat\beta_2 - \beta_2\big),$$

and the $\bar{e}$ term contributes nothing because $E\big(\bar{e}\sum_i w_i e_i\big) = \frac{\sigma^2}{n}\sum_i w_i = 0$.

How is the variance of $Y_i$ affected by the variance of $e_i$?

$$\operatorname{var}(Y_i) = \sigma^2$$

Proof:

$$\operatorname{var}(Y_i) = E\big[y_i - E(y_i)\big]^2 = E\big[(\beta_1 + \beta_2 x_i + e_i) - (\beta_1 + \beta_2 x_i)\big]^2 = E(e_i^2) = \sigma^2 \,,$$

since $E(e_i) = 0$: the dependent variable inherits its variance entirely from the disturbance.
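These formulas can be checked against a simulation. A minimal Python/NumPy sketch (the true parameter values are arbitrary assumptions for the illustration, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(1)
beta1, beta2, sigma = -2.0, 1.0, 1.0   # arbitrary "true" parameters
x = np.array([5, 8, 10, 12, 14, 17, 20, 25], dtype=float)
n, Sxx = x.size, np.sum((x - x.mean()) ** 2)

draws = []
for _ in range(50_000):
    y = beta1 + beta2 * x + rng.normal(0.0, sigma, size=n)
    b2 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
    b1 = y.mean() - b2 * x.mean()
    draws.append((b1, b2))
draws = np.array(draws)

# Simulated moments vs. the analytic formulas derived above
print(np.var(draws[:, 0]), sigma**2 * (1 / n + x.mean() ** 2 / Sxx))  # var(b1)
print(np.var(draws[:, 1]), sigma**2 / Sxx)                            # var(b2)
print(np.cov(draws.T)[0, 1], -x.mean() * sigma**2 / Sxx)              # cov(b1, b2)
```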

A small Shazam program

```
sample 1 8
read y constant x
4  1  5
6  1  8
7  1  10
8  1  12
11 1  14
15 1  17
18 1  20
22 1  25
stat y
ols y x / pcov
stop
```
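For readers without Shazam, the same regression can be reproduced in Python/NumPy (a sketch, not the course software; `np.linalg.lstsq` computes the ordinary least squares fit):

```python
import numpy as np

# The same 8 observations as in the Shazam program above
y = np.array([4, 6, 7, 8, 11, 15, 18, 22], dtype=float)
x = np.array([5, 8, 10, 12, 14, 17, 20, 25], dtype=float)

X = np.column_stack([x, np.ones_like(x)])          # regressors: x and constant
beta, sse, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit

print(beta)   # approx [ 0.95873, -1.9274 ], as in the Shazam output below
print(sse)    # sum of squared errors, approx [ 5.4841 ]
```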

Shazam Output: an Example

Results from Shazam:

```
|_sample 1 8
|_read y constant x
 3 VARIABLES AND 8 OBSERVATIONS STARTING AT OBS 1
|_stat y
 NAME       N     MEAN      ST. DEV   VARIANCE   MINIMUM   MAXIMUM
 Y          8     11.375    6.3682    40.554     4.0000    22.000
|_ols y x / pcov
 REQUIRED MEMORY IS PAR= 1 CURRENT PAR= 2000
 OLS ESTIMATION
 8 OBSERVATIONS     DEPENDENT VARIABLE= Y
 ...NOTE..SAMPLE RANGE SET TO: 1, 8

 R-SQUARE = 0.9807    R-SQUARE ADJUSTED = 0.9775
 VARIANCE OF THE ESTIMATE-SIGMA**2 = 0.91402
 STANDARD ERROR OF THE ESTIMATE-SIGMA = 0.95604
 SUM OF SQUARED ERRORS-SSE= 5.4841
 MEAN OF DEPENDENT VARIABLE = 11.375
 LOG OF THE LIKELIHOOD FUNCTION = -9.84116

 VARIABLE  ESTIMATED    STANDARD    T-RATIO          PARTIAL  STANDARDIZED  ELASTICITY
 NAME      COEFFICIENT  ERROR       6 DF     P-VALUE CORR.    COEFFICIENT   AT MEANS
 X          0.95873     0.5493E-01   17.45    0.000   0.990    0.9903        1.1694
 CONSTANT  -1.9274      0.8338      -2.312    0.060  -0.686    0.0000       -0.1694

 VARIANCE-COVARIANCE MATRIX OF COEFFICIENTS
              X             CONSTANT
 X             0.30178E-02
 CONSTANT     -0.41872E-01   0.69523
|_stop
 TYPE COMMAND
```
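The coefficient covariance matrix printed by `pcov` is exactly what the formulas from the variance slides give, with $\sigma^2$ replaced by its estimate $\mathrm{SSE}/(n-2)$. A quick Python/NumPy check (a sketch, not Shazam itself):

```python
import numpy as np

# Reproduce Shazam's coefficient covariance matrix from the variance formulas,
# with sigma^2 estimated by SSE / (n - 2).
y = np.array([4, 6, 7, 8, 11, 15, 18, 22], dtype=float)
x = np.array([5, 8, 10, 12, 14, 17, 20, 25], dtype=float)
n, Sxx = x.size, np.sum((x - x.mean()) ** 2)

b2 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
b1 = y.mean() - b2 * x.mean()
sse = np.sum((y - b1 - b2 * x) ** 2)
s2 = sse / (n - 2)                      # ~0.91402, SIGMA**2 in the output

print(s2 / Sxx)                         # var(b2)    ~  0.30178E-02
print(-x.mean() * s2 / Sxx)             # cov(b1,b2) ~ -0.41872E-01
print(s2 * np.sum(x**2) / (n * Sxx))    # var(b1)    ~  0.69523
```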

Coefficient of determination
The coefficient of determination is a measure in regression analysis of the explanatory power of the independent variables (regressors) in explaining the variation in the dependent variable (regressand). The total variation in the dependent variable can be decomposed as follows. Writing $y_i - \bar{y} = (\hat{y}_i - \bar{y}) + e_i$,

$$\sum_i (y_i - \bar{y})^2 = \sum_i (\hat{y}_i - \bar{y})^2 + \sum_i e_i^2 + 2\sum_i (\hat{y}_i - \bar{y}) e_i = \sum_i (\hat{y}_i - \bar{y})^2 + \sum_i e_i^2 \,,$$

since the cross term $\sum_i (\hat{y}_i - \bar{y}) e_i$ vanishes for OLS residuals.

For T observations and K explanatory variables (counting the constant):

[Total variation] = [Explained variation] + [Residual variation]
df:      T-1      =        K-1            +        T-K

Coefficient of determination

$$R^2 = \frac{\sum_i (\hat{y}_i - \bar{y})^2}{\sum_i (y_i - \bar{y})^2} = 1 - \frac{\sum_i e_i^2}{\sum_i (y_i - \bar{y})^2} \,; \qquad 0 \le R^2 \le 1 \,.$$

Total variation = explained variation + unexplained variation.

From the Shazam output above:

R-SQUARE = 0.9807    R-SQUARE ADJUSTED = 0.9775
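Both numbers can be recomputed from the data. A Python/NumPy sketch (the adjusted $R^2$ formula $1 - (1 - R^2)(T-1)/(T-K)$ is the standard one and is assumed here, not stated in the lecture):

```python
import numpy as np

# R-squared for the lecture's data; T = 8 observations, K = 2 (slope + constant)
y = np.array([4, 6, 7, 8, 11, 15, 18, 22], dtype=float)
x = np.array([5, 8, 10, 12, 14, 17, 20, 25], dtype=float)
T, K = y.size, 2

b2 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b1 = y.mean() - b2 * x.mean()
e = y - b1 - b2 * x

r2 = 1 - np.sum(e**2) / np.sum((y - y.mean()) ** 2)
r2_adj = 1 - (1 - r2) * (T - 1) / (T - K)
print(r2, r2_adj)   # ~0.9807 and ~0.9775, as in the Shazam output
```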

Comparison of the OLS and Another Alternative Estimator
Efficiency of parameters in weighted least squares

a) $b_w$ can be written as a linear function of the random variables $Y_1$, $Y_2$ and $Y_3$:

$$b_w = \sum_i a_i Y_i \,,$$

where the linear weights $a_i = \tfrac{1}{2}, \tfrac{1}{3}, \tfrac{1}{6}$ are applied to $Y_1$, $Y_2$ and $Y_3$. Therefore $b_w$ is a linear estimator.

b) $b_w$ is unbiased, because

$$E(b_w) = \tfrac{1}{2} E(Y_1) + \tfrac{1}{3} E(Y_2) + \tfrac{1}{6} E(Y_3) = \left(\tfrac{1}{2} + \tfrac{1}{3} + \tfrac{1}{6}\right)\mu = \mu \,.$$

c) Given that $Y_1$, $Y_2$ and $Y_3$ have the $N(\mu, \sigma^2)$ distribution, the variance of $b_w$ can be computed as

$$\operatorname{var}(b_w) = \tfrac{1}{4}\operatorname{var}(Y_1) + \tfrac{1}{9}\operatorname{var}(Y_2) + \tfrac{1}{36}\operatorname{var}(Y_3) = \left(\tfrac{1}{4} + \tfrac{1}{9} + \tfrac{1}{36}\right)\sigma^2 = \tfrac{7}{18}\sigma^2 \,.$$

Comparison of the OLS and Another Alternative Estimator-2

d) The variance of the OLS estimator (the equally weighted sample mean of the three observations) is

$$\operatorname{var}(b) = \frac{\sigma^2}{3} \,.$$

e) $b_w$ will be a good estimator if its variance is less than that of $b$, that is, if $\operatorname{var}(b_w) < \operatorname{var}(b)$; it would be a bad estimator if $\operatorname{var}(b_w) > \operatorname{var}(b)$.

f) If $\sigma^2 = 9$, then

$$\operatorname{var}(b_w) = \tfrac{7}{18} \cdot 9 = 3.5 \,, \qquad \operatorname{var}(b) = \tfrac{9}{3} = 3 \,.$$

Therefore the least squares estimator $b$ has a smaller variance than the weighted least squares estimator $b_w$.
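The comparison is easy to confirm by simulation. A minimal Python/NumPy sketch ($\sigma^2 = 9$ as in point f; $\mu = 0$ is an arbitrary assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma2 = 0.0, 9.0                   # sigma^2 = 9 as in point f); mu arbitrary
Y = rng.normal(mu, np.sqrt(sigma2), size=(100_000, 3))

b_w = Y @ np.array([1/2, 1/3, 1/6])     # weighted estimator
b = Y.mean(axis=1)                      # OLS estimator (sample mean)

# Expect roughly 3.5 for b_w and 3.0 for b
print(np.var(b_w), np.var(b))
```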

