
AP 613: Econometrics-I LECTURE 2: MATRIX APPROACH

MULTIPLE REGRESSION ANALYSIS
Dr Mudassir Uddin

Introduction

The general form of the multiple linear regression model is

$$y_i = f(x_{1i}, x_{2i}, \ldots, x_{ki}) + u_i$$

$$y_i = \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_k x_{ki} + u_i, \qquad i = 1, \ldots, n$$

This can be expressed in summation form as

$$y_i = \sum_{k=1}^{K} \beta_k x_{ki} + u_i$$

or
$$y = X\beta + u$$

in matrix form, where


$$y = \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix}, \quad X = \begin{pmatrix} x_{11} & \cdots & x_{1k} \\ \vdots & & \vdots \\ x_{n1} & \cdots & x_{nk} \end{pmatrix}, \quad \beta = \begin{pmatrix} \beta_1 \\ \vdots \\ \beta_k \end{pmatrix}, \quad u = \begin{pmatrix} u_1 \\ \vdots \\ u_n \end{pmatrix}$$

$x_1$ is a column of ones,¹ i.e. $[x_{11} \; \cdots \; x_{n1}]^T = [1 \; \cdots \; 1]^T$.

$y$ is the dependent variable (response variable) and is a random vector. $X$ is the matrix of data on the explanatory variables (a.k.a. independent variables, regressors, covariates). $\beta$ is the parameter vector. $u$ is a random vector.

Assumptions
¹ Therefore $x_2^T$ represents the transpose of the second row of $X$, and so on.

A1. Linearity

Linearity of the parameters (and the disturbance) entering the model. Sometimes models that appear to be non-linear can be estimated using the least-squares procedure. For example,

$$y = e^{\beta_1} \prod_{k=2}^{K} x_k^{\beta_k} \, e^{u}$$

is a non-linear function. However, taking a logarithmic transformation yields

$$\ln(y) = \beta_1 + \sum_{k=2}^{K} \beta_k \ln(x_k) + u$$
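As a sketch, the log transformation can be checked numerically with simulated data; the coefficient values, sample size, and noise level below are illustrative choices, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated data for the multiplicative model y = exp(b1) * x2^b2 * x3^b3 * exp(u)
# (hypothetical coefficient values chosen for illustration)
b1, b2, b3 = 0.5, 1.2, -0.7
x2 = rng.uniform(1.0, 3.0, n)
x3 = rng.uniform(1.0, 3.0, n)
u = rng.normal(0.0, 0.1, n)
y = np.exp(b1) * x2**b2 * x3**b3 * np.exp(u)

# Taking logs linearises the model: ln(y) = b1 + b2*ln(x2) + b3*ln(x3) + u,
# so ordinary least squares applies to the transformed variables.
X = np.column_stack([np.ones(n), np.log(x2), np.log(x3)])
b_hat = np.linalg.solve(X.T @ X, X.T @ np.log(y))
print(b_hat)   # close to (0.5, 1.2, -0.7)
```

Least squares on the logged variables recovers the parameters of the original non-linear model.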

A2. E(u) = 0

The disturbance term has a zero mean:

$$E(u) = E \begin{pmatrix} u_1 \\ \vdots \\ u_n \end{pmatrix} = \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix} = 0$$

A3. Nonstochastic Regressors

$X$ is a non-stochastic $n \times k$ matrix. That is, it consists of a set of fixed numbers. In general, this assumption can be relaxed somewhat by assuming that $X$ is fixed in repeated samples.

A4. Spherical Disturbances

$$\mathrm{Var}(u) = E(uu^T) = \sigma^2 I$$

$$E(uu^T) = \begin{pmatrix} E(u_1^2) & E(u_1 u_2) & \cdots & E(u_1 u_n) \\ E(u_2 u_1) & E(u_2^2) & \cdots & E(u_2 u_n) \\ \vdots & \vdots & \ddots & \vdots \\ E(u_n u_1) & E(u_n u_2) & \cdots & E(u_n^2) \end{pmatrix} = \begin{pmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma^2 \end{pmatrix} = \sigma^2 I$$

Therefore the requirement for spherical disturbances is

i) $\mathrm{Var}(u_i) \equiv E(u_i^2) = \sigma^2, \qquad i = 1, \ldots, n$

and

ii) $\mathrm{Cov}(u_i, u_j) \equiv E(u_i u_j) = 0, \qquad i \neq j$

i) is the assumption of homoskedasticity (equal variance); ii) is the assumption of no serial correlation (no autocorrelation).
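Conditions i) and ii) together say $E(uu^T) = \sigma^2 I$. A quick simulated check, with $\sigma$ and the dimensions chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 5, 100000
sigma = 1.5

# Draw many replications of an n-vector of iid disturbances with variance sigma^2
U = rng.normal(0.0, sigma, size=(reps, n))

# Sample analogue of E(u u^T): average of u u^T over replications
S = (U.T @ U) / reps
print(np.round(S, 2))   # approximately sigma^2 * I = 2.25 * I
```

The diagonal entries estimate the common variance (homoskedasticity) and the off-diagonal entries are near zero (no serial correlation).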

A5. Identifiability

We assume there are no exact linear relationships among the variables (no perfect multicollinearity). Specifically, $X$ is $n \times K$ with rank $K$ ($X$ has full rank). In words, the columns of $X$ are linearly independent. Implicit within this assumption are the requirements of more observations than variables (no micronumerosity) and sufficient variability in the values of the regressors.

A6. Normality

The final assumption we introduce, which is useful for the purposes of statistical inference but not necessary for analysing the properties of the estimators, is that the disturbances are normally distributed:

$$u \sim N(0, \sigma^2 I)$$

Least Squares Estimation

The sample counterpart of the $k$-variable regression model is

$$y = Xb + e$$

where $b$ and $e$ are the sample counterparts of $\beta$ and $u$ respectively. Our aim is to minimise the difference between an observed value of $y_i$ and its predicted value; that is, we want the error to be as small as possible.

$$\hat{y} = Xb$$

$$e = y - \hat{y}$$

Specifically, the problem is to find an estimator that minimises the error sum of squares:

$$e^T e = (y - \hat{y})^T (y - \hat{y}) = (y - Xb)^T (y - Xb)$$

$$= y^T y + b^T X^T X b - y^T X b - b^T X^T y$$

$$= y^T y + b^T X^T X b - 2 b^T X^T y \qquad (\text{since the scalar } b^T X^T y = y^T X b)$$

A necessary condition for a minimum is that the first-order conditions equal zero:

$$\frac{\partial (e^T e)}{\partial b} = 2 X^T X b - 2 X^T y = 0$$

Therefore, rearranging gives

$$(X^T X) b = X^T y \qquad \text{(normal equations)}$$

$$b = (X^T X)^{-1} X^T y$$

$X^T X$ is invertible provided $X$ has full rank. (If $X$ does not have full rank then the determinant of $X^T X$ is zero and so it cannot be inverted.) For this solution to be a minimum we require the second derivative to be positive definite:

$$\frac{\partial^2 (e^T e)}{\partial b \, \partial b^T} = 2 X^T X$$

which is positive definite when $X$ has full rank.
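A minimal numpy sketch of the normal-equation solution, together with the full-rank requirement; the data are simulated and all numbers are illustrative, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

# Simulated design matrix with a column of ones, and illustrative parameters
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(0.0, 1.0, n)

# b = (X^T X)^{-1} X^T y: solve the normal equations (X^T X) b = X^T y
# rather than forming the inverse explicitly
b = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against numpy's least-squares routine
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(b, b_lstsq))   # True

# If X lacks full rank, X^T X is singular and the normal equations have no
# unique solution: duplicate a column to force perfect multicollinearity
X_bad = np.column_stack([X, X[:, 1]])
print(np.linalg.matrix_rank(X_bad))   # 3, not 4: X_bad^T X_bad cannot be inverted
```

Solving the normal equations directly agrees with `np.linalg.lstsq`, and the rank check shows why assumption A5 is needed for the estimator to exist.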

Statistical Properties of the Least Squares Estimators (Finite Samples)

Mean of $b$


$$b = (X^T X)^{-1} X^T y = (X^T X)^{-1} X^T (X\beta + u) = \beta + (X^T X)^{-1} X^T u$$

$$E(b) = E[\beta] + E\big[(X^T X)^{-1} X^T u\big] = \beta + (X^T X)^{-1} X^T E(u) = \beta$$

[Key assumptions: $E(u) = 0$ and non-stochastic regressors.]
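The unbiasedness result $E(b) = \beta$ can be illustrated with a small Monte Carlo experiment; the parameters and sample size are illustrative, and $X$ is held fixed across replications in line with A3.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 50, 5000
beta = np.array([1.0, 2.0])              # illustrative true values

# Fixed (non-stochastic) regressors, reused in every replication (A3)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
XtX = X.T @ X

# Average b over many draws of u with E(u) = 0 (A2)
b_sum = np.zeros(2)
for _ in range(reps):
    y = X @ beta + rng.normal(0.0, 1.0, n)
    b_sum += np.linalg.solve(XtX, X.T @ y)
b_bar = b_sum / reps
print(b_bar)   # close to [1.0, 2.0]
```

Each individual estimate varies around $\beta$, but the average over replications converges to it, which is exactly what $E(b) = \beta$ asserts.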

Variance of $b$

$$\mathrm{Var}(b) = E\big[(b - \beta)(b - \beta)^T\big]$$

Noting that $b - \beta = (X^T X)^{-1} X^T u$, then


$$\mathrm{Var}(b) = E\big[(X^T X)^{-1} X^T u u^T X (X^T X)^{-1}\big] = (X^T X)^{-1} X^T E(uu^T) X (X^T X)^{-1} = (X^T X)^{-1} X^T (\sigma^2 I) X (X^T X)^{-1} = \sigma^2 (X^T X)^{-1}$$

[Key assumptions: $\mathrm{Var}(u_i) \equiv E(u_i^2) = \sigma^2$ and $\mathrm{Cov}(u_i, u_j) \equiv E(u_i u_j) = 0$ for all $i \neq j$.]
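A sketch checking $\mathrm{Var}(b) = \sigma^2 (X^T X)^{-1}$ by simulation; $\sigma$, $\beta$, and the dimensions are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 40, 20000
sigma = 2.0
beta = np.array([1.0, -1.0])             # illustrative values

# Fixed regressors (A3), held constant across replications
X = np.column_stack([np.ones(n), rng.normal(size=n)])
XtX = X.T @ X

# Theoretical covariance of b under A2-A4: sigma^2 (X^T X)^{-1}
V_theory = sigma**2 * np.linalg.inv(XtX)

# Monte Carlo covariance of b over many draws of u
bs = np.empty((reps, 2))
for r in range(reps):
    y = X @ beta + rng.normal(0.0, sigma, n)
    bs[r] = np.linalg.solve(XtX, X.T @ y)
V_mc = np.cov(bs, rowvar=False)
print(np.max(np.abs(V_mc - V_theory)))   # small: only Monte Carlo sampling error remains
```

The simulated covariance matrix of the estimates matches the finite-sample formula derived above, up to Monte Carlo noise.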
