
Multiple Regression and Other
Extensions of The Simple Linear
Regression Model

Model with two explanatory variables
The Normal Equations

e.g.: the theory of demand
The quantity demanded of a given commodity (Y) depends on its price (X1)
and on consumers' income (X2).

The relationship is assumed to be of the linear form

Y = β0 + β1X1 + β2X2

In any observation, some factors that influence the quantity demanded are
omitted from the function. They may be taken into account by introducing a
random variable u into the function, which then becomes stochastic:

Y = β0 + β1X1 + β2X2 + u

where β0 + β1X1 + β2X2 is the systematic component and u is the random
component.

The coefficient β1 (of price) is expected to have a negative sign, while
β2 (of income) is expected to be positive.
To complete the specification of the model, assumptions about the random
variable u are needed.
Assumption 1 (randomness of u)
ui is a random real variable: it may be positive, negative or zero.
Assumption 2 (zero mean of u)
The mean value of u in any particular period is zero: E(ui) = 0.
Assumption 3 (homoscedasticity)
The variance of ui is constant in each period: Var(ui) = σu².
Assumption 4 (normality of u)
The variable ui has a normal distribution.

Assumption 5 (nonautocorrelation)
The random terms of different observations (ui, uj) are independent: the
covariance of any ui with any other uj is equal to zero,
Cov(ui, uj) = 0 for i ≠ j.
Assumption 6
u is independent of the explanatory variable(s): their covariance is zero,
Cov(u, Xi) = 0.
Assumption 6A
The Xi's are a set of fixed values in the hypothetical process of repeated
sampling which underlies the linear regression model.
Assumption 7 (no errors of measurement in the Xs)
The explanatory variable(s) are measured without error.
Assumption 8 (no perfect multicollinearity among the Xs)
The explanatory variables are not perfectly linearly correlated.
Assumption 9
The macrovariables should be correctly
aggregated
Assumption 10
The relationship being estimated is
identified
Assumption 11
The relationship is correctly specified

The parameters of the specified model must be estimated. The estimated
relationship is

Ŷ = b0 + b1X1 + b2X2

where b0, b1, b2 are estimates of the true parameters β0, β1, β2.
The estimates are obtained by minimising the sum of squared residuals

Σe² = Σ(Y − b0 − b1X1 − b2X2)²

For this sum to attain a minimum, its partial derivatives with respect to
b0, b1 and b2 must be equal to zero.

From the partial differentiation we obtain three normal equations:

ΣY = nb0 + b1ΣX1 + b2ΣX2
ΣX1Y = b0ΣX1 + b1ΣX1² + b2ΣX1X2
ΣX2Y = b0ΣX2 + b1ΣX1X2 + b2ΣX2²

Other formulae may also be used to obtain values for the parameter
estimates. The following formulae are formally derived by solving the
system of normal equations:

b1 = (Σx1y·Σx2² − Σx2y·Σx1x2) / (Σx1²·Σx2² − (Σx1x2)²)
b2 = (Σx2y·Σx1² − Σx1y·Σx1x2) / (Σx1²·Σx2² − (Σx1x2)²)
b0 = Ȳ − b1X̄1 − b2X̄2

where y = Y − Ȳ, x1 = X1 − X̄1, and x2 = X2 − X̄2 are deviations from the
sample means.
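As a concrete check of these deviation-form formulae, the sketch below applies them to a small made-up price/income data set (all numbers are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical data: quantity demanded (Y), price (X1), income (X2)
Y  = np.array([100.0, 95.0, 90.0, 88.0, 84.0, 80.0])
X1 = np.array([  2.0,  2.5,  3.0,  3.5,  4.0,  4.5])   # price
X2 = np.array([ 10.0, 11.0, 11.5, 12.5, 13.0, 14.0])   # income

# Deviations from the sample means
y  = Y  - Y.mean()
x1 = X1 - X1.mean()
x2 = X2 - X2.mean()

# Deviation-form solutions of the normal equations
den = np.sum(x1**2) * np.sum(x2**2) - np.sum(x1 * x2)**2
b1  = (np.sum(x1*y) * np.sum(x2**2) - np.sum(x2*y) * np.sum(x1*x2)) / den
b2  = (np.sum(x2*y) * np.sum(x1**2) - np.sum(x1*y) * np.sum(x1*x2)) / den
b0  = Y.mean() - b1 * X1.mean() - b2 * X2.mean()

print(b0, b1, b2)
```

The same estimates fall out of any general least-squares routine; the hand formulae are just the K = 3 special case.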

The Concept of Multiple Determination (or the squared
multiple correlation coefficient) R²
The square of the correlation coefficient is called the coefficient of
multiple determination or squared multiple correlation coefficient.
R² shows the percentage of the total variation of Y explained by the
regression plane. The higher R², the greater the percentage of the
variation of Y explained by the regression plane, that is, the better the
goodness of fit; the closer R² is to zero, the worse the fit.

Note: Σe² = Σy² − b1Σx1y − b2Σx2y

Therefore, we can calculate the value of R² by substituting the residuals:

R² = 1 − Σe²/Σy² = (b1Σx1y + b2Σx2y)/Σy²
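A minimal sketch of computing R² from the residuals, using hypothetical data and numpy's least-squares solver in place of the hand-derived formulae:

```python
import numpy as np

# Same hypothetical demand data as before
Y  = np.array([100.0, 95.0, 90.0, 88.0, 84.0, 80.0])
X1 = np.array([  2.0,  2.5,  3.0,  3.5,  4.0,  4.5])
X2 = np.array([ 10.0, 11.0, 11.5, 12.5, 13.0, 14.0])

# Fit by least squares (design matrix with a constant column)
A = np.column_stack([np.ones_like(Y), X1, X2])
b0, b1, b2 = np.linalg.lstsq(A, Y, rcond=None)[0]

# R^2 = 1 - sum(e^2) / sum(y^2), where y are deviations of Y from its mean
e  = Y - (b0 + b1 * X1 + b2 * X2)     # residuals
R2 = 1.0 - np.sum(e**2) / np.sum((Y - Y.mean())**2)
print(R2)
```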

The Mean and Variance of the Parameter Estimates

The least squares estimates b0, b1, b2 are unbiased: their mean (expected
value) is the true parameter itself, E(bi) = βi.
The variances of the parameter estimates are obtained from the following
formulae:

Var(b1) = σ̂u²·Σx2² / (Σx1²·Σx2² − (Σx1x2)²)
Var(b2) = σ̂u²·Σx1² / (Σx1²·Σx2² − (Σx1x2)²)

where σ̂u² = Σe²/(n − K), K being the total number of parameters which are
estimated. In the three-variable model K = 3.
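The variance formulae can be sketched as follows (data hypothetical; σ̂u² is estimated with n − K degrees of freedom):

```python
import numpy as np

# Hypothetical demand data (same as the earlier sketches)
Y  = np.array([100.0, 95.0, 90.0, 88.0, 84.0, 80.0])
X1 = np.array([  2.0,  2.5,  3.0,  3.5,  4.0,  4.5])
X2 = np.array([ 10.0, 11.0, 11.5, 12.5, 13.0, 14.0])
n, K = len(Y), 3                      # K = number of estimated parameters

A = np.column_stack([np.ones_like(Y), X1, X2])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
e = Y - A @ b                         # residuals
sigma2 = np.sum(e**2) / (n - K)       # unbiased estimate of Var(u)

# Deviation-form variance formulae for the slope estimates
x1 = X1 - X1.mean(); x2 = X2 - X2.mean()
den = np.sum(x1**2) * np.sum(x2**2) - np.sum(x1 * x2)**2
var_b1 = sigma2 * np.sum(x2**2) / den
var_b2 = sigma2 * np.sum(x1**2) / den
print(np.sqrt(var_b1), np.sqrt(var_b2))   # standard errors s(b1), s(b2)
```

The same variances are the corresponding diagonal elements of σ̂u²(X'X)⁻¹, which is how they generalise beyond two regressors.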

Test of Significance of The Parameter Estimates

One test of significance of the parameter estimates is the standard error
test, which is equivalent to the student's t test. Researchers test the
null hypothesis H0: βi = 0 for each parameter, against the alternative
hypothesis H1: βi ≠ 0. This type of hypothesis implies a two-tail test.
1. The standard error test
Compute the standard error s(bi) of each estimate.
(a) If s(bi) > bi/2, we accept the null hypothesis, that is, we accept
that the estimate bi is not statistically significant at the 5% level of
significance.
(b) If s(bi) < bi/2, we reject the null hypothesis, that is, we accept
that the parameter estimate is statistically significant at the 5% level
of significance.

2. The student's t test of the null hypothesis
We compute the t ratio for each bi:

t* = bi / s(bi)

and compare it with the value of the t-table with n − K degrees of freedom.
(a) If t* falls in the acceptance region, that is, if −t0.025 < t* < t0.025
(with n − K degrees of freedom), we accept the null hypothesis.
(b) If t* falls in the critical region, we reject the null hypothesis and
accept the alternative one.
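Both steps of the t test can be illustrated in a short sketch. The data are hypothetical, and the critical value 3.182 is the tabulated two-tail 5% point of the t distribution for n − K = 3 degrees of freedom:

```python
import numpy as np

# Hypothetical demand data (same as the earlier sketches)
Y  = np.array([100.0, 95.0, 90.0, 88.0, 84.0, 80.0])
X1 = np.array([  2.0,  2.5,  3.0,  3.5,  4.0,  4.5])
X2 = np.array([ 10.0, 11.0, 11.5, 12.5, 13.0, 14.0])
n, K = len(Y), 3

A = np.column_stack([np.ones_like(Y), X1, X2])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
e = Y - A @ b
sigma2 = np.sum(e**2) / (n - K)
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(A.T @ A)))  # s(b_i)

t_star = b / se                  # t* = b_i / s(b_i) under H0: beta_i = 0
t_crit = 3.182                   # two-tail 5% critical value, 3 d.f.
for bi, ti in zip(b, t_star):
    verdict = "reject H0" if abs(ti) > t_crit else "accept H0"
    print(f"b = {bi:8.3f}  t* = {ti:7.2f}  -> {verdict}")
```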

The general linear regression model

Y = β0 + β1X1 + β2X2 + … + βkXk + u

Derivation of The Normal Equations

Generalisation of The Formula for R²

The Adjusted Coefficient of Determination

R̄² = 1 − (1 − R²)(n − 1)/(n − K)

or, equivalently,

R̄² = 1 − (Σe²/(n − K)) / (Σy²/(n − 1))
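The adjusted coefficient of determination can be written as a one-line helper (the function name is ours, for illustration only):

```python
def adjusted_r2(r2, n, k):
    """R-bar^2 = 1 - (1 - R^2) * (n - 1) / (n - K)."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k)

print(adjusted_r2(0.9, 10, 3))   # 1 - 0.1*9/7 = 0.8714...
```

Unlike R², the adjusted version penalises adding regressors: with the same R², a larger K gives a smaller R̄².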

Generalisation of The Formulae of The Variance of The Parameter Estimates

In the general model, Var(bi) = σ̂u²·aii, where aii is the i-th diagonal
element of (X'X)⁻¹ and σ̂u² = Σe²/(n − K).

Partial Correlation Coefficient

rYX1·X2 = (rYX1 − rYX2·rX1X2) / √((1 − rYX2²)(1 − rX1X2²))

rYX2·X1 = (rYX2 − rYX1·rX1X2) / √((1 − rYX1²)(1 − rX1X2²))
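The first-order partial correlation coefficient, the correlation of Y and X1 with the influence of X2 held constant, can be sketched directly from the three simple correlation coefficients (data hypothetical):

```python
import numpy as np

def partial_corr(ry1, ry2, r12):
    # r_{Y X1 . X2}: correlation of Y and X1 with X2 held constant
    return (ry1 - ry2 * r12) / np.sqrt((1 - ry2**2) * (1 - r12**2))

# Hypothetical demand data
Y  = np.array([100.0, 95.0, 90.0, 88.0, 84.0, 80.0])
X1 = np.array([  2.0,  2.5,  3.0,  3.5,  4.0,  4.5])
X2 = np.array([ 10.0, 11.0, 11.5, 12.5, 13.0, 14.0])

# Simple (zero-order) correlation coefficients
ry1 = np.corrcoef(Y, X1)[0, 1]
ry2 = np.corrcoef(Y, X2)[0, 1]
r12 = np.corrcoef(X1, X2)[0, 1]

print(partial_corr(ry1, ry2, r12))
```

Equivalently, this is the simple correlation between the residuals of Y on X2 and the residuals of X1 on X2, which is a useful consistency check.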

Extension of The Linear Regression Model to Nonlinear Relationships

Nonlinear relationships may be estimated by fitting nonlinear functions
directly to the original data. The method usually involves highly complex
calculations if the relationship is nonlinear in the parameters.
e.g.: Y = β0 + β1X + β2X², or the constant-elasticity form
Y = β0·X1^β1·X2^β2

Transformation of Parabolas and Other Polynomials

Set X² = Z, X³ = W, etc., so that Y = β0 + β1X + β2X² + β3X³ becomes
Y = β0 + β1X + β2Z + β3W, which is linear in the transformed variables and
can be estimated by ordinary least squares.
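The substitution can be checked numerically: generate data from an exact parabola, rename X² as Z, and fit the now-linear model by least squares (the coefficients 2, 3 and −0.5 are made up for illustration):

```python
import numpy as np

# Hypothetical parabolic data: Y = 2 + 3X - 0.5X^2, exactly
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
Y = 2 + 3 * X - 0.5 * X**2

Z = X**2                              # the substitution: set X^2 = Z
A = np.column_stack([np.ones_like(X), X, Z])
b0, b1, b2 = np.linalg.lstsq(A, Y, rcond=None)[0]
print(b0, b1, b2)                     # ≈ 2, 3, -0.5
```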

Transformation of Nonlinear Functions Involving Constant Elasticities

The appropriate transformation for the estimation of the
constant-elasticity form is to work with the logarithms of the variables:
taking logs of Y = β0·X1^β1·X2^β2·e^u gives

ln Y = ln β0 + β1 ln X1 + β2 ln X2 + u

which is linear in the parameters.