
MULTIPLE LINEAR REGRESSION
Presented by:
Juanito S. Chan

Introduction
A multiple linear regression model is a linear model in which more than one independent variable is needed. For the case of k independent variables x₁, x₂, …, xₖ, the mean of Y|x₁, x₂, …, xₖ is given by the multiple linear regression model

μ_Y|x₁,x₂,…,xₖ = β₀ + β₁x₁ + ⋯ + βₖxₖ

and the estimated response is obtained from the sample regression equation

ŷ = b₀ + b₁x₁ + ⋯ + bₖxₖ

where each regression coefficient βᵢ is estimated by bᵢ from the sample data using the method of least squares.

Polynomial Regression Model

The polynomial regression equation is given by:

ŷ = b₀ + b₁x + b₂x² + ⋯ + bᵣxʳ

A variation of the polynomial model is the nonlinear exponential model, which is represented by the following equation:

y = abˣ
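The exponential model can be fitted with ordinary linear least squares by taking logarithms, since ln y = ln a + x·ln b is linear in x. A minimal sketch in Python with NumPy (the data here are made up so the parameters can be recovered exactly):

```python
import numpy as np

# Hypothetical data generated from y = 2 * 1.5**x (illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * 1.5 ** x

# Linearize: ln y = ln a + x ln b, then fit a straight line.
# np.polyfit returns coefficients highest power first: [slope, intercept]
slope, intercept = np.polyfit(x, np.log(y), 1)
a, b = np.exp(intercept), np.exp(slope)
```

Because the transformed model is linear, the normal equations from the next section apply unchanged to (x, ln y).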

Estimating the Coefficients

Three normal equations are used to estimate the coefficients b₀, b₁, and b₂ (all sums run from i = 1 to n):

nb₀ + b₁Σx₁ᵢ + b₂Σx₂ᵢ = Σyᵢ
b₀Σx₁ᵢ + b₁Σx₁ᵢ² + b₂Σx₁ᵢx₂ᵢ = Σx₁ᵢyᵢ
b₀Σx₂ᵢ + b₁Σx₁ᵢx₂ᵢ + b₂Σx₂ᵢ² = Σx₂ᵢyᵢ

Using the three normal equations, we can solve for b₁ and b₂ by any appropriate method for handling three linear equations in three unknowns, such as Cramer's rule, and then b₀ is obtained from the first of the three normal equations by observing that

b₀ = ȳ − b₁x̄₁ − b₂x̄₂
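The three normal equations form a 3×3 linear system. A sketch of building and solving that system in Python with NumPy (np.linalg.solve takes the place of Cramer's rule; the data are made up so the coefficients have a known exact answer):

```python
import numpy as np

def fit_two_regressors(x1, x2, y):
    """Solve the three normal equations for b0, b1, b2."""
    n = len(y)
    A = np.array([
        [n,         x1.sum(),       x2.sum()],
        [x1.sum(),  (x1**2).sum(),  (x1*x2).sum()],
        [x2.sum(),  (x1*x2).sum(),  (x2**2).sum()],
    ])
    g = np.array([y.sum(), (x1*y).sum(), (x2*y).sum()])  # right-hand sides
    return np.linalg.solve(A, g)                         # [b0, b1, b2]

# Check on data with a known exact relationship y = 1 + 2*x1 + 3*x2
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0])
y = 1 + 2*x1 + 3*x2
b0, b1, b2 = fit_two_regressors(x1, x2, y)
```

The solution also satisfies b₀ = ȳ − b₁x̄₁ − b₂x̄₂, which is the first normal equation divided by n.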

Sample Problem
An experiment was conducted to determine if the weight of an animal can be predicted after a given period of time on the basis of the initial weight of the animal and the amount of feed eaten. The following data, measured in kilograms, were recorded. Fit a multiple linear regression equation of the form

μ_Y|x₁,x₂ = β₀ + β₁x₁ + β₂x₂

and then predict the final weight of an animal having an initial weight of 35 kg that is fed 250 kg of feed.

Final weight, y   Initial weight, x₁   Feed eaten, x₂
       95                42                 272
       77                33                 226
       80                33                 259
      100                45                 292
       97                39                 311
       70                36                 183
       50                32                 173
       80                41                 236
       92                40                 230
       84                38                 235
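As a numerical check on the hand computation, the same normal equations can be solved for this data set with NumPy's least-squares routine (a sketch; lstsq solves the system X′Xb = X′y, which is equivalent to the three normal equations above):

```python
import numpy as np

y  = np.array([95, 77, 80, 100, 97, 70, 50, 80, 92, 84], dtype=float)
x1 = np.array([42, 33, 33, 45, 39, 36, 32, 41, 40, 38], dtype=float)
x2 = np.array([272, 226, 259, 292, 311, 183, 173, 236, 230, 235], dtype=float)

# Design matrix with an intercept column of ones
X = np.column_stack([np.ones_like(y), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)   # [b0, b1, b2]

# Predicted final weight at initial weight 35 kg, feed eaten 250 kg
y_hat = b @ np.array([1.0, 35.0, 250.0])
```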

Fitting the Quadratic Equation

The quadratic equation is given in the form:

μ_Y|x = β₀ + β₁x + β₂x²

b₀, b₁, and b₂ can be solved for using the following normal equations (all sums run from i = 1 to n):

nb₀ + b₁Σxᵢ + b₂Σxᵢ² = Σyᵢ
b₀Σxᵢ + b₁Σxᵢ² + b₂Σxᵢ³ = Σxᵢyᵢ
b₀Σxᵢ² + b₁Σxᵢ³ + b₂Σxᵢ⁴ = Σxᵢ²yᵢ
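These three normal equations are exactly what NumPy's polyfit solves for a degree-2 fit. A sketch with made-up data chosen to lie on a known parabola:

```python
import numpy as np

# Hypothetical data lying exactly on y = 1 + 2x + 3x^2 (illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2*x + 3*x**2

# polyfit returns coefficients highest power first: [b2, b1, b0]
b2, b1, b0 = np.polyfit(x, y, 2)
```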

Sample Problem
Given the data

x: 9.1  7.3  3.2  4.6  4.8  2.9  5.7  7.1  8.8  10.2

fit a regression curve of the form

μ_Y|x = β₀ + β₁x + β₂x²

and then estimate μ_Y|2.

General Multiple Linear Regression Normal Equations

For k independent variables there are k + 1 normal equations (all sums run from i = 1 to n):

nb₀ + b₁Σx₁ᵢ + b₂Σx₂ᵢ + ⋯ + bₖΣxₖᵢ = Σyᵢ
b₀Σx₁ᵢ + b₁Σx₁ᵢ² + b₂Σx₁ᵢx₂ᵢ + ⋯ + bₖΣx₁ᵢxₖᵢ = Σx₁ᵢyᵢ
⋮
b₀Σxₖᵢ + b₁Σxₖᵢx₁ᵢ + b₂Σxₖᵢx₂ᵢ + ⋯ + bₖΣxₖᵢ² = Σxₖᵢyᵢ
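In matrix form, these k + 1 normal equations read X′Xb = X′y, where X is the n × (k + 1) design matrix with a leading column of ones. A sketch of a general fitting routine built directly on that form (data are randomly generated with a known exact relationship, for illustration only):

```python
import numpy as np

def normal_equations_fit(x_vars, y):
    """Fit b = (b0, ..., bk) from the general normal equations X'Xb = X'y.

    x_vars: list of 1-D arrays, one per independent variable."""
    X = np.column_stack([np.ones_like(y)] + list(x_vars))
    A = X.T @ X          # coefficient matrix of the normal equations
    g = X.T @ y          # right-hand sides g_j
    return np.linalg.solve(A, g)

# Three regressors with a known exact relationship (illustration only)
rng = np.random.default_rng(0)
x1, x2, x3 = rng.random(8), rng.random(8), rng.random(8)
y = 2 + 1*x1 - 3*x2 + 0.5*x3
b = normal_equations_fit([x1, x2, x3], y)
```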

General Polynomial Regression Normal Equations

For a polynomial of degree r there are r + 1 normal equations (all sums run from i = 1 to n):

nb₀ + b₁Σxᵢ + b₂Σxᵢ² + ⋯ + bᵣΣxᵢʳ = Σyᵢ
b₀Σxᵢ + b₁Σxᵢ² + b₂Σxᵢ³ + ⋯ + bᵣΣxᵢʳ⁺¹ = Σxᵢyᵢ
b₀Σxᵢ² + b₁Σxᵢ³ + b₂Σxᵢ⁴ + ⋯ + bᵣΣxᵢʳ⁺² = Σxᵢ²yᵢ
⋮
b₀Σxᵢʳ + b₁Σxᵢʳ⁺¹ + b₂Σxᵢʳ⁺² + ⋯ + bᵣΣxᵢ²ʳ = Σxᵢʳyᵢ

Confidence Interval for the Mean Response μ_Y|x and Prediction Interval for y₀

ŷ₀ − t_{α/2} s √(x₀′A⁻¹x₀) < μ_Y|x₁₀,x₂₀,…,xₖ₀ < ŷ₀ + t_{α/2} s √(x₀′A⁻¹x₀)

ŷ₀ − t_{α/2} s √(1 + x₀′A⁻¹x₀) < y₀ < ŷ₀ + t_{α/2} s √(1 + x₀′A⁻¹x₀)

where t_{α/2} is a value of the t distribution with n − k − 1 degrees of freedom, A = X′X is the matrix of the normal equations, and x₀ = (1, x₁₀, x₂₀, …, xₖ₀)′ is the point at which the response is estimated.
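Assuming A = X′X as above, the confidence interval for the mean response can be computed numerically. A sketch using SciPy's t distribution (the data and the point x₀ are made up for illustration):

```python
import numpy as np
from scipy.stats import t

# Made-up data with two regressors (illustration only)
y  = np.array([9.0, 12.0, 15.5, 18.0, 22.5, 24.0, 28.5, 31.0])
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0, 8.0, 7.0])

X = np.column_stack([np.ones_like(y), x1, x2])
n, p = X.shape                  # p = k + 1
A = X.T @ X                     # A as in the interval formulas
b = np.linalg.solve(A, X.T @ y)

s2 = ((y - X @ b) ** 2).sum() / (n - p)   # s^2 = SSE / (n - k - 1)
x0 = np.array([1.0, 4.5, 4.0])            # hypothetical point (x10, x20)
y0_hat = x0 @ b

# 95% confidence interval for the mean response at x0
half = t.ppf(0.975, n - p) * np.sqrt(s2 * (x0 @ np.linalg.solve(A, x0)))
lo, hi = y0_hat - half, y0_hat + half
```

For the prediction interval on a single future observation y₀, the factor under the square root becomes 1 + x₀′A⁻¹x₀, as in the second formula above.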

Properties of the Least Squares Estimators

SSE = SST − SSR

where

SST = Syy = Σᵢ₌₁ⁿ (yᵢ − ȳ)²

SSR = Σⱼ₌₀ᵏ bⱼgⱼ − (Σᵢ₌₁ⁿ yᵢ)²/n

where k = number of independent variables
bⱼ = coefficients of the regression model
gⱼ = right-hand sides of the normal equations

The error mean square estimates the variance:

s² = SSE / (n − k − 1)
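The identity SSE = SST − SSR and the definition of s² can be checked numerically; a sketch with made-up data for two regressors:

```python
import numpy as np

# Made-up data (illustration only)
y  = np.array([4.0, 7.0, 9.0, 12.0, 13.0, 17.0])
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([1.0, 3.0, 2.0, 5.0, 4.0, 6.0])

X = np.column_stack([np.ones_like(y), x1, x2])
n, k = len(y), 2
b = np.linalg.solve(X.T @ X, X.T @ y)
g = X.T @ y                            # right-hand sides of the normal equations

SST = ((y - y.mean()) ** 2).sum()      # Syy
SSR = b @ g - y.sum() ** 2 / n         # sum of b_j g_j minus (sum y)^2 / n
SSE = ((y - X @ b) ** 2).sum()
s2  = SSE / (n - k - 1)
```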
