Basic Econometrics
Course Leader: Prof. Dr.Sc. VuThieu
May 2004

Introduction: What is Econometrics?
Definition 6: A conjunction of economic theory and actual measurements, using the theory and technique of statistical inference as a bridge pier (by T. Haavelmo, 1944).
[Figure: Econometrics at the intersection of Economic Theory, Mathematical Economics, Economic Statistics, and Mathematical Statistics]
Introduction: Why a separate discipline?
Economic theory makes statements that are mostly qualitative in nature, while econometrics gives empirical content to most economic theory. Mathematical economics is to …
Introduction: Methodology of Econometrics
(1) Statement of theory or hypothesis:
Keynes stated that consumption increases as income increases, but not by as much as the increase in income. That is, the marginal propensity to consume (MPC) for a unit change in income is greater than zero but less than one.
(2) Specification of the mathematical model of the theory
(3) Specification of the econometric model of the theory:
Y = β1 + β2X + u;   0 < β2 < 1
where Y = consumption expenditure, X = income, and u is the disturbance term.
(4) Obtaining data (see Table 1.1, page 6):
Y = personal consumption expenditure
X = gross domestic product
both in billions of US dollars
Year      Y         X
1980    2447.1    3776.3
1981    2476.9    3843.1
1982    2503.7    3760.3
1983    2619.4    3906.6
1984    2746.1    4148.5
1985    2865.8    4279.8
1986    2969.1    4404.5
1987    3052.2    4539.9
1988    3162.4    4718.6
1989    3223.3    4838.0
1990    3260.4    4877.5
1991    3240.8    4821.0
(5) Estimation of the econometric model
(6) Hypothesis testing
(7) Forecasting or prediction
(8) Using the model for control or policy purposes:
Y = 4000 = -231.8 + 0.7194X  =>  X ≈ 5882
With MPC = 0.72, an income of $5882 billion will produce an expenditure of about $4000 billion. By fiscal and monetary policy, the government can manipulate the control variable X to obtain the desired level of the target variable Y.
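As a quick check of the policy arithmetic above, the target income can be recovered by inverting the estimated consumption function (a minimal sketch; the coefficients -231.8 and 0.7194 are the estimates quoted in the slide):

```python
# Invert the estimated consumption function Y = -231.8 + 0.7194*X
# to find the income X that delivers a target expenditure Y.
def required_income(target_y, intercept=-231.8, mpc=0.7194):
    return (target_y - intercept) / mpc

x = required_income(4000.0)
print(round(x))  # about 5882 (billions of US dollars)
```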
Figure 1.4: Anatomy of econometric modelling
1) Economic theory
2) Mathematical model of theory
3) Econometric model of theory
4) Data
5) Estimation of econometric model
6) Hypothesis testing
7) Forecasting or prediction
8) Using the model for control or policy purposes
[Flowchart: Economic theory -> Mathematical model -> Econometric model -> Data collection -> Estimation -> Hypothesis testing -> Forecasting -> Application in control or policy studies]
Chapter 1: THE NATURE OF REGRESSION ANALYSIS
Terminology for Y                 Terminology for X
Explained variable                Explanatory variable(s)
Predictand                        Independent variable(s)
Regressand                        Predictor(s)
Response                          Regressor(s)
Endogenous variable               Stimulus or control variable(s)
                                  Exogenous variable(s)
Chapter 2: TWO-VARIABLE REGRESSION ANALYSIS: Some Basic Ideas
A hypothetical population of 60 families:
Y = weekly family consumption expenditure
X = weekly disposable family income
The 60 families were divided into 10 groups of approximately the same income level
(X = 80, 100, 120, 140, 160, 180, 200, 220, 240, 260).
Weekly family consumption expenditure Y ($) for each weekly income level X ($):

X:      80   100   120   140   160   180   200   220   240   260
        55    65    79    80   102   110   120   135   137   150
        60    70    84    93   107   115   136   137   145   152
        65    74    90    95   110   120   140   140   155   175
        70    80    94   103   116   130   144   152   165   178
        75    85    98   108   118   135   145   157   175   180
              88         113   125   140         160   189   185
                         115                     162         191
Total  325   462   445   707   678   750   685  1043   966  1211
Mean    65    77    89   101   113   125   137   149   161   173
The disturbance term u is a surrogate for all variables that are omitted from the model but that collectively affect Y. There are many reasons for not including such variables in the model:
A random sample from the population:

  Y      X
----------
  55    80
  88   100
  90   120
  80   140
 118   160
 120   180
 145   200
 135   220
 145   240
 175   260
----------
[Figure: scatter of weekly consumption expenditure Y against income, with two candidate sample regression functions SRF1 and SRF2]
Chapter 3: TWO-VARIABLE REGRESSION MODEL: The Problem of Estimation
Minimize Σûi² = Σ(Yi - β̂1 - β̂2Xi)²     (3.1.2)
Setting up the normal equations and solving them for β̂1 and β̂2 gives the least-squares estimators.
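The least-squares recipe above can be sketched directly in Python; the numbers are the ten-family sample (Y, X) listed in this chapter's slides, so the coefficients printed are only illustrative:

```python
import numpy as np

# Sample data: weekly consumption Y and income X for 10 families
Y = np.array([55, 88, 90, 80, 118, 120, 145, 135, 145, 175], dtype=float)
X = np.array([80, 100, 120, 140, 160, 180, 200, 220, 240, 260], dtype=float)

# Least-squares estimators from the normal equations:
# b2 = sum(x*y)/sum(x^2) with x, y in deviation form; b1 = Ybar - b2*Xbar
x = X - X.mean()
y = Y - Y.mean()
b2 = (x * y).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()
print(round(b1, 4), round(b2, 4))
```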
Yi = Ŷi + ûi,  or
Yi - Ȳ = (Ŷi - Ȳ) + ûi,  i.e.
yi = ŷi + ûi       (Note: the mean of Ŷ equals Ȳ)
Σyi² = β̂2² Σxi² + Σûi²
1 = r² + RSS/TSS ;   or   r² = 1 - RSS/TSS
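The decomposition and r² can be checked numerically (a sketch using the same ten-family sample as before):

```python
import numpy as np

Y = np.array([55, 88, 90, 80, 118, 120, 145, 135, 145, 175], dtype=float)
X = np.array([80, 100, 120, 140, 160, 180, 200, 220, 240, 260], dtype=float)

x, y = X - X.mean(), Y - Y.mean()
b2 = (x * y).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()

resid = Y - (b1 + b2 * X)          # u-hat
TSS = (y ** 2).sum()               # total sum of squares
ESS = b2 ** 2 * (x ** 2).sum()     # explained sum of squares
RSS = (resid ** 2).sum()           # residual sum of squares

r2 = 1 - RSS / TSS
print(round(r2, 4))
```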
Chapter 4: THE NORMALITY ASSUMPTION: Classical Normal Linear Regression Model (CNLRM)
Chapter 5: TWO-VARIABLE REGRESSION: Interval Estimation and Hypothesis Testing

Key concepts such as probability, probability distributions, Type I error, Type II error, level of significance, power of a statistical test, and confidence interval.
Pr(β̂2 - δ ≤ β2 ≤ β̂2 + δ) = 1 - α     (5.2.1)
(1 - α) is the confidence coefficient, and (β̂2 - δ, β̂2 + δ) is a random interval.
t = (β̂2 - β2) / se(β̂2)     (5.3.1)
=> Confidence interval for β2:
Pr[-t(α/2) ≤ t ≤ t(α/2)] = 1 - α     (5.3.2)
Pr[β̂2 - t(α/2) se(β̂2) ≤ β2 ≤ β̂2 + t(α/2) se(β̂2)] = 1 - α     (5.3.3)
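A 95% interval for β2 on the ten-family sample can be sketched as follows (df = 8, so the 5% two-tail critical value is t(0.025) = 2.306):

```python
import numpy as np

Y = np.array([55, 88, 90, 80, 118, 120, 145, 135, 145, 175], dtype=float)
X = np.array([80, 100, 120, 140, 160, 180, 200, 220, 240, 260], dtype=float)
n = len(Y)

x, y = X - X.mean(), Y - Y.mean()
b2 = (x * y).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()
resid = Y - (b1 + b2 * X)

sigma2_hat = (resid ** 2).sum() / (n - 2)      # sigma-hat^2 = RSS/(n-2)
se_b2 = np.sqrt(sigma2_hat / (x ** 2).sum())   # se(b2)

t_crit = 2.306                                 # t(0.025, df = 8)
lo, hi = b2 - t_crit * se_b2, b2 + t_crit * se_b2
print(round(lo, 4), round(hi, 4))
```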
Confidence interval for β1:
Pr[β̂1 - t(α/2) se(β̂1) ≤ β1 ≤ β̂1 + t(α/2) se(β̂1)] = 1 - α     (5.3.5)
Type of hypothesis    H0           H1           Reject H0 if
Two-tail              β2 = β2*     β2 ≠ β2*     |t| > t(α/2, df)
Right-tail            β2 ≤ β2*     β2 > β2*     t > t(α, df)
Left-tail             β2 ≥ β2*     β2 < β2*     t < -t(α, df)
χ² = (n - 2) σ̂² / σ²  ~  χ²(n-2)     (5.4.1)
H0: σ² = σ0²  vs  H1: σ² > σ0²
H0: σ² = σ0²  vs  H1: σ² < σ0²
H0: σ² = σ0²  vs  H1: σ² ≠ σ0²
F = β̂2² Σxi² / σ̂²     (5.9.1)
If the ui are normally distributed and H0: β2 = 0 holds, then F follows the F distribution with 1 and n-2 degrees of freedom.
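In the two-variable model this F statistic is exactly the square of the t ratio for β̂2, which is easy to verify numerically (same illustrative sample as before):

```python
import numpy as np

Y = np.array([55, 88, 90, 80, 118, 120, 145, 135, 145, 175], dtype=float)
X = np.array([80, 100, 120, 140, 160, 180, 200, 220, 240, 260], dtype=float)
n = len(Y)

x, y = X - X.mean(), Y - Y.mean()
b2 = (x * y).sum() / (x ** 2).sum()
b1 = Y.mean() - b2 * X.mean()
resid = Y - (b1 + b2 * X)
sigma2_hat = (resid ** 2).sum() / (n - 2)

F = b2 ** 2 * (x ** 2).sum() / sigma2_hat      # eq. (5.9.1)
t = b2 / np.sqrt(sigma2_hat / (x ** 2).sum())  # t ratio for b2
print(round(F, 2), round(t ** 2, 2))
```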
ANOVA table for the two-variable regression:

Source of variation        SS           Df     Mean sum of squares (MSS)
ESS (due to regression)    β̂2² Σxi²     1      β̂2² Σxi²
RSS (due to residuals)     Σûi²         n-2    Σûi²/(n-2) = σ̂²
TSS                        Σyi²         n-1
An illustration for regression (5.1.1):
r² = 0.9621;  df = 8;  F(1,8) = 202.87
(5.12.1)
Decision rule: If calculated χ²(N-1-k) > critical χ²(N-1-k), then H0 can be rejected.
5-12. (Continued)
Under the null hypothesis H0 that the residuals are normally distributed, Jarque and Bera showed that in large samples (asymptotically) the JB statistic given in (5.12.12) follows the chi-square distribution with 2 df. If the p-value of the computed chi-square statistic in an application is sufficiently low, one can reject the hypothesis that the residuals are normally distributed. But if the p-value is reasonably high, one does not reject the normality assumption.
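The JB statistic itself is built from the sample skewness S and kurtosis K of the residuals, JB = n[S²/6 + (K-3)²/24]; a minimal sketch:

```python
import numpy as np

def jarque_bera(resid):
    """Jarque-Bera normality statistic: JB = n*(S^2/6 + (K-3)^2/24)."""
    resid = np.asarray(resid, dtype=float)
    n = len(resid)
    m = resid - resid.mean()
    s2 = (m ** 2).mean()
    S = (m ** 3).mean() / s2 ** 1.5   # skewness
    K = (m ** 4).mean() / s2 ** 2     # kurtosis
    return n * (S ** 2 / 6 + (K - 3) ** 2 / 24)

# A perfectly symmetric toy sample: zero skewness, so only the
# kurtosis term contributes.
print(round(jarque_bera([-2, -1, 0, 1, 2]), 4))
```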
Chapter 6: EXTENSIONS OF THE TWO-VARIABLE LINEAR REGRESSION MODEL
6-1. Regression through the origin (O) vs. regression with intercept (I):

O:  β̂2 = ΣXiYi / ΣXi²        (6.1.6)       I:  β̂2 = Σxiyi / Σxi²       (3.1.6)
O:  var(β̂2) = σ² / ΣXi²      (6.1.7)       I:  var(β̂2) = σ² / Σxi²     (3.3.1)
O:  σ̂² = Σûi² / (n-1)        (6.1.8)       I:  σ̂² = Σûi² / (n-2)       (3.3.5)
(page 159)
6-2. Scaling and units of measurement
Let Yi = β̂1 + β̂2Xi + ûi     (6.2.1)
Define Y*i = w1·Yi and X*i = w2·Xi; then:
β̂*2 = (w1/w2) β̂2               (6.2.15)
β̂*1 = w1 β̂1                    (6.2.16)
σ̂*² = w1² σ̂²                   (6.2.17)
Var(β̂*1) = w1² Var(β̂1)         (6.2.18)
Var(β̂*2) = (w1/w2)² Var(β̂2)    (6.2.19)
r²x*y* = r²xy                   (6.2.20)
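These scaling rules are easy to confirm numerically; the sketch below rescales Y and X with arbitrary factors and checks (6.2.15), (6.2.16), and (6.2.20):

```python
import numpy as np

def ols(Y, X):
    """Two-variable OLS: returns (intercept, slope, r-squared)."""
    x, y = X - X.mean(), Y - Y.mean()
    b2 = (x * y).sum() / (x ** 2).sum()
    b1 = Y.mean() - b2 * X.mean()
    r2 = b2 ** 2 * (x ** 2).sum() / (y ** 2).sum()
    return b1, b2, r2

Y = np.array([55, 88, 90, 80, 118, 120, 145, 135, 145, 175], dtype=float)
X = np.array([80, 100, 120, 140, 160, 180, 200, 220, 240, 260], dtype=float)

w1, w2 = 1000.0, 0.01            # arbitrary scale factors
b1, b2, r2 = ols(Y, X)
b1s, b2s, r2s = ols(w1 * Y, w2 * X)

print(np.isclose(b2s, (w1 / w2) * b2), np.isclose(b1s, w1 * b1), np.isclose(r2s, r2))
```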
Functional forms, slopes, and elasticities:

Model                 Equation              Slope (dY/dX)     Elasticity (dY/dX)(X/Y)
Linear                Y = β1 + β2X          β2                β2 (X/Y) *
Log-linear (log-log)  lnY = β1 + β2 lnX     β2 (Y/X)          β2
Log-lin               lnY = β1 + β2X        β2 Y              β2 X *
Lin-log               Y = β1 + β2 lnX       β2 (1/X)          β2 (1/Y) *
Reciprocal            Y = β1 + β2 (1/X)     -β2 (1/X²)        -β2 (1/XY) *

* The elasticity is variable, depending on the value of X or Y or both.
(pages 179-180)
Chapter 7: MULTIPLE REGRESSION ANALYSIS: The Problem of Estimation
Cobb-Douglas production function:
Yi = β1 X2i^β2 X3i^β3 e^ui     (7.10.1)

Polynomial regression model:
Yi = β0 + β1Xi + β2Xi² + ... + βkXi^k + ui     (7.11.3)
Chapter 8: MULTIPLE REGRESSION ANALYSIS: The Problem of Inference
8-3. Hypothesis testing in multiple regression:
- Testing hypotheses about an individual partial regression coefficient
- Testing the overall significance of the estimated multiple regression model, that is, finding out if all the partial slope coefficients are simultaneously equal to zero
- Testing that two or more coefficients are equal to one another
- Testing that the partial regression coefficients satisfy certain restrictions
- Testing the stability of the estimated regression model over time or in different cross-sectional units
- Testing the functional form of regression models
8-4. Hypothesis testing about individual partial regression coefficients
With the assumption that ui ~ N(0, σ²), we can use the t-test to test a hypothesis about any individual partial regression coefficient:
H0: β2 = 0
H1: β2 ≠ 0
If the computed |t| value > the critical t value at the chosen level of significance, we may reject the null hypothesis; otherwise, we may not reject it.
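A sketch of this t test for a three-variable model, using matrix OLS on small made-up data (the data and the 5% critical value t(0.025, df = 5) = 2.571 are illustrative assumptions, not from the slides):

```python
import numpy as np

# Made-up data for Y on a constant, X2, X3 (n = 8)
X2 = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
X3 = np.array([5, 3, 8, 1, 9, 2, 7, 4], dtype=float)
e  = np.array([0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2])
Y  = 1.0 + 2.0 * X2 + 0.5 * X3 + e

X = np.column_stack([np.ones_like(X2), X2, X3])
n, k = X.shape

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ Y                      # OLS estimates
resid = Y - X @ b
sigma2_hat = (resid ** 2).sum() / (n - k)  # RSS/(n-k)
se = np.sqrt(sigma2_hat * np.diag(XtX_inv))
t = b / se                                 # t ratios, H0: beta_j = 0

t_crit = 2.571                             # t(0.025, df = n-k = 5)
print(np.round(t, 2))
```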
8-5. Testing the overall significance of a multiple regression: The F-Test
Alternatively, if the p-value of F obtained from (8.5.7) is sufficiently low, one can reject H0.
An important relationship between R² and F:
F = [ESS/(k-1)] / [RSS/(n-k)],  or
F = [R²/(k-1)] / [(1-R²)/(n-k)]     (8.5.1)
(see the proof on page 249)
8-5. Testing the overall significance of a multiple regression in terms of R²
For Yi = β1 + β2X2i + β3X3i + ... + βkXki + ui,
to test H0: β2 = β3 = ... = βk = 0 (all slope coefficients are simultaneously zero)
versus H1: not all slope coefficients are simultaneously zero, compute
F = [R²/(k-1)] / [(1-R²)/(n-k)]     (8.5.13)
(k = total number of parameters to be estimated, including the intercept)
If F > F critical = Fα(k-1, n-k), reject H0.
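A numerical check that the R² form (8.5.13) agrees with the sum-of-squares form of F (same made-up three-variable data as in the t-test sketch):

```python
import numpy as np

X2 = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
X3 = np.array([5, 3, 8, 1, 9, 2, 7, 4], dtype=float)
e  = np.array([0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2])
Y  = 1.0 + 2.0 * X2 + 0.5 * X3 + e

X = np.column_stack([np.ones_like(X2), X2, X3])
n, k = X.shape

b = np.linalg.lstsq(X, Y, rcond=None)[0]
resid = Y - X @ b
RSS = (resid ** 2).sum()
TSS = ((Y - Y.mean()) ** 2).sum()
ESS = TSS - RSS
R2 = 1 - RSS / TSS

F_ss = (ESS / (k - 1)) / (RSS / (n - k))        # sum-of-squares form
F_r2 = (R2 / (k - 1)) / ((1 - R2) / (n - k))    # eq. (8.5.13)
print(np.isclose(F_ss, F_r2))
```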
Alternatively, if the p-value of F obtained from (8.5.13) is sufficiently low, one can reject H0.
The incremental, or marginal, contribution of an explanatory variable:
Let X be the new (additional) term on the right-hand side of a regression. Under the usual assumption of normality of ui and H0: β = 0, it can be shown that the following F ratio follows the F distribution with (Df1, Df2) df:
F com = [(R²new - R²old) / Df1] / [(1 - R²new) / Df2]     (8.5.18)
where Df1 = number of new regressors,
Df2 = n - number of parameters in the new model,
and R²new is the coefficient of determination of the new model.
Decision rule: If F com > Fα(Df1, Df2), one can reject H0 that β = 0 and conclude that the addition of X to the model significantly increases ESS and hence the R² value.
When to add a new variable? When |t| of the coefficient of X exceeds 1 (or F = t² of that variable exceeds 1).
When to add a group of variables? When adding the group of variables to the model gives an F value greater than 1.
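When a single variable is added, F com equals the squared t ratio of the new variable in the enlarged model; a sketch of (8.5.18) using the same made-up data:

```python
import numpy as np

X2 = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
X3 = np.array([5, 3, 8, 1, 9, 2, 7, 4], dtype=float)
e  = np.array([0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2])
Y  = 1.0 + 2.0 * X2 + 0.5 * X3 + e
n = len(Y)

def r_squared(Y, X):
    b = np.linalg.lstsq(X, Y, rcond=None)[0]
    resid = Y - X @ b
    return 1 - (resid ** 2).sum() / ((Y - Y.mean()) ** 2).sum()

X_old = np.column_stack([np.ones(n), X2])        # old model
X_new = np.column_stack([np.ones(n), X2, X3])    # X3 added
r2_old, r2_new = r_squared(Y, X_old), r_squared(Y, X_new)

df1, df2 = 1, n - X_new.shape[1]
F_com = ((r2_new - r2_old) / df1) / ((1 - r2_new) / df2)   # eq. (8.5.18)

# t ratio of X3 in the new model: F_com should equal t3^2
XtX_inv = np.linalg.inv(X_new.T @ X_new)
b = XtX_inv @ X_new.T @ Y
sigma2 = ((Y - X_new @ b) ** 2).sum() / df2
t3 = b[2] / np.sqrt(sigma2 * XtX_inv[2, 2])
print(np.isclose(F_com, t3 ** 2))
```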
8-6. Testing the equality of two regression coefficients
Yi = β1 + β2X2i + β3X3i + β4X4i + ui     (8.6.1)
Test the hypotheses:
H0: β3 = β4, or β3 - β4 = 0     (8.6.2)
H1: β3 ≠ β4, or β3 - β4 ≠ 0
Under the classical assumptions it can be shown that
t = [(β̂3 - β̂4) - (β3 - β4)] / se(β̂3 - β̂4)
follows the t distribution with (n-4) df, because (8.6.1) is a four-variable model, or, more generally, with (n-k) df, where k is the total number of parameters estimated.
t = (β̂3 - β̂4) / sqrt[var(β̂3) + var(β̂4) - 2cov(β̂3, β̂4)]     (8.6.5)
Steps for testing:
1. Estimate β̂3 and β̂4
2. Compute se(β̂3 - β̂4) through (8.6.4)
3. Obtain the t ratio from (8.6.5) with H0: β3 = β4
4. If |t| computed > t critical at the designated level of significance for the given df, reject H0; otherwise do not reject it. Alternatively, if the p-value of the t statistic from (8.6.5) is reasonably low, one can reject H0.
Example 8.2: The cubic cost function revisited
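The steps can be sketched with matrix OLS on made-up data; as a consistency check, the same t statistic is recovered from a reparameterized regression on (X3 + X4) and (X3 - X4), whose last coefficient is (β3 - β4)/2:

```python
import numpy as np

X2 = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
X3 = np.array([2, 5, 3, 8, 7, 4, 9, 6, 10, 1], dtype=float)
X4 = np.array([7, 1, 6, 2, 9, 3, 10, 5, 4, 8], dtype=float)
e  = np.array([0.2, -0.1, 0.1, -0.2, 0.0, 0.2, -0.1, 0.1, -0.2, 0.0])
Y  = 1.0 + 1.5 * X2 + 2.0 * X3 + 2.0 * X4 + e   # beta3 = beta4 here

def fit(Y, X):
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ Y
    df = len(Y) - X.shape[1]
    sigma2 = ((Y - X @ b) ** 2).sum() / df
    return b, sigma2 * XtX_inv   # estimates, covariance matrix

# Direct test of H0: beta3 = beta4 via eq. (8.6.5)
X = np.column_stack([np.ones_like(X2), X2, X3, X4])
b, cov = fit(Y, X)
se_diff = np.sqrt(cov[2, 2] + cov[3, 3] - 2 * cov[2, 3])
t_direct = (b[2] - b[3]) / se_diff

# Reparameterized check: coefficient on (X3 - X4) is (beta3 - beta4)/2
Z = np.column_stack([np.ones_like(X2), X2, X3 + X4, X3 - X4])
g, gcov = fit(Y, Z)
t_repar = g[3] / np.sqrt(gcov[3, 3])
print(np.isclose(t_direct, t_repar))
```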
8-7. Restricted least squares: Testing linear equality restrictions
Yi = β1 X2i^β2 X3i^β3 e^ui     (7.10.1) and (8.7.1)
Y = output
X2 = labor input
X3 = capital input
In log form:
lnYi = β0 + β2 lnX2i + β3 lnX3i + ui     (8.7.2)
with constant returns to scale: β2 + β3 = 1     (8.7.3)
How to test (8.7.3)?
The t-test approach (unrestricted): test of the …
Let Σû²UR = RSSUR of the unrestricted regression (8.7.2), and
Σû²R = RSSR of the restricted regression (8.7.7),
m = number of linear restrictions,
k = number of parameters in the unrestricted regression,
n = number of observations.
R²UR and R²R are the R² values obtained from the unrestricted and restricted regressions, respectively. Then
F = [(RSSR - RSSUR) / m] / [RSSUR / (n-k)]
  = [(R²UR - R²R) / m] / [(1 - R²UR) / (n-k)]     (8.7.10)
follows the F distribution with (m, n-k) df.
Decision rule: If F > Fα(m, n-k), reject H0: β2 + β3 = 1.
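The restriction β2 + β3 = 1 can be imposed by substituting β3 = 1 - β2 into (8.7.2), i.e. regressing (lnY - lnX3) on (lnX2 - lnX3); a sketch with made-up Cobb-Douglas-style data:

```python
import numpy as np

# Made-up log data for output (lnY), labor (lnX2), capital (lnX3)
lnX2 = np.array([2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.2, 3.4, 3.6, 3.8])
lnX3 = np.array([1.5, 1.9, 1.6, 2.3, 2.0, 2.7, 2.1, 2.9, 2.5, 3.1])
e    = np.array([0.02, -0.01, 0.03, -0.02, 0.01, -0.03, 0.02, -0.01, 0.01, -0.02])
lnY  = 0.5 + 0.6 * lnX2 + 0.4 * lnX3 + e      # true betas sum to 1

def rss(Y, X):
    b = np.linalg.lstsq(X, Y, rcond=None)[0]
    return ((Y - X @ b) ** 2).sum()

n = len(lnY)
# Unrestricted: lnY on const, lnX2, lnX3
RSS_UR = rss(lnY, np.column_stack([np.ones(n), lnX2, lnX3]))
# Restricted (beta2 + beta3 = 1): (lnY - lnX3) on const, (lnX2 - lnX3)
RSS_R = rss(lnY - lnX3, np.column_stack([np.ones(n), lnX2 - lnX3]))

m, k = 1, 3
F = ((RSS_R - RSS_UR) / m) / (RSS_UR / (n - k))   # eq. (8.7.10)
print(round(F, 4))
```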
and Σû²UR ≤ Σû²R     (8.7.12)
Example 8.3: The Cobb-Douglas production function for the Taiwanese agricultural sector, 1958-1972 (pages 259-260). Data in Table 7.3 (page 216).
General F testing (page 260).
Example 8.4: The demand for chicken in the US, 1960-1982. Data in exercise 7.23 (page 228).
8-8. Comparing two regressions: Testing for structural stability of regression models
Table 8.8: Personal savings and income data, UK, 1946-1963 (millions of pounds)
Savings function:
Reconstruction period:      Yt = λ1 + λ2Xt + u1t   (t = 1, 2, ..., n1)     (8.8.1)
Post-reconstruction period: Yt = γ1 + γ2Xt + u2t   (t = 1, 2, ..., n2)     (8.8.2)
where Y is personal savings, X is personal income, the u's are the disturbance terms in the two equations, and n1 and n2 are the numbers of observations in the two periods.
The structural change may mean that the two intercepts are different, or the two slopes are different, or both, or any other suitable combination of the parameters. If there is no structural change, we can pool all n1 + n2 observations and estimate just one savings function:
Yt = β1 + β2Xt + ut   (t = 1, 2, ..., n1 + n2)     (8.8.3)
How do we find out whether there is a structural change in the savings-income relationship between the two periods? A popular test is the Chow test, which is simply the F test discussed earlier.
The assumptions underlying the Chow test:
- u1t and u2t ~ N(0, σ²): the two error terms are normally distributed with the same variance;
- u1t and u2t are independently distributed.
Step 1: Estimate (8.8.3) and get its RSS, say S1, with df = (n1 + n2 - k), where k is the number of parameters estimated.
Step 2: Estimate (8.8.1) and (8.8.2) individually and get their RSS, say S2 and S3.
Step 3: Compute S4 = S2 + S3 and S5 = S1 - S4.
Step 4: Given the assumptions of the Chow test, it can be shown that
F = [S5 / k] / [S4 / (n1 + n2 - 2k)]     (8.8.4)
follows the F distribution with df = (k, n1 + n2 - 2k).
Decision rule: If F computed by (8.8.4) > F critical at the chosen level of significance α, reject the hypothesis that regressions (8.8.1) and (8.8.2) are the same, i.e. reject the hypothesis of structural stability.
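Steps 1-4 can be sketched as follows (made-up savings-income data for two sub-periods; the split point and values are illustrative, not the UK figures of Table 8.8):

```python
import numpy as np

def rss(Y, X):
    """RSS from OLS of Y on a constant and X."""
    Z = np.column_stack([np.ones_like(X), X])
    b = np.linalg.lstsq(Z, Y, rcond=None)[0]
    return ((Y - Z @ b) ** 2).sum()

# Made-up data: period 1 has a much flatter savings function than period 2
X1 = np.array([100, 110, 120, 130, 140, 150, 160, 170], dtype=float)
Y1 = 0.02 * X1 + np.array([0.3, -0.2, 0.1, -0.3, 0.2, -0.1, 0.3, -0.3])
X2 = np.array([180, 190, 200, 210, 220, 230, 240, 250], dtype=float)
Y2 = -10 + 0.09 * X2 + np.array([0.2, -0.3, 0.1, -0.1, 0.3, -0.2, 0.1, -0.1])

n1, n2, k = len(X1), len(X2), 2
S1 = rss(np.concatenate([Y1, Y2]), np.concatenate([X1, X2]))  # pooled RSS
S4 = rss(Y1, X1) + rss(Y2, X2)                                # S2 + S3
S5 = S1 - S4

F = (S5 / k) / (S4 / (n1 + n2 - 2 * k))                       # eq. (8.8.4)
print(round(F, 1))
```

Because the pooled regression is a restricted version of the two separate regressions, S5 is never negative; a large F signals a structural break.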
8-9. Testing the functional form of regression:
Choosing between linear and log-linear regression models: the MWD test (MacKinnon, White and Davidson).
H0: Linear model: Y is a linear function of the regressors, the X's.
H1: Log-linear model: lnY is a linear function of the logs of the regressors, the lnX's.
Step 1: Estimate the linear model and obtain the estimated Y values; call them Yf (i.e., Y^). Take ln(Yf).
Step 2: Estimate the log-linear model and obtain the estimated lnY values; call them lnf (i.e., ln^Y).
Step 3: Obtain Z1 = (lnYf - lnf).
Step 4: Regress Y on the X's and Z1. Reject H0 if the coefficient of Z1 is statistically significant by the usual t test.
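The four MWD steps can be sketched as follows (made-up positive data; only the mechanics are shown, not a verdict for either functional form):

```python
import numpy as np

def ols_fit(Y, X):
    b = np.linalg.lstsq(X, Y, rcond=None)[0]
    return b, X @ b

# Made-up positive data
X1 = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float)
Y  = np.array([15, 24, 30, 41, 48, 55, 65, 70, 82, 88], dtype=float)
n = len(Y)
const = np.ones(n)

# Step 1: linear model, fitted values Yf, then ln(Yf)
_, Yf = ols_fit(Y, np.column_stack([const, X1]))
lnYf = np.log(Yf)

# Step 2: log-linear model, fitted values lnf
_, lnf = ols_fit(np.log(Y), np.column_stack([const, np.log(X1)]))

# Step 3: Z1 = lnYf - lnf
Z1 = lnYf - lnf

# Step 4: regress Y on the X's and Z1; look at the t ratio of Z1
X = np.column_stack([const, X1, Z1])
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ Y
sigma2 = ((Y - X @ b) ** 2).sum() / (n - X.shape[1])
t_Z1 = b[2] / np.sqrt(sigma2 * XtX_inv[2, 2])
print(round(t_Z1, 3))
```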
Example 8.5: The demand for roses (pages 266-267). Data in exercise 7.20 (page 225).
8-10. Prediction with multiple regression:
Follow section 5-10 and the illustration on pages 267-268, using the data set in Table 8.1 (page 241).
8-11. The troika of hypothesis tests: the likelihood ratio (LR), Wald (W), and Lagrange multiplier (LM) tests.