The autocovariances depend on the units of measurement of y_t, so it is more convenient to work with the autocorrelations, τ_s = γ_s / γ_0, s = 0, 1, 2, ..., which are unit-free; the series τ_1, τ_2, ... plotted against s is the autocorrelation function (acf).

Under the null hypothesis that the true autocorrelations are zero, the sample autocorrelation coefficients are approximately distributed N(0, 1/T), so a 95% non-rejection region is given by ±1.96 × 1/√T. If the sample autocorrelation coefficient, τ̂_s, falls outside this region for any value of s, then we reject the null hypothesis that the true value of the coefficient at lag s is zero.
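As a quick numerical check of this rule, the sketch below (plain Python; the sample size T = 500 and the simulated Gaussian white-noise series are illustrative assumptions) computes the sample acf and the ±1.96 × 1/√T band:

```python
import math
import random

def sample_acf(y, max_lag):
    """Sample autocorrelations tau_s = gamma_s / gamma_0 for s = 0..max_lag."""
    T = len(y)
    ybar = sum(y) / T
    gamma0 = sum((v - ybar) ** 2 for v in y) / T
    taus = [1.0]
    for s in range(1, max_lag + 1):
        gamma_s = sum((y[t] - ybar) * (y[t - s] - ybar) for t in range(s, T)) / T
        taus.append(gamma_s / gamma0)
    return taus

random.seed(0)
T = 500
white_noise = [random.gauss(0, 1) for _ in range(T)]
band = 1.96 / math.sqrt(T)   # 95% non-rejection region is +/- band
taus = sample_acf(white_noise, 5)
```

For white noise, roughly 95% of the sample autocorrelations at non-zero lags should fall inside the band.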
An ACF Example

Question: For an MA(q) process, calculate the mean, variance and autocovariances.

Solution:

E(y_t) = μ

var(y_t) = γ_0 = (1 + θ_1² + θ_2² + ... + θ_q²) σ²

Covariances:

γ_s = (θ_s + θ_{s+1} θ_1 + θ_{s+2} θ_2 + ... + θ_q θ_{q-s}) σ²   for s = 1, 2, ..., q
γ_s = 0   for s > q
© Chris Brooks 2013
Introductory Econometrics for Finance
Example of an MA Problem

1. Consider the following MA(2) process:

y_t = u_t + θ_1 u_{t-1} + θ_2 u_{t-2}

where u_t is a zero-mean white noise process with variance σ².

(i) Calculate the mean and variance of y_t.
(ii) Derive the autocorrelation function for this process (i.e. express the autocorrelations, τ_1, τ_2, ..., as functions of the parameters θ_1 and θ_2).
(iii) If θ_1 = −0.5 and θ_2 = 0.25, sketch the acf of y_t.
Solution

(i) If E(u_t) = 0, then E(u_{t-i}) = 0 ∀ i. So

E(y_t) = E(u_t + θ_1 u_{t-1} + θ_2 u_{t-2}) = E(u_t) + θ_1 E(u_{t-1}) + θ_2 E(u_{t-2}) = 0

var(y_t) = E[y_t − E(y_t)][y_t − E(y_t)]

But E(y_t) = 0, so

var(y_t) = E[(y_t)(y_t)]
         = E[(u_t + θ_1 u_{t-1} + θ_2 u_{t-2})(u_t + θ_1 u_{t-1} + θ_2 u_{t-2})]
         = E[u_t² + θ_1² u_{t-1}² + θ_2² u_{t-2}² + cross-products]
Solution (Cont'd)

But E[cross-products] = 0, since cov(u_t, u_{t-s}) = 0 for s ≠ 0. So

var(y_t) = γ_0 = E[u_t² + θ_1² u_{t-1}² + θ_2² u_{t-2}²]
         = σ² + θ_1² σ² + θ_2² σ²
         = (1 + θ_1² + θ_2²) σ²
Solution (Cont'd)

(ii) The acf of y_t:

γ_1 = E[y_t − E(y_t)][y_{t-1} − E(y_{t-1})]
    = E[y_t][y_{t-1}]
    = E[(u_t + θ_1 u_{t-1} + θ_2 u_{t-2})(u_{t-1} + θ_1 u_{t-2} + θ_2 u_{t-3})]
    = E[θ_1 u_{t-1}² + θ_1 θ_2 u_{t-2}²]
    = θ_1 σ² + θ_1 θ_2 σ²
    = (θ_1 + θ_1 θ_2) σ²
Solution (Cont'd)

For the second autocovariance, only the u_{t-2}² term survives:

γ_2 = E[(u_t + θ_1 u_{t-1} + θ_2 u_{t-2})(u_{t-2} + θ_1 u_{t-3} + θ_2 u_{t-4})]
    = E[θ_2 u_{t-2}²]
    = θ_2 σ²
Solution (Cont'd)

For lags greater than 2, the expressions for y_t and y_{t-s} contain no common u terms, so all the expected cross-products are zero:

γ_s = 0   for s > 2
Solution (Cont'd)

We have the autocovariances; now calculate the autocorrelations:

τ_0 = γ_0 / γ_0 = 1

τ_1 = γ_1 / γ_0 = (θ_1 + θ_1 θ_2) σ² / [(1 + θ_1² + θ_2²) σ²] = (θ_1 + θ_1 θ_2) / (1 + θ_1² + θ_2²)

τ_2 = γ_2 / γ_0 = θ_2 σ² / [(1 + θ_1² + θ_2²) σ²] = θ_2 / (1 + θ_1² + θ_2²)

τ_3 = γ_3 / γ_0 = 0

τ_s = γ_s / γ_0 = 0 ∀ s > 2
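These theoretical MA(2) autocorrelations can be checked by simulation. A minimal sketch, assuming the part (iii) parameter values θ_1 = −0.5 and θ_2 = 0.25:

```python
import random

theta1, theta2 = -0.5, 0.25          # parameter values assumed from part (iii)
denom = 1 + theta1 ** 2 + theta2 ** 2
tau1 = (theta1 + theta1 * theta2) / denom   # theoretical lag-1 autocorrelation
tau2 = theta2 / denom                       # theoretical lag-2 autocorrelation

# Simulate y_t = u_t + theta1*u_{t-1} + theta2*u_{t-2} and compare sample acf.
random.seed(0)
n = 50000
u = [random.gauss(0, 1) for _ in range(n + 2)]
y = [u[t] + theta1 * u[t - 1] + theta2 * u[t - 2] for t in range(2, n + 2)]

ybar = sum(y) / n
g0 = sum((v - ybar) ** 2 for v in y) / n
g1 = sum((y[t] - ybar) * (y[t - 1] - ybar) for t in range(1, n)) / n
g2 = sum((y[t] - ybar) * (y[t - 2] - ybar) for t in range(2, n)) / n
sample_tau1, sample_tau2 = g1 / g0, g2 / g0
```

With a long simulated series, the sample autocorrelations should sit close to the theoretical values, and the sample acf beyond lag 2 should be near zero.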
Solution (Cont'd)

(iii) Substituting θ_1 = −0.5 and θ_2 = 0.25:

τ_1 = (−0.5 + (−0.5)(0.25)) / (1 + (−0.5)² + 0.25²) = −0.625 / 1.3125 ≈ −0.476
τ_2 = 0.25 / 1.3125 ≈ 0.190
τ_s = 0 ∀ s > 2
ACF Plot

Thus the acf plot will appear as follows:

[Figure: the acf from part (iii) plotted against lag, s]
Autoregressive Processes

An autoregressive model of order p, an AR(p), can be expressed as

y_t = μ + Σ_{i=1}^{p} φ_i y_{t-i} + u_t

Using the lag operator notation, L^i y_t = y_{t-i} (so that L y_t = y_{t-1}), this can be written

y_t = μ + Σ_{i=1}^{p} φ_i L^i y_t + u_t

or

φ(L) y_t = μ + u_t

where φ(L) = (1 − φ_1 L − φ_2 L² − ... − φ_p L^p).
The Stationarity Condition

Under what circumstances is an AR(p) model stationary, so that it has an MA(∞) representation? The condition is that the roots of

1 − φ_1 z − φ_2 z² − ... − φ_p z^p = 0

all lie outside the unit circle.
For a stationary AR(p), the Wold decomposition is

y_t = ψ(L) u_t

where

ψ(L) = φ(L)^{-1} = (1 − φ_1 L − φ_2 L² − ... − φ_p L^p)^{-1}
The Moments of an Autoregressive Process

The mean of an AR(p) process is given by

E(y_t) = μ / (1 − φ_1 − φ_2 − ... − φ_p)

The autocorrelations can be obtained by solving the Yule–Walker equations:

τ_1 = φ_1 + τ_1 φ_2 + ... + τ_{p-1} φ_p
τ_2 = τ_1 φ_1 + φ_2 + ... + τ_{p-2} φ_p
...
τ_p = τ_{p-1} φ_1 + τ_{p-2} φ_2 + ... + φ_p

If the AR model is stationary, the autocorrelation function will decay exponentially to zero.
Sample AR Problem

Consider the following simple AR(1) model:

y_t = μ + φ_1 y_{t-1} + u_t

(i) Calculate the (unconditional) mean of y_t. For the remainder of the question, set μ = 0 for simplicity.
(ii) Calculate the (unconditional) variance of y_t.
(iii) Derive the autocorrelation function for this process.
Solution

(i) The unconditional mean is obtained by taking expectations:

E(y_t) = E(μ + φ_1 y_{t-1}) = μ + φ_1 E(y_{t-1})

But also E(y_{t-1}) = μ + φ_1 E(y_{t-2}), and so on. So, substituting repeatedly,

E(y_t) = μ + φ_1 (μ + φ_1 E(y_{t-2}))
       = μ + φ_1 μ + φ_1² E(y_{t-2})
       = μ + φ_1 μ + φ_1² (μ + φ_1 E(y_{t-3}))
       = μ + φ_1 μ + φ_1² μ + φ_1³ E(y_{t-3})
Solution (Cont'd)

An infinite number of such substitutions would give

E(y_t) = μ (1 + φ_1 + φ_1² + ...) = μ / (1 − φ_1)

provided |φ_1| < 1, so that the series converges.
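The geometric-series step above is easy to verify numerically; a small sketch with illustrative (assumed) values μ = 0.5 and φ_1 = 0.8:

```python
mu, phi1 = 0.5, 0.8   # illustrative parameter values (assumed)

# Partial sum of mu*(1 + phi1 + phi1^2 + ...) versus the closed form mu/(1 - phi1)
partial = sum(mu * phi1 ** i for i in range(200))
closed_form = mu / (1 - phi1)
```

With |φ_1| < 1 the partial sums converge rapidly to the closed-form unconditional mean.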
Solution (Cont'd)

(ii) For the variance, set μ = 0 for simplicity, so that y_t = φ_1 y_{t-1} + u_t. From Wold's decomposition,

y_t = (1 − φ_1 L)^{-1} u_t
    = (1 + φ_1 L + φ_1² L² + ...) u_t
    = u_t + φ_1 u_{t-1} + φ_1² u_{t-2} + ...
Solution (Cont'd)

So

var(y_t) = E[(u_t + φ_1 u_{t-1} + φ_1² u_{t-2} + ...)(u_t + φ_1 u_{t-1} + φ_1² u_{t-2} + ...)]
         = E[u_t² + φ_1² u_{t-1}² + φ_1⁴ u_{t-2}² + ... + cross-products]
         = E[u_t² + φ_1² u_{t-1}² + φ_1⁴ u_{t-2}² + ...]   (the cross-products have zero expectation)
         = σ_u² + φ_1² σ_u² + φ_1⁴ σ_u² + ...
         = σ_u² (1 + φ_1² + φ_1⁴ + ...)
         = σ_u² / (1 − φ_1²)
Solution (Cont'd)

(iii) Turning now to the acf, the first autocovariance is

γ_1 = E[y_t y_{t-1}]
    = E[(u_t + φ_1 u_{t-1} + φ_1² u_{t-2} + ...)(u_{t-1} + φ_1 u_{t-2} + φ_1² u_{t-3} + ...)]
Solution (Cont'd)

Collecting the surviving squared terms,

γ_1 = φ_1 σ² + φ_1³ σ² + ... = φ_1 σ² / (1 − φ_1²)
Solution (Cont'd)

Using the same rules as applied above for the lag-1 covariance,

γ_2 = E[y_t y_{t-2}]
    = E[(u_t + φ_1 u_{t-1} + φ_1² u_{t-2} + ...)(u_{t-2} + φ_1 u_{t-3} + φ_1² u_{t-4} + ...)]
    = E[φ_1² u_{t-2}² + φ_1⁴ u_{t-3}² + ... + cross-products]
    = φ_1² σ² + φ_1⁴ σ² + ...
    = φ_1² σ² (1 + φ_1² + φ_1⁴ + ...)
    = φ_1² σ² / (1 − φ_1²)
Solution (Cont'd)

Similarly,

γ_3 = φ_1³ σ² / (1 − φ_1²)

and, in general,

γ_s = φ_1^s σ² / (1 − φ_1²)
Solution (Cont'd)

The autocorrelations τ_s = γ_s / γ_0 are therefore

τ_1 = φ_1
τ_2 = φ_1²
τ_3 = φ_1³
...
τ_s = φ_1^s
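The result τ_s = φ_1^s can be checked by simulating a long AR(1); the value φ_1 = 0.7, the seed and the burn-in length are illustrative assumptions:

```python
import random

phi1 = 0.7
random.seed(2)

# Simulate a long AR(1): y_t = phi1*y_{t-1} + u_t, then drop a burn-in period
y = [0.0]
for _ in range(52000):
    y.append(phi1 * y[-1] + random.gauss(0, 1))
y = y[2000:]   # discard burn-in so the series is approximately stationary

n = len(y)
ybar = sum(y) / n
g0 = sum((v - ybar) ** 2 for v in y) / n

def tau(s):
    """Sample autocorrelation at lag s."""
    gs = sum((y[t] - ybar) * (y[t - s] - ybar) for t in range(s, n)) / n
    return gs / g0
```

The sample acf should decay geometrically, close to 0.7, 0.49, 0.343, ... at lags 1, 2, 3.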
The Partial Autocorrelation Function

The partial autocorrelation function (pacf) measures the correlation between y_t and y_{t-k} after controlling for the observations at intermediate lags. At lag 2,

φ_22 = (τ_2 − τ_1²) / (1 − τ_1²)
For an MA(q) process, the pacf will be geometrically declining.
ARMA Processes

By combining the AR(p) and MA(q) models, we can obtain an ARMA(p, q) model:

φ(L) y_t = μ + θ(L) u_t

where

φ(L) = 1 − φ_1 L − φ_2 L² − ... − φ_p L^p

and

θ(L) = 1 + θ_1 L + θ_2 L² + ... + θ_q L^q

or

y_t = μ + φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + θ_1 u_{t-1} + θ_2 u_{t-2} + ... + θ_q u_{t-q} + u_t

with

E(u_t) = 0; E(u_t²) = σ²; E(u_t u_s) = 0, t ≠ s
Since the MA terms do not affect the mean, the mean of an ARMA(p, q) process is the same as that of the corresponding AR(p):

E(y_t) = μ / (1 − φ_1 − φ_2 − ... − φ_p)
[Figures: sample acf and pacf plots against lag, s, for a range of standard ARMA processes]
Building ARMA Models: The Box–Jenkins Approach

Step 1: Involves determining the order of the model.
Use of graphical procedures (e.g. plotting the acf and pacf).
A better procedure is now available: information criteria.

Step 2: Involves estimation of the parameters of the model specified in step 1 (e.g. by least squares or maximum likelihood, depending on the model).
Some Information Criteria

Information criteria embody 2 factors: a term which is a function of the residual sum of squares (RSS), and some penalty for the loss of degrees of freedom from adding extra parameters. The object is to choose the number of parameters which minimises the value of the information criterion.
The information criteria are calculated as

AIC = ln(σ̂²) + 2k/T
SBIC = ln(σ̂²) + (k/T) ln T
HQIC = ln(σ̂²) + (2k/T) ln(ln(T))

where σ̂² is the residual variance, k = p + q + 1 is the total number of parameters estimated, and T is the sample size.
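The three criteria translate directly into one-line functions of σ̂², k and T (the numerical values used below are illustrative):

```python
import math

def aic(sigma2_hat, k, T):
    """Akaike information criterion: ln(sigma^2) + 2k/T."""
    return math.log(sigma2_hat) + 2 * k / T

def sbic(sigma2_hat, k, T):
    """Schwarz Bayesian criterion: ln(sigma^2) + (k/T) ln T."""
    return math.log(sigma2_hat) + k * math.log(T) / T

def hqic(sigma2_hat, k, T):
    """Hannan-Quinn criterion: ln(sigma^2) + (2k/T) ln(ln T)."""
    return math.log(sigma2_hat) + 2 * k * math.log(math.log(T)) / T
```

The model order minimising the chosen criterion is selected; for moderate to large T, SBIC imposes the stiffest penalty per extra parameter.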
Some Information Criteria (Cont'd)

Which criterion should be preferred if they suggest different model orders? SBIC is strongly consistent but inefficient, while AIC is not consistent and will typically select bigger models; no criterion is unambiguously superior.
ARIMA Models

As distinct from ARMA models, the "I" stands for integrated: an ARMA(p, q) model fitted to a variable that has been differenced d times is equivalent to an ARIMA(p, d, q) model for the original data.
Exponential Smoothing

Another modelling and forecasting technique. How much weight do we attach to previous observations? We expect recent observations to have the most power in helping to forecast future values of a series. The equation for the model is

S_t = α y_t + (1 − α) S_{t-1}   (1)

where
α is the smoothing constant, with 0 ≤ α ≤ 1,
y_t is the current realised value,
S_t is the current smoothed value.
Exponential Smoothing (Cont'd)

Lagging equation (1) by one period gives

S_{t-1} = α y_{t-1} + (1 − α) S_{t-2}   (2)

and substituting (2) into (1),

S_t = α y_t + (1 − α)(α y_{t-1} + (1 − α) S_{t-2})   (3)
S_t = α y_t + α(1 − α) y_{t-1} + (1 − α)² S_{t-2}   (4)
Exponential Smoothing (Cont'd)

Substituting repeatedly for the lagged smoothed values in

S_t = α y_t + α(1 − α) y_{t-1} + (1 − α)² S_{t-2}

back to the initial value S_0 gives

S_t = α Σ_{i=0}^{t-1} (1 − α)^i y_{t-i} + (1 − α)^t S_0
Exponential Smoothing (Cont'd)

Forecasts are generated by

f_{t,s} = S_t

for all steps into the future s = 1, 2, . . .

This technique is called single (or simple) exponential smoothing.
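The recursion (1) and its expanded form are equivalent, which a few lines of Python can confirm (α, the series and S_0 are illustrative assumptions):

```python
alpha = 0.3                      # smoothing constant (illustrative)
y = [1.0, 2.0, 0.5, 3.0, 2.5]    # observed series (illustrative)
S0 = 1.5                         # initial smoothed value (illustrative)

# Recursive form: S_t = alpha*y_t + (1 - alpha)*S_{t-1}
S = S0
for v in y:
    S = alpha * v + (1 - alpha) * S
recursive = S

# Expanded form: S_T = alpha * sum_{i=0}^{T-1} (1-alpha)^i y_{T-i} + (1-alpha)^T S_0
T = len(y)
expanded = alpha * sum((1 - alpha) ** i * y[T - 1 - i] for i in range(T)) \
           + (1 - alpha) ** T * S0

# Single exponential smoothing forecast: f_{t,s} = S_t for every horizon s
forecast = recursive
```

Note that the forecast is flat: the same smoothed value is used at every horizon s.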
Forecasting in Econometrics
Forecasting = prediction.
An important test of the adequacy of a model.
e.g.
Forecasting tomorrow's return on a particular share
Forecasting the price of a house given its characteristics
Forecasting the riskiness of a portfolio over the next year
Forecasting the volatility of bond returns
We can distinguish two approaches:
Econometric (structural) forecasting
Time series forecasting
The distinction between the two types is somewhat blurred (e.g., VARs).
[Figure: in-sample estimation period, Jan 1990 – Dec 1998; out-of-sample forecast period, Jan 1999 – Dec 1999]
How to Produce Forecasts

Forecasts are formed using conditional expectations:

E(y_{t+1} | Ω_t)

We cannot forecast a white noise process:

E(u_{t+s} | Ω_t) = 0 ∀ s > 0

The two simplest forecasting "methods":
1. Assume no change: f(y_{t+s}) = y_t
2. Forecasts are the long-term average: f(y_{t+s}) = ȳ
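The two simple methods can be written down directly (the short series below is an illustrative assumption):

```python
series = [1.0, 3.0, 2.0]   # illustrative observed data

def no_change_forecast(y, s):
    """'Assume no change': f(y_{t+s}) = y_t for every horizon s."""
    return y[-1]

def long_term_average_forecast(y, s):
    """Forecast with the long-term average: f(y_{t+s}) = ybar."""
    return sum(y) / len(y)
```

The first is the optimal forecast for a random walk; the second for a series that is white noise around a constant mean.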
Structural models, e.g.

y = Xβ + u

are not primarily useful for forecasting, because forecasting y requires forecasts of the future values of the explanatory variables. If we simply replace the x's by their unconditional mean values, the forecast of y's future value collapses to

E(y_t) = β_1 + β_2 x̄_2 + β_3 x̄_3 + ... + β_k x̄_k = ȳ !!
Time series models are generally better suited to forecasting. Models include:
simple unweighted averages
exponentially weighted averages
ARIMA models
non-linear models, e.g. threshold models, GARCH, bilinear models, etc.
Forecasting with ARMA Models

The forecasting model typically used is of the form

f_{t,s} = Σ_{i=1}^{p} a_i f_{t,s-i} + Σ_{j=1}^{q} b_j u_{t+s-j}

where
f_{t,s} = y_{t+s}, s ≤ 0
u_{t+s} = 0, s > 0
u_{t+s} = u_{t+s}, s ≤ 0
Forecasting with MA Models

An MA(q) model has a memory of only q periods, so an MA(3) model, for example, can only be used to forecast up to three steps ahead.
Forecasting with MA Models (Cont'd)

For horizons beyond the memory of an MA(3) process, the forecast is just the unconditional mean:

f_{t,s} = E(y_{t+s} | Ω_t) = μ ∀ s ≥ 4
Forecasting with AR Models

Say we have an AR(2) model:

y_t = μ + φ_1 y_{t-1} + φ_2 y_{t-2} + u_t

The one-step-ahead forecast is

f_{t,1} = μ + φ_1 y_t + φ_2 y_{t-1}

and the two-step-ahead forecast is

E(y_{t+2} | Ω_t) = f_{t,2} = μ + φ_1 f_{t,1} + φ_2 y_t
Forecasting with AR Models (Cont'd)

Unobserved values of y are replaced by their forecasts, etc., so in general

f_{t,s} = μ + φ_1 f_{t,s-1} + φ_2 f_{t,s-2}
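The recursive forecast rule above iterates forward mechanically; a sketch with illustrative (assumed) AR(2) parameter values and last observations:

```python
mu, phi1, phi2 = 0.2, 0.5, -0.3   # illustrative AR(2) parameters (assumed)
y_t, y_tm1 = 1.0, 0.8             # the last two observed values (assumed)

# f[s] holds f_{t,s}; for s <= 0 the "forecast" is the observed value itself
f = {-1: y_tm1, 0: y_t}
for s in range(1, 201):
    f[s] = mu + phi1 * f[s - 1] + phi2 * f[s - 2]

long_run_mean = mu / (1 - phi1 - phi2)   # unconditional mean of the process
```

For a stationary AR, the forecasts converge to the unconditional mean μ/(1 − φ_1 − φ_2) as the horizon s grows.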
Forecast Accuracy

How can we test whether a forecast is accurate or not? One popular criterion is the mean squared error (MSE):

MSE = (1/N) Σ_{t=1}^{N} (y_{t+s} − f_{t,s})²
Another popular criterion is the mean absolute percentage error (MAPE):

MAPE = (100/N) Σ_{t=1}^{N} | (y_{t+s} − f_{t,s}) / y_{t+s} |
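Both criteria translate directly into code; any matched actual/forecast series can be passed in:

```python
def mse(actual, forecast):
    """Mean squared error: (1/N) * sum (y_{t+s} - f_{t,s})^2."""
    n = len(actual)
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / n

def mape(actual, forecast):
    """Mean absolute percentage error: (100/N) * sum |(y - f)/y|."""
    n = len(actual)
    return (100.0 / n) * sum(abs((a - f) / a) for a, f in zip(actual, forecast))
```

Note that MAPE is undefined whenever a realised value is exactly zero, which limits its use for returns data.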
where N is the number of out-of-sample forecasts, y_{t+s} is the realised value and f_{t,s} is the forecast of it.
Forecast Evaluation: An Example

[Table: actual values 0.40, 0.20, 0.10, 0.10, 0.05 over five forecast horizons, together with the corresponding forecasts]
How a forecast should be judged depends partly on what we wanted it for: in financial applications, criteria such as correctly forecasting the sign of a return, or its turning points, can matter more than minimising the MSE.
Purely judgemental forecasts are subject to well-documented behavioural biases, including:
over-confidence
inconsistency
recency
anchoring
illusory patterns
group-think.