ECON2209
Slides 05
BF-05
Lecture Plan
Big picture: decompose a time series into trend, seasonal and remaining components,

    y_t = m_t + s_t + x_t,

where the seasonal component is periodic with period p,

    s_{t+p} = s_t,   Σ_{k=1}^{p} s_{t+k} = 0,

and the remaining component has E(x_t) = 0.
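The decomposition above can be made concrete with a small simulation; the choices below (quarterly period p = 4, a linear trend, white-noise x_t) are illustrative assumptions only:

```python
import numpy as np

# Illustrative unobserved-components construction: y_t = m_t + s_t + x_t.
# Period p = 4 (quarterly); the seasonal pattern is chosen to sum to zero.
rng = np.random.default_rng(0)
T, p = 80, 4
t = np.arange(T)

m = 10 + 0.5 * t                             # trend component m_t
pattern = np.array([1.0, -0.5, 2.0, -2.5])   # one period; sums to zero
s = np.tile(pattern, T // p)                 # periodic: s_{t+p} = s_t
x = rng.normal(0.0, 1.0, size=T)             # E(x_t) = 0

y = m + s + x

# Any full period of the seasonal component sums to zero
print(s[:p].sum())  # 0.0
```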
Stochastic processes
Strict stationarity and covariance stationarity
Autocorrelation (ACF) and partial ACF
White noise and its ACF sampling distribution
Wold's theorem
Conditional expectations
For fixed t (e.g. t = 30), y_t is a random variable.

[Figure: sample paths of a stochastic process y_t]
Main issues

Understand the characteristics of a SP via a time series, which is a realisation of the SP.
In particular, find out the SP's dependence structure.
Expectation operator E

Many characteristics of a SP are given by expected values.
Here are some useful rules of mathematical expectation.
Let X and Y be random variables and a be a constant.
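The rules referred to above (e.g. linearity of expectations) can be checked by simulation. The sketch below, with arbitrary illustrative distributions, verifies that E(aX + Y) = aE(X) + E(Y) holds in the sample analogues:

```python
import numpy as np

# Monte Carlo check of linearity of expectations: E(aX + Y) = aE(X) + E(Y).
# The distributions and the constant a are arbitrary illustrative choices.
rng = np.random.default_rng(42)
n = 1_000_000
X = rng.normal(loc=2.0, scale=1.0, size=n)   # E(X) = 2
Y = rng.exponential(scale=3.0, size=n)       # E(Y) = 3
a = 5.0

lhs = np.mean(a * X + Y)           # sample analogue of E(aX + Y)
rhs = a * np.mean(X) + np.mean(Y)  # a * E_hat(X) + E_hat(Y)
print(lhs, rhs)  # the two agree exactly; both are close to 5*2 + 3 = 13
```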
[Figure: a realised time series Y_t plotted against Time]
[Figure: estimated cycle component of a time series plotted over the sample period]
Auto-covariance

Covariance of x and y:

    Cov(x, y) = E[(x − μ_x)(y − μ_y)],   μ_x = E(x),   μ_y = E(y).

Auto-covariance of y_t at displacement τ:

    γ(τ) = Cov(y_t, y_{t−τ}),   τ = 0, 1, 2, 3, ....
Sample autocovariance

    ȳ = (1/T) Σ_{t=1}^{T} y_t,

    γ̂(τ) = (1/T) Σ_{t=τ+1}^{T} (y_t − ȳ)(y_{t−τ} − ȳ),   τ = 0, 1, 2, 3, ....
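A direct implementation of the two formulas above might look like this (a minimal NumPy sketch):

```python
import numpy as np

def sample_autocov(y, tau):
    """gamma_hat(tau) = (1/T) * sum_{t=tau+1}^{T} (y_t - ybar)(y_{t-tau} - ybar).
    Note the divisor is T (not T - tau), matching the formula above."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    return (d[tau:] * d[:T - tau]).sum() / T

y = [1.0, 2.0, 3.0, 4.0]
print(sample_autocov(y, 0))  # 1.25  (sample variance with divisor T)
print(sample_autocov(y, 1))  # 0.3125
```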
Autocorrelation (ACF)

    ρ(τ) = Cov(y_t, y_{t−τ}) / √[Var(y_t) Var(y_{t−τ})] = γ(τ)/γ(0),   τ = 0, 1, 2, 3, ....

Sample ACF:

    ρ̂(τ) = γ̂(τ)/γ̂(0),   τ = 0, 1, 2, 3, ....
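Putting the definitions together, the sample ACF divides each sample autocovariance by γ̂(0); a minimal sketch:

```python
import numpy as np

def sample_acf(y, max_lag):
    """rho_hat(tau) = gamma_hat(tau) / gamma_hat(0), tau = 0..max_lag,
    using divisor-T sample autocovariances."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    gamma = np.array([(d[tau:] * d[:T - tau]).sum() / T
                      for tau in range(max_lag + 1)])
    return gamma / gamma[0]

r = sample_acf([1.0, 2.0, 3.0, 4.0], 1)
print(r)  # rho_hat(0) = 1, rho_hat(1) = 0.25
```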
Patterns of ACF
Exponential decay, with or without cut-off (stationary series)
Close to 1 with very slow decay (non-stationary, unit root)
Seasonal peaks (presence of seasonality)
ACF: exponential decline with cut-off

[Figure: correlograms illustrating the ACF patterns above]
Sample PACF

Computation of p̂(τ):
p̂(0) = 1;
For each τ = 1, 2, 3, ..., run OLS of y_t on [1, y_{t−1}, ..., y_{t−τ}] and set
p̂(τ) = estimated coefficient on y_{t−τ}.

e.g.
p̂(1) is the estimated β₁₁ in the regression y_t = β₁₀ + β₁₁ y_{t−1} + error.
p̂(2) is the estimated β₂₂ in the regression y_t = β₂₀ + β₂₁ y_{t−1} + β₂₂ y_{t−2} + error.
p̂(3) is the estimated β₃₃ in the regression y_t = β₃₀ + β₃₁ y_{t−1} + β₃₂ y_{t−2} + β₃₃ y_{t−3} + error.
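The regression recipe above can be sketched directly with NumPy least squares; the AR(1) demo series and its parameters are illustrative assumptions:

```python
import numpy as np

def pacf_ols(y, max_lag):
    """Sample PACF by the regression method: p_hat(tau) is the OLS
    coefficient on y_{t-tau} in a regression of y_t on
    [1, y_{t-1}, ..., y_{t-tau}]."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    out = [1.0]                       # p_hat(0) = 1
    for tau in range(1, max_lag + 1):
        Y = y[tau:]                   # response y_t
        X = np.column_stack(
            [np.ones(T - tau)] + [y[tau - j: T - j] for j in range(1, tau + 1)]
        )
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        out.append(beta[-1])          # coefficient on the longest lag
    return np.array(out)

# Demo on a simulated AR(1): the PACF should be near b at lag 1
# and near zero at higher lags.
rng = np.random.default_rng(0)
T, a, b = 2000, 0.5, 0.8
y = np.empty(T)
y[0] = a / (1 - b)
for t in range(1, T):
    y[t] = a + b * y[t - 1] + rng.standard_normal()

p = pacf_ols(y, 3)
print(p[1])  # close to 0.8
print(p[2])  # close to 0
```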
Theoretical PACF

The theoretical PACF can be described as the limit of the sample PACF as the sample size increases to infinity.
Equivalently, p(τ) is defined as a function of the autocovariances [γ(0), γ(1), ..., γ(τ)]: p(τ) = a_τ, the last element of the solution (a_1, ..., a_τ) to the system

    ⎡ γ(0)    γ(1)    ⋯  γ(τ−1) ⎤ ⎡ a_1 ⎤   ⎡ γ(1) ⎤
    ⎢ γ(1)    γ(0)    ⋯  γ(τ−2) ⎥ ⎢ a_2 ⎥ = ⎢ γ(2) ⎥
    ⎢   ⋮        ⋮     ⋱     ⋮   ⎥ ⎢  ⋮  ⎥   ⎢  ⋮   ⎥
    ⎣ γ(τ−1)  γ(τ−2)  ⋯  γ(0)  ⎦ ⎣ a_τ ⎦   ⎣ γ(τ) ⎦
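The system above can be solved numerically for each τ; the AR(1) autocovariances used to check it (γ(τ) = b^τ σ²/(1 − b²), b = 0.8) are an illustrative choice:

```python
import numpy as np

def pacf_from_autocov(gamma):
    """Theoretical PACF from autocovariances gamma[0..K]: p(tau) is the
    last element of the solution to the tau-by-tau system above."""
    K = len(gamma) - 1
    out = [1.0]
    for tau in range(1, K + 1):
        G = np.array([[gamma[abs(i - j)] for j in range(tau)]
                      for i in range(tau)])        # Toeplitz matrix of gammas
        rhs = np.array(gamma[1: tau + 1])
        out.append(np.linalg.solve(G, rhs)[-1])
    return np.array(out)

# AR(1) check: gamma(tau) = b**tau * sigma2 / (1 - b**2) gives the
# textbook PACF p(1) = b, p(tau) = 0 for tau >= 2.
b, sigma2 = 0.8, 1.0
gamma = [b ** tau * sigma2 / (1 - b ** 2) for tau in range(5)]
p = pacf_from_autocov(gamma)
print(p)  # 1, 0.8, then zeros (up to floating-point error)
```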
[Figure: components Y_SF, Y_SA, Y_TC and Y_IR of a series plotted over 1982–2004]
White noise: ACF and PACF

For a white-noise process,

    ρ(τ) = p(τ) = 1, if τ = 0;   0, if τ ≠ 0.
Sample ACF of white noise

For a white-noise sample {ε_t, t = 1, ..., T}, let

    ρ̂(τ) = γ̂(τ)/γ̂(0),

    γ̂(τ) = (1/T) Σ_{t=τ+1}^{T} (ε_t − ε̄)(ε_{t−τ} − ε̄),   τ = 0, 1, 2, 3, ....

Then, approximately,

    T ρ̂²(τ) ~ χ²(1).
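The χ²(1) approximation can be checked by simulation: draw many white-noise samples and compare the average of T ρ̂²(1) with the χ²(1) mean of 1. The sample size and replication count below are illustrative:

```python
import numpy as np

def rho1(e):
    """Sample first-order autocorrelation (the divisor-T terms cancel)."""
    d = e - e.mean()
    return (d[1:] * d[:-1]).sum() / (d * d).sum()

# Simulate many white-noise samples and look at T * rho_hat(1)^2;
# its distribution should be close to chi-squared(1), which has mean 1.
rng = np.random.default_rng(1)
T, reps = 200, 2000
stats = np.array([T * rho1(rng.standard_normal(T)) ** 2 for _ in range(reps)])
mean_stat = stats.mean()
print(mean_stat)  # close to 1
```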
Ljung–Box test

To test that the first m autocorrelations are jointly zero (as under white noise), use

    Q_LB = T(T + 2) Σ_{τ=1}^{m} ρ̂²(τ)/(T − τ),   approx. ~ χ²(m).
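A sketch of the statistic in NumPy; the white-noise and AR(1) comparison series (T = 500, b = 0.8) are illustrative choices:

```python
import numpy as np

def ljung_box(y, m):
    """Q_LB = T(T+2) * sum_{tau=1}^{m} rho_hat(tau)^2 / (T - tau)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    g0 = (d * d).sum() / T
    rho = np.array([(d[tau:] * d[:T - tau]).sum() / T / g0
                    for tau in range(1, m + 1)])
    return T * (T + 2) * (rho ** 2 / (T - np.arange(1, m + 1))).sum()

rng = np.random.default_rng(2)
T = 500
wn = rng.standard_normal(T)               # white noise: Q should be modest
ar = np.empty(T)
ar[0] = 0.0
for t in range(1, T):                     # AR(1): strong autocorrelation
    ar[t] = 0.8 * ar[t - 1] + rng.standard_normal()

q_wn = ljung_box(wn, 10)
q_ar = ljung_box(ar, 10)
print(q_wn)  # consistent with chi-squared(10)
print(q_ar)  # very large: white noise clearly rejected
```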
Wold's representation theorem

Any zero-mean covariance stationary process y_t can be written as

    y_t = Σ_{i=0}^{∞} b_i ε_{t−i},

where

    Σ_{i=0}^{∞} b_i² < ∞,   ε_t ~ WN(0, σ²),   b_0 = 1.

The ε_{t−i}, i = 0, 1, 2, ..., are the innovations of y_t.
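For example, a stationary AR(1) y_t = b y_{t−1} + ε_t has Wold coefficients b_i = b^i, and truncating the infinite sum recovers the AR(1) variance σ²/(1 − b²). A quick check (b = 0.8, σ² = 1 are illustrative):

```python
import numpy as np

b, sigma2 = 0.8, 1.0
bi = b ** np.arange(200)          # Wold coefficients b_i = b**i, b_0 = 1
sq_sum = (bi ** 2).sum()          # sum of b_i^2: finite (square-summable)

wold_var = sigma2 * sq_sum        # Var(y_t) implied by the Wold form
ar1_var = sigma2 / (1 - b ** 2)   # textbook AR(1) variance

print(wold_var, ar1_var)  # both approximately 2.7778
```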
Conditional moments

Unconditional moments E(y_t), Var(y_t), Cov(y_t, y_{t+τ}) do not take into account observed information.
The information set Ω_{t−1} = {y_{t−1}, y_{t−2}, y_{t−3}, ...} holds observations up to t − 1.
The moments conditional on Ω_{t−1},

    E(y_t|Ω_{t−1}),   Var(y_t|Ω_{t−1}),   Cov(y_t, y_{t+τ}|Ω_{t−1}),

are the most relevant in forecasting.
e.g. [Figure: a series (in '000), Feb-78 to Feb-98, with forecasts formed at Feb-82, Feb-88 and Feb-94]
Conditional moments

e.g. Conditional mean and variance of an AR(1):

    Ω_{t−1} = {y_{t−1}, y_{t−2}, y_{t−3}, ...},
    y_t = a + b y_{t−1} + ε_t,   t = ..., −2, −1, 0, 1, 2, 3, ...,
    ε_t ~ iid WN(0, σ²), independent of Ω_{t−1},
    (a, b, σ²) being constant parameters.

    E(y_t|Ω_{t−1}) = E(a + b y_{t−1} + ε_t | Ω_{t−1})
                   = E(a + b y_{t−1} | Ω_{t−1}) + E(ε_t | Ω_{t−1})   (by Rule 2)
                   = a + b y_{t−1} + E(ε_t)                           (by Rules 3, 4)
                   = a + b y_{t−1}.

    Var(y_t|Ω_{t−1}) = E{[y_t − E(y_t|Ω_{t−1})]² | Ω_{t−1}}
                     = E(ε_t² | Ω_{t−1}) = E(ε_t²)                    (by Rule 4)
                     = σ².
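Both results can be verified by simulation: fix y_{t−1}, draw many ε_t, and compare the sample mean and variance of y_t with a + b·y_{t−1} and σ². The parameter values are illustrative:

```python
import numpy as np

a, b, sigma = 0.5, 0.8, 1.0    # illustrative AR(1) parameters
y_prev = 2.0                   # conditioning value y_{t-1}

rng = np.random.default_rng(3)
eps = rng.normal(0.0, sigma, size=1_000_000)
y_t = a + b * y_prev + eps     # draws of y_t given y_{t-1} = 2.0

print(y_t.mean())  # close to a + b*y_prev = 2.1
print(y_t.var())   # close to sigma^2 = 1.0
```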
Summary

What is a stochastic process (SP)?
Which component of a time series is treated as a SP?
What is a strictly stationary SP? A covariance stationary SP? What are the differences/similarities?
What is the autocorrelation function (ACF) of a SP? And the partial ACF (PACF)?
How do you compute the sample ACF and PACF?
What is a white noise (WN) process? Is it stationary? What are its ACF and PACF?
What does Wold's representation theorem tell you?
How do you find conditional expectations?