
Introduction to Time Series Analysis. Lecture 5.

Peter Bartlett
www.stat.berkeley.edu/~bartlett/courses/153-fall2010
Last lecture:
1. ACF, sample ACF
2. Properties of the sample ACF
3. Convergence in mean square
Introduction to Time Series Analysis. Lecture 5.
Peter Bartlett
www.stat.berkeley.edu/~bartlett/courses/153-fall2010
1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models
AR(1) as a linear process
Let $\{X_t\}$ be the stationary solution to $X_t - \phi X_{t-1} = W_t$, where
$W_t \sim WN(0, \sigma^2)$.

If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}$$
is the unique solution:
- This infinite sum converges in mean square, since $|\phi| < 1$ implies $\sum_j |\phi|^j < \infty$.
- It satisfies the AR(1) recurrence.
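A quick numerical check of this representation (a sketch, not part of the lecture; the values $\phi = 0.6$, $\sigma = 1$ and the truncation point are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)
phi, sigma = 0.6, 1.0  # illustrative values with |phi| < 1

# Simulate a long AR(1) path from the recurrence X_t = phi*X_{t-1} + W_t.
n = 1000
W = rng.normal(0, sigma, size=n)
X = np.zeros(n)
for t in range(1, n):
    X[t] = phi * X[t - 1] + W[t]

# Compare the simulated value with the truncated linear-process sum
# sum_{j=0}^{m} phi^j W_{t-j}; the neglected tail is O(phi^m).
m, t = 50, n - 1
approx = sum(phi**j * W[t - j] for j in range(m + 1))
print(X[t], approx)  # agree to many decimal places
```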
AR(1) in terms of the back-shift operator
We can write
$$X_t - \phi X_{t-1} = W_t \quad\Leftrightarrow\quad \underbrace{(1 - \phi B)}_{\phi(B)}\, X_t = W_t \quad\Leftrightarrow\quad \phi(B)\, X_t = W_t.$$
Recall that $B$ is the back-shift operator: $B X_t = X_{t-1}$.
AR(1) in terms of the back-shift operator
Also, we can write
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j} \quad\Leftrightarrow\quad X_t = \underbrace{\left(\sum_{j=0}^{\infty} \phi^j B^j\right)}_{\psi(B)} W_t \quad\Leftrightarrow\quad X_t = \psi(B)\, W_t.$$
AR(1) in terms of the back-shift operator
With these definitions:
$$\psi(B) = \sum_{j=0}^{\infty} \phi^j B^j \qquad\text{and}\qquad \phi(B) = 1 - \phi B,$$
we can check that $\psi(B) = \phi(B)^{-1}$:
$$\psi(B)\phi(B) = \sum_{j=0}^{\infty} \phi^j B^j \,(1 - \phi B) = \sum_{j=0}^{\infty} \phi^j B^j - \sum_{j=1}^{\infty} \phi^j B^j = 1.$$
Thus,
$$\phi(B) X_t = W_t \;\Leftrightarrow\; \psi(B)\phi(B) X_t = \psi(B) W_t \;\Leftrightarrow\; X_t = \psi(B) W_t.$$
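The cancellation in $\psi(B)\phi(B)$ can also be seen numerically by multiplying truncated coefficient sequences (a sketch; $\phi = 0.6$ and the truncation order are arbitrary choices):

```python
import numpy as np

phi, m = 0.6, 20  # illustrative coefficient and truncation order

psi = phi ** np.arange(m + 1)      # psi(B) = 1 + phi*B + phi^2*B^2 + ...
phi_poly = np.array([1.0, -phi])   # phi(B) = 1 - phi*B, ascending powers of B

# Coefficients of the product psi(B)*phi(B): everything cancels except the
# constant term 1, plus a single -phi^{m+1} left over from the truncation.
print(np.round(np.convolve(psi, phi_poly), 10))
```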
AR(1) in terms of the back-shift operator
Notice that manipulating operators like $\psi(B)$, $\phi(B)$ is like manipulating polynomials:
$$\frac{1}{1 - \phi z} = 1 + \phi z + \phi^2 z^2 + \phi^3 z^3 + \cdots,$$
provided $|\phi| < 1$ and $|z| \le 1$.
Introduction to Time Series Analysis. Lecture 5.
1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models
AR(1) and Causality
Let $\{X_t\}$ be the stationary solution to
$$X_t - \phi X_{t-1} = W_t, \qquad\text{where } W_t \sim WN(0, \sigma^2).$$
If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}.$$
- $\phi = 1$?
- $\phi = -1$?
- $|\phi| > 1$?
AR(1) and Causality
If $|\phi| > 1$, $\psi(B) W_t$ does not converge.
But we can rearrange
$$X_t = \phi X_{t-1} + W_t \quad\text{as}\quad X_{t-1} = \frac{1}{\phi} X_t - \frac{1}{\phi} W_t,$$
and we can check that the unique stationary solution is
$$X_t = -\sum_{j=1}^{\infty} \phi^{-j} W_{t+j}.$$
But... $X_t$ depends on future values of $W_t$.
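We can verify by direct substitution that this forward-looking series satisfies the recurrence (a sketch; $\phi = 2$ and the truncation level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
phi, n, m = 2.0, 500, 60  # |phi| > 1; m truncates the infinite sum

W = rng.normal(size=n + m)

# X_t = -sum_{j=1}^{m} phi^{-j} W_{t+j}: built entirely from *future* noise.
X = np.array([-sum(phi**-j * W[t + j] for j in range(1, m + 1))
              for t in range(n)])

# Check the AR(1) recurrence X_t = phi*X_{t-1} + W_t.
resid = X[1:] - (phi * X[:-1] + W[1:n])
print(np.max(np.abs(resid)))  # ~ phi^{-m}: numerically zero
```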
Causality
A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$) if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots$$
with
$$\sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B) W_t$.
AR(1) and Causality
Causality is a property of $\{X_t\}$ and $\{W_t\}$.
Consider the AR(1) process defined by $\phi(B) X_t = W_t$ (with $\phi(B) = 1 - \phi B$):
$\phi(B) X_t = W_t$ is causal
iff $|\phi| < 1$
iff the root $z_1$ of the polynomial $\phi(z) = 1 - \phi z$ satisfies $|z_1| > 1$.
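As a small sketch (the helper name and test values are my own), this root condition is easy to check directly:

```python
def ar1_is_causal(phi: float) -> bool:
    """Causality of (1 - phi*B) X_t = W_t via the root of phi(z) = 1 - phi*z."""
    if phi == 0:
        return True  # phi(z) = 1 has no root; X_t = W_t is trivially causal
    return abs(1.0 / phi) > 1  # root z_1 = 1/phi must lie outside the unit circle

print(ar1_is_causal(0.6), ar1_is_causal(1.2))  # True False
```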
AR(1) and Causality
Consider the AR(1) process $\phi(B) X_t = W_t$ (with $\phi(B) = 1 - \phi B$):
If $|\phi| > 1$, we can define an equivalent causal model,
$$X_t - \phi^{-1} X_{t-1} = \tilde{W}_t,$$
where $\tilde{W}_t$ is a new white noise sequence.
AR(1) and Causality
Is an MA(1) process causal?
Introduction to Time Series Analysis. Lecture 5.
1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models
MA(1) and Invertibility
Define
$$X_t = W_t + \theta W_{t-1} = (1 + \theta B) W_t.$$
If $|\theta| < 1$, we can write
$$(1 + \theta B)^{-1} X_t = W_t \;\Leftrightarrow\; (1 - \theta B + \theta^2 B^2 - \theta^3 B^3 + \cdots)\, X_t = W_t \;\Leftrightarrow\; \sum_{j=0}^{\infty} (-\theta)^j X_{t-j} = W_t.$$
That is, we can write $W_t$ as a causal function of $X_t$.
We say that this MA(1) is invertible.
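A numerical illustration of this inversion (a sketch; $\theta = 0.5$ and the truncation order are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, m = 0.5, 500, 40  # |theta| < 1, so the MA(1) is invertible

W = rng.normal(size=n)
X = W.copy()
X[1:] += theta * W[:-1]  # X_t = W_t + theta*W_{t-1}

# Recover W_t from the observed X's: W_t ~ sum_{j=0}^{m} (-theta)^j X_{t-j}.
t = n - 1
W_hat = sum((-theta) ** j * X[t - j] for j in range(m + 1))
print(W[t], W_hat)  # agree up to an O(theta^m) truncation error
```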
MA(1) and Invertibility
$$X_t = W_t + \theta W_{t-1}$$
If $|\theta| > 1$, the sum $\sum_{j=0}^{\infty} (-\theta)^j X_{t-j}$ diverges, but we can write
$$W_{t-1} = -\theta^{-1} W_t + \theta^{-1} X_t.$$
Just like the noncausal AR(1), we can show that
$$W_t = -\sum_{j=1}^{\infty} (-\theta)^{-j} X_{t+j}.$$
That is, we can write $W_t$ as a linear function of $X_t$, but it is not causal.
We say that this MA(1) is not invertible.
Invertibility
A linear process $\{X_t\}$ is invertible (strictly, an invertible function of $\{W_t\}$) if there is a
$$\pi(B) = \pi_0 + \pi_1 B + \pi_2 B^2 + \cdots$$
with
$$\sum_{j=0}^{\infty} |\pi_j| < \infty$$
and $W_t = \pi(B) X_t$.
MA(1) and Invertibility
Invertibility is a property of $\{X_t\}$ and $\{W_t\}$.
Consider the MA(1) process defined by $X_t = \theta(B) W_t$ (with $\theta(B) = 1 + \theta B$):
$X_t = \theta(B) W_t$ is invertible
iff $|\theta| < 1$
iff the root $z_1$ of the polynomial $\theta(z) = 1 + \theta z$ satisfies $|z_1| > 1$.
MA(1) and Invertibility
Consider the MA(1) process $X_t = \theta(B) W_t$ (with $\theta(B) = 1 + \theta B$):
If $|\theta| > 1$, we can define an equivalent invertible model in terms of a new white noise sequence.
Is an AR(1) process invertible?
Introduction to Time Series Analysis. Lecture 5.
1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models
AR(p): Autoregressive models of order p
An AR(p) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t,$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
Equivalently, $\phi(B) X_t = W_t$,
where $\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$.
AR(p): Constraints on $\phi$
Recall: For p = 1 (AR(1)), $\phi(B) = 1 - \phi_1 B$.
This is an AR(1) model only if there is a stationary solution to $\phi(B) X_t = W_t$, which is equivalent to $|\phi_1| \neq 1$.
This is equivalent to the following condition on $\phi(z) = 1 - \phi_1 z$:
$$\forall z \in \mathbb{R},\ \phi(z) = 0 \;\Rightarrow\; z \neq \pm 1,$$
equivalently,
$$\forall z \in \mathbb{C},\ \phi(z) = 0 \;\Rightarrow\; |z| \neq 1,$$
where $\mathbb{C}$ is the set of complex numbers.
AR(p): Constraints on $\phi$
Stationarity: $\forall z \in \mathbb{C},\ \phi(z) = 0 \Rightarrow |z| \neq 1$, where $\mathbb{C}$ is the set of complex numbers.
$\phi(z) = 1 - \phi_1 z$ has one root at $z_1 = 1/\phi_1 \in \mathbb{R}$.
But the roots of a degree $p > 1$ polynomial might be complex.
For stationarity, we want the roots of $\phi(z)$ to avoid the unit circle, $\{z \in \mathbb{C} : |z| = 1\}$.
AR(p): Stationarity and causality
Theorem: A (unique) stationary solution to $\phi(B) X_t = W_t$ exists iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| \neq 1.$$
This AR(p) process is causal iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| > 1.$$
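In practice this root condition can be checked numerically (a sketch; the helper name is my own, and the coefficients echo the example used later in the lecture):

```python
import numpy as np

def arp_is_causal(phis) -> bool:
    """True iff every root of phi(z) = 1 - phi_1 z - ... - phi_p z^p
    lies strictly outside the unit circle."""
    # np.roots expects coefficients in descending powers of z.
    coeffs = np.r_[-np.asarray(phis, dtype=float)[::-1], 1.0]
    return bool(np.all(np.abs(np.roots(coeffs)) > 1))

# phi(z) = 1 - 0.5 z + 0.6 z^2, i.e. phi_1 = 0.5, phi_2 = -0.6:
print(arp_is_causal([0.5, -0.6]))  # True: both complex roots have modulus ~1.29
```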
Recall: Causality
A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$) if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots$$
with
$$\sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B) W_t$.
AR(p): Roots outside the unit circle imply causality (Details)
$$\forall z \in \mathbb{C},\ |z| \le 1 \;\Rightarrow\; \phi(z) \neq 0$$
$$\Rightarrow\; \exists \{\psi_j\},\ \exists \delta > 0 \text{ such that, for } |z| \le 1 + \delta,\quad \frac{1}{\phi(z)} = \sum_{j=0}^{\infty} \psi_j z^j.$$
For $|z| \le 1 + \delta$, $|\psi_j z^j| \to 0$, so
$$\exists j_0 \text{ such that } \forall j \ge j_0,\quad |\psi_j|^{1/j} \le \frac{1}{1 + \delta/2} \quad\Rightarrow\quad \sum_{j=0}^{\infty} |\psi_j| < \infty.$$
So if $|z| \le 1 \Rightarrow \phi(z) \neq 0$, then $S_m = \sum_{j=0}^{m} \psi_j B^j W_t$ converges in mean square, so we have a stationary, causal time series $X_t = \frac{1}{\phi(B)} W_t$.
Calculating $\psi$ for an AR(p): matching coefficients
Example: $X_t = \psi(B) W_t \;\Leftrightarrow\; (1 - 0.5B + 0.6B^2) X_t = W_t$,
so $1 = \psi(B)(1 - 0.5B + 0.6B^2)$:
$$1 = (\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots)(1 - 0.5B + 0.6B^2)$$
$$\Leftrightarrow\quad 1 = \psi_0, \qquad 0 = \psi_1 - 0.5\,\psi_0, \qquad 0 = \psi_2 - 0.5\,\psi_1 + 0.6\,\psi_0, \qquad 0 = \psi_3 - 0.5\,\psi_2 + 0.6\,\psi_1, \qquad \ldots$$
Calculating $\psi$ for an AR(p): example
$$1 = \psi_0, \quad 0 = \psi_j \ (j < 0), \quad 0 = \psi_j - 0.5\,\psi_{j-1} + 0.6\,\psi_{j-2}$$
$$\Leftrightarrow\quad 1 = \psi_0, \quad 0 = \psi_j \ (j < 0), \quad 0 = \phi(B)\psi_j.$$
We can solve these linear difference equations in several ways:
- numerically (as in the sketch below), or
- by guessing the form of a solution and using an inductive proof, or
- by using the theory of linear difference equations.
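A minimal numerical solver for these difference equations (a sketch; the function name is my own):

```python
import numpy as np

def psi_weights(phis, m):
    """Solve 0 = phi(B) psi_j with psi_0 = 1 and psi_j = 0 for j < 0.

    phis = [phi_1, ..., phi_p]; returns psi_0, ..., psi_m.
    """
    p, psi = len(phis), np.zeros(m + 1)
    psi[0] = 1.0
    for j in range(1, m + 1):
        # psi_j = phi_1 psi_{j-1} + ... + phi_p psi_{j-p}
        psi[j] = sum(phis[i] * psi[j - 1 - i] for i in range(min(p, j)))
    return psi

# The example above: (1 - 0.5B + 0.6B^2) X_t = W_t, so phi_1 = 0.5, phi_2 = -0.6.
print(psi_weights([0.5, -0.6], 6))  # [1, 0.5, -0.35, -0.475, ...]
```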
Calculating $\psi$ for an AR(p): general case
$$\phi(B) X_t = W_t, \qquad X_t = \psi(B) W_t,$$
so $1 = \psi(B)\phi(B)$:
$$1 = (\psi_0 + \psi_1 B + \cdots)(1 - \phi_1 B - \cdots - \phi_p B^p)$$
$$\Leftrightarrow\quad 1 = \psi_0, \qquad 0 = \psi_1 - \phi_1 \psi_0, \qquad 0 = \psi_2 - \phi_1 \psi_1 - \phi_2 \psi_0, \qquad \ldots$$
$$\Leftrightarrow\quad 1 = \psi_0, \quad 0 = \psi_j \ (j < 0), \quad 0 = \phi(B)\psi_j.$$
Introduction to Time Series Analysis. Lecture 5.
1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models
ARMA(p,q): Autoregressive moving average models
An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
AR(p) = ARMA(p, 0): $\theta(B) = 1$.
MA(q) = ARMA(0, q): $\phi(B) = 1$.
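A direct way to simulate such a process from the defining recurrence (a sketch; the function name, burn-in length, and parameter values are illustrative choices, and the model is assumed causal so the start-up transient dies out):

```python
import numpy as np

def simulate_arma(phis, thetas, n, sigma=1.0, burn=500, seed=0):
    """Simulate X_t = sum_i phi_i X_{t-i} + W_t + sum_i theta_i W_{t-i}."""
    rng = np.random.default_rng(seed)
    p, q, total = len(phis), len(thetas), n + burn
    W = rng.normal(0, sigma, size=total)
    X = np.zeros(total)
    for t in range(total):
        ar = sum(phis[i] * X[t - 1 - i] for i in range(min(p, t)))
        ma = sum(thetas[i] * W[t - 1 - i] for i in range(min(q, t)))
        X[t] = ar + W[t] + ma
    return X[burn:]  # drop the burn-in so the path is close to stationary

# Example: ARMA(2,1) with phi = (0.5, -0.6), theta = (0.4,).
x = simulate_arma([0.5, -0.6], [0.4], n=200)
print(x[:5])
```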
ARMA processes
Can accurately approximate many stationary processes:
For any stationary process with autocovariance $\gamma$, and any $k > 0$, there is an ARMA process $\{X_t\}$ for which
$$\gamma_X(h) = \gamma(h), \qquad h = 0, 1, \ldots, k.$$
ARMA(p,q): Autoregressive moving average models
An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
Usually, we insist that $\phi_p, \theta_q \neq 0$ and that the polynomials
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p, \qquad \theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$$
have no common factors. This implies it is not a lower-order ARMA model.