

Vector autoregressions
Based on the book New Introduction to Multiple Time Series Analysis by Helmut Lütkepohl
Robert M. Kunst
robert.kunst@univie.ac.at
University of Vienna
and
Institute for Advanced Studies Vienna
November 23, 2011
Outline
Introduction
Stable VAR Processes
Basic assumptions and properties
Forecasting
Structural VAR analysis
Vector error correction models
Univariate integrated processes
Integrated VAR processes
Cointegrated VAR processes
Deterministics in the VECM
Causality and impulse response analysis
Univariate integrated processes
Review: univariate integrated AR processes
If a univariate autoregressive process
y_t = ν + a_1 y_{t-1} + ... + a_p y_{t-p} + u_t
has inverted roots of its characteristic polynomial λ_1, ..., λ_p such that d roots are exactly one and the remaining roots fulfill |λ_j| < 1, then it is unstable, whereas its dth difference
Δ^d y_t = (1 - L)^d y_t
is stable. It is then called integrated of order d or I(d).
d = 1 is the most important case. The random walk is I(1).
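To see the d = 1 case in practice, here is a minimal Python sketch (my own illustration, not from the slides; numpy and the seed are arbitrary choices): it simulates an AR(1) with a unit root, i.e. a random walk, and shows that the level is unstable while the first difference behaves like stable white noise.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
u = rng.standard_normal(T)

# AR(1) with a_1 = 1 (a unit root): y_t = y_{t-1} + u_t, i.e. a random walk
y = np.cumsum(u)

# The level wanders, while the first difference (1 - L) y_t = u_t is stable white noise
print(np.var(y[: T // 2]), np.var(y[T // 2 :]))  # variances drift with the sample window
print(np.var(np.diff(y)))                        # close to 1, stable
```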
Univariate integrated processes
More general definitions of integrated processes do not assume the process to be autoregressive. Then, a process is called I(d) iff its dth difference has a Wold-type MA representation
Δ^d y_t = Σ_{j=0}^∞ θ_j u_{t-j}
with θ(1) = Σ_{j=0}^∞ θ_j ≠ 0 and Σ_{j=0}^∞ j |θ_j| < ∞.
The flanking conditions guarantee uniqueness of d and convergence of the so-called Beveridge-Nelson decomposition. Other authors may require different flanking conditions.
The Beveridge-Nelson decomposition
One can show that every I(1) process has a unique representation of the form
y_t = y_0 + θ(1) Σ_{s=1}^t u_s + Σ_{j=0}^∞ θ*_j u_{t-j} - w*_0,
with θ*_j = -Σ_{i=j+1}^∞ θ_i and w*_0 = Σ_{j=0}^∞ θ*_j u_{-j} reflecting starting conditions.
Essentially, any I(1) process can be decomposed into a random walk and a stable remainder. A comparable decomposition for multivariate processes was established by the Granger representation theorem.
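As a numerical check of the formula (my own sketch, not from the slides), take Δy_t = u_t + θ u_{t-1}, so that θ(1) = 1 + θ, θ*_0 = -θ and θ*_j = 0 for j ≥ 1; the I(1) series is then rebuilt exactly from its Beveridge-Nelson pieces.

```python
import numpy as np

rng = np.random.default_rng(0)
T, theta = 500, 0.5
u = rng.standard_normal(T + 1)               # u_0, ..., u_T (u_0 feeds the MA lag)

# Difference process: Delta y_t = u_t + theta * u_{t-1}
dy = u[1:] + theta * u[:-1]
y = np.concatenate(([0.0], np.cumsum(dy)))   # levels with y_0 = 0

# Beveridge-Nelson pieces: theta(1) = 1 + theta, theta*_0 = -theta, theta*_j = 0 for j >= 1
rw = (1 + theta) * np.concatenate(([0.0], np.cumsum(u[1:])))  # theta(1) * sum_{s=1}^t u_s
stat = -theta * u                                             # theta*(L) u_t
w0 = stat[0]                                                  # w*_0 = theta*_0 u_0

bn = y[0] + rw + stat - w0
print(np.allclose(y, bn))   # True: y_t = y_0 + theta(1) sum u_s + theta*(L) u_t - w*_0
```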
Integrated VAR processes
Integrated VAR processes: the concept
Consider a VAR without intercept
A(L) y_t = u_t
with the characteristic polynomial A(z) = I_K - A_1 z - ... - A_p z^p.
Formally, any square matrix A can be represented as the product of its determinant and an invertible matrix; the inverse of that matrix is called the adjoint A^adj, so that A^adj A = |A| I. Multiplying the VAR by A(L)^adj, the representation
|A(L)| y_t = A(L)^adj u_t
always works. If m roots of |A(z)| are one and the remaining ones are larger than one in modulus, i.e.
|A(L)| = (1 - L)^m φ(L),
with φ(·) invertible, the process y_t is called integrated. However, m need not be the desired d.
Two role-model examples for integrated VAR processes
The VAR(1)
y_t = [[1, 0], [0, 1]] y_{t-1} + u_t
has determinant (1 - z)^2. Here, y_t should be I(1).
The VAR(2)
y_t = [[2, 0], [0, 0]] y_{t-1} + [[-1, 0], [0, 0]] y_{t-2} + u_t
has determinant (1 - z)^2. Here, y_t should be I(2).
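Both determinants can be verified symbolically; the sketch below is my own check using sympy (not part of the slides) and also makes the point that the number of unit roots in |A(z)| need not equal the integration order.

```python
import sympy as sp

z = sp.symbols('z')
I2 = sp.eye(2)

# Example 1: VAR(1) with A_1 = I_2; |A(z)| has two unit roots, yet y_t is only I(1)
A1 = sp.Matrix([[1, 0], [0, 1]])
print(sp.factor((I2 - A1 * z).det()))                  # (z - 1)**2

# Example 2: VAR(2); same determinant, but (1 - L)^2 y_{1,t} = u_{1,t}, so y_t is I(2)
B1 = sp.Matrix([[2, 0], [0, 0]])
B2 = sp.Matrix([[-1, 0], [0, 0]])
print(sp.factor((I2 - B1 * z - B2 * z**2).det()))      # (z - 1)**2
```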
Integration order for vector autoregressions
Definition
A vector autoregressive process (y_t) (whose determinant |A(z)| has only roots at one or outside the unit circle) is called integrated of order d or I(d) iff Δ^d y_t is stable but Δ^{d-1} y_t is not stable.
Remark: Usually, the differenced processes do not have finite-order VAR representations. Some components often have individual lower integration order than d and may also be stationary.
Cointegrated VAR processes
Cointegrated VAR processes: the concept
Definition
Let (y_t) be a vector autoregressive process of order p that is integrated of order d. It is called cointegrated of order (d, b) or CI(d, b) iff there exists a linear combination z_t = β′ y_t with β = (β_1, ..., β_K)′ ≠ 0 such that z_t is I(d - b).
Remarks:
- If a component of y_t, say the first, is integrated of lower order than d, the conditions are fulfilled for β = (1, 0, ..., 0)′: there is self-cointegration, although it may not correspond to the original idea;
- The case CI(1,1) is the most important one;
- The linear combination β is called a cointegrating vector.
Cointegrated VAR: role-model case
Consider the VAR(1)
y_t = [[1, 0], [1, 0]] y_{t-1} + u_t,
which has determinant (1 - z) and is I(1). The combination (1, -1) y_t = y_{1,t} - y_{2,t} is white noise, and β = (1, -1)′ is a cointegrating vector. Note the equivalent representation
Δy_t = [[0, 0], [1, -1]] y_{t-1} + u_t,
which shows the vector β in the coefficient matrix.
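A short simulation (my own sketch, assuming standard normal noise) illustrates the point: the levels wander, but β′y_t stays bounded.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1000
A1 = np.array([[1.0, 0.0],
               [1.0, 0.0]])
u = rng.standard_normal((T, 2))

y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A1 @ y[t - 1] + u[t]     # y_1 is a random walk, y_2 tracks it with noise

z = y[:, 0] - y[:, 1]               # beta' y_t with beta = (1, -1)'
print(y[:, 0].std(), z.std())       # the level wanders, the combination stays bounded
```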
Properties of cointegrating vectors
- Cointegrating vectors are not unique. Even when there is only one cointegrating vector, all multiples will also cointegrate;
- If there are several linearly independent cointegrating vectors, all their linear combinations also cointegrate. The cointegrating vectors can be seen as a basis of a linear cointegrating space;
- The dimension of the cointegrating space, the cointegrating rank, is unique.
The vector error-correction model
Any VAR(p) model y_t = A_1 y_{t-1} + ... + A_p y_{t-p} + u_t can be re-written in the error-correction form (vector error-correction model, VECM)
Δy_t = Π y_{t-1} + Σ_{j=1}^{p-1} Γ_j Δy_{t-j} + u_t,
with a one-to-one functional relation between the parameters (A_1, ..., A_p) and (Π, Γ_1, ..., Γ_{p-1}). In particular,
Π = -(I - A_1 - ... - A_p),  Γ_j = -(A_{j+1} + ... + A_p).
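The mapping between the two parameterizations is purely mechanical; the following numpy sketch (my own, with hypothetical VAR(2) coefficients) implements it.

```python
import numpy as np

def var_to_vecm(A):
    """Map VAR matrices (A_1, ..., A_p) to VECM parameters (Pi, Gamma_1, ..., Gamma_{p-1})."""
    K = A[0].shape[0]
    Pi = -(np.eye(K) - sum(A))                             # Pi = -(I - A_1 - ... - A_p)
    Gammas = [-sum(A[j + 1:]) for j in range(len(A) - 1)]  # Gamma_j = -(A_{j+1} + ... + A_p)
    return Pi, Gammas

# Hypothetical VAR(2) coefficients, purely for illustration
A1 = np.array([[0.5, 0.1], [0.0, 0.3]])
A2 = np.array([[0.2, 0.0], [0.1, 0.4]])
Pi, Gammas = var_to_vecm([A1, A2])
print(Pi)          # -(I - A1 - A2)
print(Gammas[0])   # -A2
```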
Cointegration in the VECM
If (y_t) is integrated, Π will have reduced rank r < K. Any matrix of rank r < K can be decomposed as
Π = α β′,
with α and β of rank r and dimension K × r. Thus,
Δy_t = α β′ y_{t-1} + Σ_{j=1}^{p-1} Γ_j Δy_{t-j} + u_t,
and β contains the cointegrating vectors if (y_t) is CI(1,1). Note that all terms Δy_{t-j} (j = 0, ..., p - 1), u_t, and β′ y_{t-1} are stationary. α is called the loading matrix.
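For the role-model case above, Π = [[0, 0], [1, -1]] has rank one; the sketch below (my own, using an SVD, which is just one of many valid ways to factor a reduced-rank matrix) recovers one admissible (α, β) pair.

```python
import numpy as np

Pi = np.array([[0.0, 0.0],
               [1.0, -1.0]])             # Pi from the role-model case, rank r = 1

# Any factorization Pi = alpha @ beta.T with K x r factors will do; the SVD provides one
U, s, Vt = np.linalg.svd(Pi)
r = int(np.sum(s > 1e-10))
alpha = U[:, :r] * s[:r]                 # K x r loading matrix
beta = Vt[:r].T                          # K x r cointegrating matrix (not yet normalized)

print(np.allclose(alpha @ beta.T, Pi))   # True
print(beta.ravel())                      # proportional to (1, -1)'
```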
A suggested normalization
While Π is unique, the decomposition and hence α and β are not. A potential normalization is to impose unit length on all columns of β. Alternatively, normalize β to the form
β = [I_r ; β_{(K-r)}],
i.e. the r × r identity stacked on top of a (K - r) × r block β_{(K-r)}. With self-cointegration, the corresponding column in β is a unit vector. Stationary component variables of y_t should be among the first r positions.
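One way to obtain the I_r-on-top form is to post-multiply β by the inverse of its first r rows and adjust α so that the product αβ′ is unchanged; a sketch of my own, continuing the numpy example above:

```python
import numpy as np

def normalize(alpha, beta):
    """Renormalize Pi = alpha @ beta.T so that the top r x r block of beta is I_r."""
    r = beta.shape[1]
    T = np.linalg.inv(beta[:r, :])        # assumes the first r rows of beta are nonsingular
    beta_n = beta @ T                     # top block becomes I_r
    alpha_n = alpha @ np.linalg.inv(T).T  # keeps alpha_n @ beta_n.T = alpha @ beta.T
    return alpha_n, beta_n

# Continuing the role-model example: an unnormalized pair for Pi = [[0, 0], [1, -1]]
alpha = np.array([[0.0], [np.sqrt(2.0)]])
beta = np.array([[1.0], [-1.0]]) / np.sqrt(2.0)
alpha_n, beta_n = normalize(alpha, beta)
print(beta_n.ravel())        # [ 1. -1.]
print(alpha_n @ beta_n.T)    # [[0, 0], [1, -1]]
```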
The economic interpretation
- Economists see the vectors in β as dynamic equilibrium conditions: the stationary islands in a wild integrated sea;
- Loading matrices α describe how component variables react to deviations from equilibrium: α_{ij} shows how component i reacts to deviations from equilibrium condition j;
- The Γ_j represent short-run dynamics in the system (inertia); a brief estimation sketch showing these objects follows below.
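For applied work, these objects can be estimated directly; below is a minimal usage sketch of my own with statsmodels on data simulated from the role-model system (the class name VECM and the result attributes alpha, beta, gamma reflect my reading of the statsmodels API and should be checked against its documentation).

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(2)
T = 500
u = rng.standard_normal((T, 2))

# Simulate the role-model cointegrated system: y_1 is a random walk, y_2 tracks y_1
y = np.zeros((T, 2))
for t in range(1, T):
    y[t, 0] = y[t - 1, 0] + u[t, 0]
    y[t, 1] = y[t - 1, 0] + u[t, 1]

res = VECM(y, k_ar_diff=1, coint_rank=1).fit()
print(res.alpha)   # loading matrix, roughly (0, 1)'
print(res.beta)    # cointegrating vector, normalized to start with 1, roughly (1, -1)'
print(res.gamma)   # short-run matrix Gamma_1, roughly zero here
```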
Granger's Representation Theorem
Proposition
Suppose
Δy_t = α β′ y_{t-1} + Σ_{j=1}^{p-1} Γ_j Δy_{t-j} + u_t,  t = 1, 2, ...,
with y_t = u_t = 0 for t ≤ 0 and u_t white noise for t > 0. The corresponding polynomial C(z) is defined by
C(z) = (1 - z) I_K - α β′ z - Σ_{j=1}^{p-1} Γ_j (1 - z) z^j.
Assume (a) |C(z)| = 0 ⇒ |z| > 1 or z = 1; (b) the number of unit roots is K - r; (c) α and β have dimension K × r and rank r. Then y_t can be represented as
y_t = Ξ Σ_{j=1}^t u_j + Ξ*(L) u_t + y*_0.
Some details for the representation theorem
Ξ = β_⊥ [α′_⊥ (I_K - Σ_{j=1}^{p-1} Γ_j) β_⊥]^{-1} α′_⊥
describes the weight of the pure (K - r)-dimensional random walk in the multivariate Beveridge-Nelson decomposition. y*_0 contains initial values. Ξ*(L) u_t is a convergent stable process, with the Ξ*_j some (complex) function of the original parameters.
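For the role-model cointegrated VAR(1) (α = (0, 1)′, β = (1, -1)′, no Γ terms), Ξ can be computed directly; the sketch below is my own, with the orthogonal complements obtained from an SVD.

```python
import numpy as np

alpha = np.array([[0.0], [1.0]])     # loading vector of the role-model VECM
beta = np.array([[1.0], [-1.0]])     # cointegrating vector
Gamma_sum = np.zeros((2, 2))         # no short-run terms in this example

def orth_complement(a):
    """Return a K x (K - r) basis of the orthogonal complement of the columns of a."""
    K, r = a.shape
    U, _, _ = np.linalg.svd(a)
    return U[:, r:]                  # columns orthogonal to the column space of a

a_perp = orth_complement(alpha)      # proportional to (1, 0)'
b_perp = orth_complement(beta)       # proportional to (1, 1)'

core = a_perp.T @ (np.eye(2) - Gamma_sum) @ b_perp
Xi = b_perp @ np.linalg.inv(core) @ a_perp.T
print(Xi)   # [[1, 0], [1, 0]]: the common trend is y_1's random walk, loaded into both components
```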
Deterministics in the VECM
Deterministic terms in the VECM
Suppose
y_t = μ_t + x_t,
with x_t non-deterministic and (1) μ_t = μ_0 or (2) μ_t = μ_0 + μ_1 t.
For (1), substitute into the VECM
Δy_t = α β′ (y_{t-1} - μ_0) + Σ_{j=1}^{p-1} Γ_j Δy_{t-j} + u_t,
as Δμ_0 = 0. The term α β′ (y_{t-1} - μ_0) can be re-written as
α (β′ : -β′μ_0) (y′_{t-1}, 1)′.
Formally, this looks as if y were cointegrated with the constant 1. Note that y is not trending here.
The trending case of the VECM
Suppose case (2), i.e. μ_t = μ_0 + μ_1 t. Substitute into the VECM
Δy_t - μ_1 = α β′ {y_{t-1} - μ_0 - μ_1 (t - 1)} + Σ_{j=1}^{p-1} Γ_j (Δy_{t-j} - μ_1) + u_t,
as Δ(μ_1 t) = μ_1. The term α β′ {y_{t-1} - μ_0 - μ_1 (t - 1)} can be re-written as
α (β′ : -β′μ_1) (y′_{t-1}, t - 1)′ - α β′ μ_0.
Formally, this looks as if y were cointegrated with the linear time trend. Note that y is I(1) plus linear trend here, while β′y is trend-stationary but not stationary.
The trending case of the VECM without explicit t
If β′μ_1 = 0, then the corresponding term disappears and the system can be written as
Δy_t = ν + α β′ y_{t-1} + Σ_{j=1}^{p-1} Γ_j Δy_{t-j} + u_t,
with ν = (I_K - Γ_1 - ... - Γ_{p-1}) μ_1 - α β′ μ_0. This is the often estimated VECM for trending variables. y is I(1) plus linear trend, while β′y is stationary.
It is debatable whether the restriction relative to the previous model is natural in applications. A cointegrating vector that yields a trend-stationary variable instead of a stationary one may be of little interest.
Causality and impulse response analysis
Dynamic analysis in the VECM
Forecasting, causality, impulse response analysis: what changes in an integrated VAR relative to the stable VAR?
- Forecasting: the same calculus applies; the prediction variance increases without finite bounds as the horizon h increases, while the prediction variance in the cointegrating directions remains bounded;
- Causality: there are two sources of causality in a cointegrated VAR, short-run and long-run causality;
- IRF: response functions approach a non-zero constant in all integrated directions, while they approach zero for cointegrated directions, including potential stationary components (self-cointegration); a small illustrative sketch follows below.
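The IRF point can be seen directly in the role-model cointegrated VAR(1), where the forecast-error responses are powers of A_1 (a small sketch of my own):

```python
import numpy as np

A1 = np.array([[1.0, 0.0],
               [1.0, 0.0]])          # role-model cointegrated VAR(1)
beta = np.array([1.0, -1.0])         # cointegrating vector

Phi = np.eye(2)
for h in range(1, 11):
    Phi = Phi @ A1                   # forecast-error IRF at horizon h: Phi_h = A_1^h
    print(h, Phi[:, 0], beta @ Phi)  # response to a u_1 shock stays at (1, 1); beta' Phi_h is zero
```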
Causality analysis in the VECM
For two sub-vectors y = (z′, x′)′, consider the partitioned representation of the VECM
(Δz′_t, Δx′_t)′ = [[Π_11, Π_12], [Π_21, Π_22]] (z′_{t-1}, x′_{t-1})′ + Σ_{j=1}^{p-1} [[Γ_11,j, Γ_12,j], [Γ_21,j, Γ_22,j]] (Δz′_{t-j}, Δx′_{t-j})′ + u_t.
Absence of causality from x to z requires Π_12 = 0 and Γ_12,j = 0, j = 1, ..., p - 1. The former condition means that z does not adjust to the disequilibrium β′y that may include x. The latter one means that z is unaffected by lagged short-run effects in Δx.
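In practice one would inspect (or formally test) exactly these blocks of the estimated matrices; the sketch below is my own, with hypothetical estimates and hand-made partitioning (statsmodels, for instance, also offers a Wald-type Granger-causality test on fitted VECMs).

```python
import numpy as np

# Hypothetical estimated VECM matrices for y = (z', x')' with one z- and one x-variable
Pi_hat = np.array([[-0.40, 0.00],
                   [ 0.10, -0.20]])
Gamma_hat = [np.array([[0.30, 0.00],
                       [0.05, 0.25]])]       # Gamma_1 only, i.e. p - 1 = 1

nz = 1                                        # dimension of the z-block
Pi_12 = Pi_hat[:nz, nz:]                      # coefficients of x_{t-1} in the Delta z equations
Gamma_12 = [G[:nz, nz:] for G in Gamma_hat]   # lagged short-run effects of Delta x on Delta z

no_causality = np.allclose(Pi_12, 0) and all(np.allclose(G, 0) for G in Gamma_12)
print(Pi_12, Gamma_12, no_causality)          # both blocks are zero here: x does not cause z
```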