For a set of n time series variables $y_t = (y_{1t}, y_{2t}, \dots, y_{nt})'$, a VAR model of order p (VAR(p)) can be written as:

(1) $y_t = A_1 y_{t-1} + A_2 y_{t-2} + \dots + A_p y_{t-p} + u_t$

where the $A_i$'s are $(n \times n)$ coefficient matrices and $u_t = (u_{1t}, u_{2t}, \dots, u_{nt})'$ is an unobservable i.i.d. zero-mean error term.
The roots of $\det \Pi(z) = 0$, where $\Pi(z) = I - A_1 z - \dots - A_p z^p$, give the necessary information about the stationarity or nonstationarity of the process.

The necessary and sufficient condition for stability is that all characteristic roots lie outside the unit circle. Then $\Pi$ is of full rank and all variables are stationary. In this section we assume this is the case; later we allow for matrices of less than full rank (the Johansen methodology).
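As a quick numerical sketch of this condition for the VAR(1) case: stability is equivalent to all eigenvalues of the coefficient matrix lying strictly inside the unit circle (the roots of $\det(I - Az) = 0$ are $z = 1/\lambda$, so they then lie outside it). The coefficient values below are illustrative, not from the text.

```python
import numpy as np

# Hypothetical VAR(1) coefficient matrix (illustrative values only).
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])

# Stability of a VAR(1): all eigenvalues of A strictly inside the
# unit circle, i.e. all roots of det(I - A z) = 0 outside it.
eigvals = np.linalg.eigvals(A)
print(np.abs(eigvals))                # moduli of the eigenvalues
print(np.all(np.abs(eigvals) < 1))   # True -> the process is stationary
```

Here the eigenvalues are 0.6 and 0.3, both inside the unit circle, so this particular process is stable.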
Ex:

$A = \begin{pmatrix} 3 & 8 & 1 \\ 0 & 4 & 3 \\ 0 & 3 & 4 \end{pmatrix}$

$|A - \lambda I| = \begin{vmatrix} 3-\lambda & 8 & 1 \\ 0 & 4-\lambda & 3 \\ 0 & 3 & 4-\lambda \end{vmatrix} = (3-\lambda)\big[(4-\lambda)^2 - 9\big] = 0,$

so the eigenvalues are $\lambda_1 = 3$, $\lambda_2 = 7$ and $\lambda_3 = 1$.
The associated eigenvectors are those that satisfy $(A - \lambda_i I)c_i = 0$ for the three distinct eigenvalues.

The eigenvector associated with $\lambda_1 = 3$ satisfies

$\begin{pmatrix} 0 & 8 & 1 \\ 0 & 1 & 3 \\ 0 & 3 & 1 \end{pmatrix} \begin{pmatrix} c_{11} \\ c_{12} \\ c_{13} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.$

Notice that only columns 2 and 3 are linearly independent (rank = 2), so we can choose the first element of the eigenvector arbitrarily. Set $c_{11} = 1$; the other two elements are then zero:

$c_1 = (c_{11}, c_{12}, c_{13})' = (1, 0, 0)'.$
Similarly, the eigenvector associated with $\lambda_2 = 7$ satisfies

$\begin{pmatrix} -4 & 8 & 1 \\ 0 & -3 & 3 \\ 0 & 3 & -3 \end{pmatrix} \begin{pmatrix} c_{21} \\ c_{22} \\ c_{23} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.$

Notice that $rk(A - \lambda_2 I) = 2$ again, because this time the last two rows are linearly dependent. Thus only the upper-left $2 \times 2$ matrix on the LHS is nonsingular. We can delete the last row and move $c_{23}$ multiplied by the last column to the RHS; then the first two elements are expressed in terms of the last element. We can fix $c_{23}$ arbitrarily and solve for the other two: assume $c_{23} = 4$. Then $c_2 = (9, 4, 4)'$ is an eigenvector corresponding to the eigenvalue $\lambda_2 = 7$.
The third eigenvector, for $\lambda_3 = 1$, is found the same way. Collecting the eigenvectors as the columns of C gives

$C^{-1} A C = \begin{pmatrix} \lambda_1 & 0 & 0 \\ 0 & \lambda_2 & 0 \\ 0 & 0 & \lambda_3 \end{pmatrix} = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 7 & 0 \\ 0 & 0 & 1 \end{pmatrix}$

Thus, for any square matrix A with distinct eigenvalues, there is a nonsingular matrix C such that
(i) $C^{-1} A C$ is diagonal with the eigenvalues on the diagonal.
(ii) The eigenvectors corresponding to distinct eigenvalues of a symmetric matrix are orthogonal (hence linearly independent).
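The worked example can be checked numerically. The sketch below uses numpy to recover the eigenvalues 3, 7 and 1, verify the diagonalization in (i), and confirm the eigenvector $(9, 4, 4)'$ found above for $\lambda_2 = 7$.

```python
import numpy as np

# The example matrix from the text, with eigenvalues 3, 7 and 1.
A = np.array([[3., 8., 1.],
              [0., 4., 3.],
              [0., 3., 4.]])

lam, C = np.linalg.eig(A)
print(np.sort(lam.real))  # approximately [1. 3. 7.]

# C^{-1} A C is diagonal with the eigenvalues on the diagonal (result (i)).
D = np.linalg.inv(C) @ A @ C
print(np.round(D, 10))

# The text's eigenvector (9, 4, 4)' for lambda = 7, checked directly:
v = np.array([9., 4., 4.])
print(A @ v)  # equals 7 * v
```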
Each variable is expressed as a linear combination of its own lagged values and those of all other variables (plus intercepts, dummies, time trends). The dynamics of the system will depend on the properties of the A matrix.

The error term is a vector white noise process with $E(\varepsilon_t) = 0$ and

$E(\varepsilon_t \varepsilon_s') = \begin{cases} \Sigma & s = t \\ 0 & s \neq t \end{cases}$

where the covariance matrix $\Sigma$ is assumed to be positive definite: the errors are serially uncorrelated but can be contemporaneously correlated.
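A small simulation sketch of this error structure (the covariance values are assumed for illustration): draws are independent across time, so there is no serial correlation, while the off-diagonal element of $\Sigma$ produces contemporaneous correlation.

```python
import numpy as np

# Illustrative white-noise errors: serially uncorrelated across t,
# contemporaneously correlated through a positive definite Sigma
# (values assumed for the sketch).
rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.0]])
T = 100_000
u = rng.multivariate_normal(mean=np.zeros(2), cov=Sigma, size=T)

print(np.round(np.cov(u.T), 2))  # close to Sigma
# First-order serial correlation of u_1t: close to zero.
print(np.round(np.corrcoef(u[:-1, 0], u[1:, 0])[0, 1], 2))
```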
Solution to (4):
(i) Homogeneous equation: omit the error term, $y_t = b + A y_{t-1}$. The simplest solution is $y_t = y_{t-1} = \dots = \bar{y}$. Then,

(5) $\bar{y} = \Pi^{-1} b$ if $\Pi$ is nonsingular ($\Pi = I - A$)

As a solution to the homogeneous equation, try $y_t = d\lambda^t$. Substituting it in gives

$(\lambda I - A) d = 0$ --- eigenvalues

The nontrivial solution requires the determinant to be zero:

$|\lambda I - A| = 0$

This gives the eigenvalues (the $\lambda$'s).
(ii) Substitute the eigenvalues into the homogeneous system to get the corresponding eigenvectors (the $c$'s).
(iii) Adding the homogeneous solution to the nonhomogeneous (particular) solution gives the complete solution:

(6) $y_t = c_1 \lambda_1^t + c_2 \lambda_2^t + \bar{y}$
Premultiplying by $C^{-1}$ and defining $w_t = C^{-1} y_t$ gives the decoupled system

$w_t = b^* + \Lambda w_{t-1} + e_t$, where $b^* = C^{-1} b$, $\Lambda = C^{-1} A C$, $e_t = C^{-1} \varepsilon_t$,

or:

(7) $w_{1t} = b_1^* + \lambda_1 w_{1,t-1} + e_{1t}$
    $w_{2t} = b_2^* + \lambda_2 w_{2,t-1} + e_{2t}$
(iii) $\lambda_1 = 1$ and $|\lambda_2| < 1$.
Now $w_1$ is a random walk with drift, i.e. I(1), and $w_2$ is I(0). Each y is I(1), since each y is a linear combination of both w's; therefore the VAR is nonstationary.
Is there a linear combination of $y_{1t}$ and $y_{2t}$ that removes the stochastic trend and makes it I(0), i.e. are the two variables cointegrated?
Consider again

$w_t = C^{-1} y_t = \begin{pmatrix} c_{11}^* y_{1t} + c_{12}^* y_{2t} \\ c_{21}^* y_{1t} + c_{22}^* y_{2t} \end{pmatrix}$

where the $c^*$'s represent the coefficients of the $C^{-1}$ matrix. We know that $w_2$ is I(0); thus $[c_{21}^* \; c_{22}^*]$ is a cointegrating vector.

Look for a relation between the CI vector $[c_{21}^* \; c_{22}^*]$ and the matrix $\Pi$ such that $\Pi = [\,\cdot\,]\,[c_{21}^* \; c_{22}^*]$.
Reparameterize equation (3) to give:

(8) $\Delta y_t = b + \Pi y_{t-1} + \varepsilon_t$ where $\Pi = -(I - A)$.

The eigenvalues of $I - A = -\Pi$ are the complements of the eigenvalues of A: $\mu_i = 1 - \lambda_i$. Since $\lambda_1 = 1$, the eigenvalues of $\Pi$ are 0 and $-(1 - \lambda_2)$. Thus $\Pi$ is a singular matrix with rank 1. Let us decompose $\Pi$. Since $\Pi = -(I - A)$ and $A = C \Lambda C^{-1}$, we can write

$\Pi = -(I - C \Lambda C^{-1}) = -(C I C^{-1} - C \Lambda C^{-1}) = -C(I - \Lambda)C^{-1}.$
Thus:

(9) $\Pi = -C(I-\Lambda)C^{-1} = -\begin{pmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 0 & 1-\lambda_2 \end{pmatrix} \begin{pmatrix} c_{11}^* & c_{12}^* \\ c_{21}^* & c_{22}^* \end{pmatrix} = -\begin{pmatrix} c_{12}(1-\lambda_2) \\ c_{22}(1-\lambda_2) \end{pmatrix} \begin{pmatrix} c_{21}^* & c_{22}^* \end{pmatrix} = \alpha \beta'$

So $\Pi$, which has rank 1, is factorized into the product of a column vector $\alpha$ and a row vector $\beta'$, called an outer product.
-----------
Note: compare (9) to the case where $\Pi$ is full rank, with $\lambda_1 \neq 1$:

$\Pi = -C \begin{pmatrix} 1-\lambda_1 & 0 \\ 0 & 1-\lambda_2 \end{pmatrix} C^{-1} = -\begin{pmatrix} c_{11}(1-\lambda_1) & c_{12}(1-\lambda_2) \\ c_{21}(1-\lambda_1) & c_{22}(1-\lambda_2) \end{pmatrix} \begin{pmatrix} c_{11}^* & c_{12}^* \\ c_{21}^* & c_{22}^* \end{pmatrix}.$

You can see why $\Pi$ no longer collapses to the outer product of a single pair of vectors.
Combining (8) and (9) we get the vector error correction model (VECM) of the VAR:

(10) $\Delta y_{1t} = b_1 - c_{12}(1-\lambda_2)(c_{21}^* y_{1,t-1} + c_{22}^* y_{2,t-1}) + \varepsilon_{1t}$
     $\Delta y_{2t} = b_2 - c_{22}(1-\lambda_2)(c_{21}^* y_{1,t-1} + c_{22}^* y_{2,t-1}) + \varepsilon_{2t}$

$\Delta y_{1t} = b_1 - c_{12}(1-\lambda_2) w_{2,t-1} + \varepsilon_{1t}$

or:

$\Delta Y_t = \Pi Y_{t-1} + u_t$
But we cannot infer the loading matrix $\alpha$ and the cointegrating matrix $\beta$ separately from this. To find $\alpha$ and $\beta$ separately, we need to calculate the eigenvector matrix C for the example matrix $A = \begin{pmatrix} 1.2 & -0.2 \\ 0.6 & 0.4 \end{pmatrix}$, whose eigenvalues are $\lambda_1 = 1$ and $\lambda_2 = 0.6$.

Eigenvector corresponding to $\lambda_1 = 1$:

$\begin{pmatrix} 1.2-1 & -0.2 \\ 0.6 & 0.4-1 \end{pmatrix} \begin{pmatrix} c_{11} \\ c_{12} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \Rightarrow \begin{pmatrix} 0.2 & -0.2 \\ 0.6 & -0.6 \end{pmatrix} \begin{pmatrix} c_{11} \\ c_{12} \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$: there is linear dependency.

So set $c_{12} = 1 \Rightarrow c_1 = (1 \; 1)'$.
The eigenvector for $\lambda_2 = 0.6$ is found the same way: $c_2 = (1 \; 3)'$, so $C = \begin{pmatrix} 1 & 1 \\ 1 & 3 \end{pmatrix}$ and $C^{-1} = \begin{pmatrix} 1.5 & -0.5 \\ -0.5 & 0.5 \end{pmatrix}$. Then

$\Delta Y_t = \Pi Y_{t-1} + u_t = C \begin{pmatrix} 0 & 0 \\ 0 & -(1-\lambda_2) \end{pmatrix} C^{-1} Y_{t-1} + u_t$

$\Delta Y_t = \begin{pmatrix} -0.4 \\ -1.2 \end{pmatrix} \begin{pmatrix} -0.5 & 0.5 \end{pmatrix} \begin{pmatrix} y_{1,t-1} \\ y_{2,t-1} \end{pmatrix} + u_t$

This is the same expression as in (12), but now we have both the loading and the cointegrating matrices:

$\alpha = \begin{pmatrix} -0.4 \\ -1.2 \end{pmatrix}$ and $\beta' = \begin{pmatrix} -0.5 & 0.5 \end{pmatrix}$
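The numerical example can be verified mechanically. The sketch below assumes the sign pattern that makes the algebra consistent ($A = [[1.2, -0.2], [0.6, 0.4]]$, $\alpha = (-0.4, -1.2)'$, $\beta' = (-0.5, 0.5)$; note $\alpha$ and $\beta$ are only jointly identified, so flipping both signs gives the same $\Pi$).

```python
import numpy as np

# The numerical example: A has eigenvalues 1 and 0.6, so Pi = A - I
# is singular with rank 1.
A = np.array([[1.2, -0.2],
              [0.6,  0.4]])
Pi = A - np.eye(2)

print(np.sort(np.linalg.eigvals(A).real))  # approximately [0.6, 1.0]
print(np.linalg.matrix_rank(Pi))           # 1

# Factor Pi into the loading vector alpha and cointegrating vector beta'.
alpha = np.array([[-0.4], [-1.2]])
beta_T = np.array([[-0.5, 0.5]])
print(np.allclose(Pi, alpha @ beta_T))     # True
```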
Set the error term to zero and examine the properties of the system. We still have the LR solution (or the particular solution) as in (5),

$\bar{y} = \Pi^{-1} b$, but now $\Pi = I - A_1 - A_2$.

If all eigenvalues have modulus < 1, then $\Pi$ is nonsingular and the solution

$y_t = \sum_{i=1}^{2k} c_i \lambda_i^t + \bar{y}$

will converge to $\bar{y}$ as t grows (2k being the number of eigenvalues of a k-dimensional VAR(2)). The analysis w.r.t. the modulus of the roots (<1, =1, >1) is the same as in the VAR(1) case.
If the process is stationary, then we can invert the VAR model and express y as a function of present and past shocks plus the exogenous (deterministic) components = Impulse Responses.

Ex: Calculate the roots of a 2-dimensional VAR(2), n = p = 2, and find the effect of a shock on a dependent variable (Juselius Ch. 3):

The characteristic function of $\Pi(z) = I - A_1 z - A_2 z^2$, where $z = \lambda^{-1}$, is

$\Pi(z) = I - \begin{pmatrix} a_{1,11} & a_{1,12} \\ a_{1,21} & a_{1,22} \end{pmatrix} z - \begin{pmatrix} a_{2,11} & a_{2,12} \\ a_{2,21} & a_{2,22} \end{pmatrix} z^2$

Therefore

$|\Pi(z)| = (1 - a_{1,11}z - a_{2,11}z^2)(1 - a_{1,22}z - a_{2,22}z^2) - (a_{1,12}z + a_{2,12}z^2)(a_{1,21}z + a_{2,21}z^2)$

and the inverted (MA) representation for each variable takes the form

$y_{it} = \dfrac{a_{i1}(L)\varepsilon_{1t} + a_{i2}(L)\varepsilon_{2t}}{|\Pi(L)|}$ for t = 1, ..., T,

where

$|\Pi(z)| = (1-\lambda_1 z)(1-\lambda_2 z)(1-\lambda_3 z)(1-\lambda_4 z).$
We are assuming that all roots have modulus less than 1. The characteristic roots give information about the dynamic behavior of the process. To see how a shock is propagated, expand the last component:

$(1-\lambda_1 L)^{-1}\varepsilon_{jt} = (1 + \lambda_1 L + \lambda_1^2 L^2 + \dots)\varepsilon_{jt}.$

You will have to do the same thing with each root. Thus, each shock will affect current and future values of $y_i$.

The persistence of a shock depends on the magnitude of the roots: the larger they are, the more persistent the shocks will be.
- If the roots $\lambda_i$ are real and less than 1 in modulus, the shock will die out exponentially.
- If one or more roots $\lambda_i$ are complex, a shock will be cyclical but exponentially declining.
- If one or more roots $\lambda_i$ lie on the unit circle, the shock will be permanent and $y_i$ will show nonstationary behavior. The VAR is then not invertible, and we need to look into the VECM.
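The three cases in the bullets above can be illustrated directly from the geometric expansion: the response of a single root to a unit shock at horizon h is $\lambda^h$. The root values below are chosen for illustration.

```python
import numpy as np

# Response of one root lambda to a unit shock: lambda^h at horizon h,
# i.e. the coefficients of (1 + lambda*L + lambda^2*L^2 + ...).
def impulse_response(lam, horizons=10):
    return np.array([lam**h for h in range(horizons)])

# Real root with modulus < 1: monotone exponential decay.
print(np.round(impulse_response(0.5).real, 4))

# Complex root with modulus < 1: cyclical but exponentially declining.
lam_c = 0.9 * np.exp(1j * 0.5)
print(np.round(impulse_response(lam_c).real, 3))

# Root on the unit circle: the shock never dies out (permanent).
print(impulse_response(1.0))  # all ones
```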
We can also calculate the roots by reformulating the VAR(p) in the companion-matrix VAR(1) form and solving for the eigenvalues:

$y_t = A_1 y_{t-1} + A_2 y_{t-2} + u_t$

In matrix form:

$\begin{pmatrix} y_t \\ y_{t-1} \end{pmatrix} = \begin{pmatrix} A_1 & A_2 \\ I & 0 \end{pmatrix} \begin{pmatrix} y_{t-1} \\ y_{t-2} \end{pmatrix} + \begin{pmatrix} u_t \\ 0 \end{pmatrix}$

Calculate the eigenvalues $\lambda_i$ from the companion coefficient matrix:

$\left| \begin{pmatrix} A_1 & A_2 \\ I & 0 \end{pmatrix} - \lambda \begin{pmatrix} I & 0 \\ 0 & I \end{pmatrix} \right| = \left| \begin{matrix} A_1 - \lambda I & A_2 \\ I & -\lambda I \end{matrix} \right| = |\lambda^2 I - \lambda A_1 - A_2| = 0$

In the univariate case this factors as $(\lambda - \lambda_1)(\lambda - \lambda_2) = 0$.

Now we get the roots $\lambda$ directly, instead of the z's (the inverses of the roots) obtained by solving the characteristic polynomial. Johansen and Juselius refer to the $\lambda$'s as eigenvalue roots and to the z's as characteristic roots.

In the univariate AR(2) case the companion matrix has two roots; for an n-dimensional VAR(2) it has 2n. If the roots of the characteristic polynomial are outside the unit circle, then the eigenvalues of the companion matrix are inside the unit circle and the system is stable.
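The companion-form calculation can be sketched numerically. The coefficient matrices below are illustrative (not from the text); the check confirms that the 2n companion eigenvalues sit inside the unit circle while their inverses, the characteristic roots z, sit outside it.

```python
import numpy as np

# Hypothetical bivariate VAR(2) coefficient matrices (values assumed).
A1 = np.array([[0.5, 0.1],
               [0.2, 0.3]])
A2 = np.array([[0.2, 0.0],
               [0.0, 0.1]])
n = 2

# Stack into the (2n x 2n) companion matrix [[A1, A2], [I, 0]].
companion = np.block([[A1, A2],
                      [np.eye(n), np.zeros((n, n))]])

lam = np.linalg.eigvals(companion)
print(np.abs(lam))                  # moduli of the 2n = 4 eigenvalues
print(np.all(np.abs(lam) < 1))      # True -> stable (stationary) VAR

# Equivalently, the roots z of det(I - A1 z - A2 z^2) = 0 are 1/lambda
# and lie outside the unit circle.
print(np.all(np.abs(1 / lam) > 1))  # True
```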
To recap:
- The solution to $|I - zA| = 0$ gives the stationary roots (characteristic roots) outside the unit circle.
- The solution to $|A - \lambda I| = 0$ gives the stationary roots (eigenvalues) inside the unit circle.
- If the roots of $\Pi(z)$ are all outside the unit circle, or the eigenvalues of the companion matrix are all inside the unit circle, the process is stationary.
- If one or more of the roots of $\Pi(z)$, or of the eigenvalues of the companion matrix, lie on the unit circle, the process is nonstationary.
- If one or more roots of $\Pi(z)$ are inside the unit circle, or one or more eigenvalues of the companion matrix are outside the unit circle, the process is explosive.