Assessment 3
Instructions: the assessment contains five questions. Questions one, two and three must be answered,
but you can choose between answering question four or question five. Good luck!
Solution. Write the ARMA as $\phi(B)X_t = \theta(B)W_t$. The process $\{X_t\}$ is causal if and only if
$\phi(z) \neq 0$ for each $|z| \leq 1$, and invertible if and only if $\theta(z) \neq 0$ for each $|z| \leq 1$.
(a) $\phi(z) = 1 + 0.2z - 0.48z^2 = 0$ is solved by $z_1 = 5/3$ and $z_2 = -5/4$, both outside the unit circle. Hence $\{X_t\}$ is causal.
$\theta(z) = 1$ has no zeros. Hence $\{X_t\}$ is invertible.
(b) $\phi(z) = 1 + 1.9z + 0.88z^2 = 0$ is solved by $z_1 = -10/11$ and $z_2 = -5/4$. Since $|z_1| < 1$, $\{X_t\}$ is not
causal. $\theta(z) = 1 + 0.2z + 0.7z^2 = 0$ is solved by $z_1 = (-1 - \sqrt{69}\,i)/7$ and $z_2 = (-1 + \sqrt{69}\,i)/7$.
Since $|z_1| = |z_2| = \sqrt{70}/7 > 1$, $\{X_t\}$ is invertible.
(c) $\phi(z) = 1 + 1.6z = 0$ is solved by $z = -5/8$. Hence $\{X_t\}$ is not causal. $\theta(z) = 1 - 0.4z + 0.04z^2 = 0$ is solved by $z_1 = z_2 = 5$. Hence $\{X_t\}$ is invertible.
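These root computations can be verified numerically. A minimal sketch (not part of the assessment; `numpy` assumed available) checks whether a polynomial has any root inside or on the unit circle:

```python
import numpy as np

def unit_root_free(coeffs):
    """True if c0 + c1*z + c2*z^2 + ... has no root with |z| <= 1
    (the causality/invertibility criterion)."""
    # np.roots expects coefficients ordered from highest degree to lowest
    roots = np.roots(list(coeffs)[::-1])
    return bool(np.all(np.abs(roots) > 1))

# (a) phi(z) = 1 + 0.2z - 0.48z^2: roots 5/3 and -5/4 -> causal
print(unit_root_free([1.0, 0.2, -0.48]))   # True
# (b) phi(z) = 1 + 1.9z + 0.88z^2: root -10/11 inside the unit circle -> not causal
print(unit_root_free([1.0, 1.9, 0.88]))    # False
# (b) theta(z) = 1 + 0.2z + 0.7z^2: complex roots of modulus sqrt(70)/7 -> invertible
print(unit_root_free([1.0, 0.2, 0.7]))     # True
# (c) phi(z) = 1 + 1.6z: root -5/8 -> not causal
print(unit_root_free([1.0, 1.6]))          # False
```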
2. Forecast (7pts)
i. Compute the asymptotic distribution of the estimator $\hat\phi = (\hat\phi_1, \hat\phi_2)^\top$ of an AR(2) process for $\sigma^2 = 1$, $\phi_1 = 1.5$ and $\phi_2 = -0.75$; and
the corresponding confidence intervals. For the latter use $\alpha = 0.05$ and $n = 10^3$.
ii. What is the consequence of fitting an AR(2) model to an AR(1) process in terms of
the variance of $\hat\phi_1$?
Solution. i. Multiplying $X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + W_t$ by $X_{t-h}$, $h = 0, 1, 2$, and taking expectations yields the Yule--Walker equations
\[
\gamma(0) = \phi_1\gamma(1) + \phi_2\gamma(2) + \sigma^2, \qquad
\gamma(1) = \phi_1\gamma(0) + \phi_2\gamma(1), \qquad
\gamma(2) = \phi_1\gamma(1) + \phi_2\gamma(0),
\]
with solution:
\[
\gamma(0) = \frac{\sigma^2(1-\phi_2)}{\big((1-\phi_2)^2 - \phi_1^2\big)(\phi_2+1)}, \qquad
\gamma(1) = \frac{\sigma^2\phi_1}{\big((1-\phi_2)^2 - \phi_1^2\big)(\phi_2+1)},
\]
\[
\gamma(2) = \frac{\sigma^2(\phi_1^2 - \phi_2^2 + \phi_2)}{\big((1-\phi_2)^2 - \phi_1^2\big)(\phi_2+1)},
\]
hence
\[
\Gamma_2 = \begin{pmatrix} \gamma(0) & \gamma(1) \\ \gamma(1) & \gamma(0) \end{pmatrix},
\qquad
\sigma^2\,\Gamma_2^{-1} = \begin{pmatrix} 1-\phi_2^2 & -\phi_1(1+\phi_2) \\ -\phi_1(1+\phi_2) & 1-\phi_2^2 \end{pmatrix},
\]
leading to the expression for the asymptotic covariance of $\hat\phi$,
\[
\frac{\sigma^2}{n}\,\Gamma_2^{-1}
= \frac{1}{n}\begin{pmatrix} 1-\phi_2^2 & -\phi_1(1+\phi_2) \\ -\phi_1(1+\phi_2) & 1-\phi_2^2 \end{pmatrix}
= \frac{1}{n}\begin{pmatrix} 0.4375 & -0.375 \\ -0.375 & 0.4375 \end{pmatrix}.
\]
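For $\phi_1 = 1.5$, $\phi_2 = -0.75$, $\sigma^2 = 1$, the closed-form inverse can be cross-checked numerically. A verification sketch (not part of the original solution):

```python
import numpy as np

phi1, phi2, sigma2 = 1.5, -0.75, 1.0

# ACVF of the AR(2) process from the closed forms above
D = (1 + phi2) * ((1 - phi2) ** 2 - phi1 ** 2)
g0 = sigma2 * (1 - phi2) / D
g1 = sigma2 * phi1 / D

Gamma2 = np.array([[g0, g1], [g1, g0]])
V = sigma2 * np.linalg.inv(Gamma2)  # asymptotic covariance (times n)

print(np.round(V, 4))
# expected entries: 1 - phi2^2 = 0.4375 on the diagonal,
# -phi1 * (1 + phi2) = -0.375 off the diagonal
```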
ii. Suppose $\{X_t\}$ is an AR(1) process and the sample size $n$ is large. If we estimate $\phi_1$,
we have
\[
\mathbb{V}[\hat\phi_1] = \frac{1 - \phi_1^2}{n}.
\]
If we fit an AR(2) to this AR(1) process, then since the true $\phi_2 = 0$,
\[
\mathbb{V}[\hat\phi_1] = \frac{1 - \phi_2^2}{n} = \frac{1}{n} > \frac{1 - \phi_1^2}{n},
\]
that is, the variance of the estimator of $\phi_1$ is larger under the AR(2) fit. For the
confidence intervals requested in part i we obtain
\[
\hat\phi_1 \pm z_{1-\alpha/2}\, n^{-1/2} \big[\sigma^2\Gamma_2^{-1}\big]_{11}^{1/2}
= \hat\phi_1 \pm 1.96\,\frac{0.661438}{10\sqrt{10}} = \hat\phi_1 \pm 0.0409963,
\]
\[
\hat\phi_2 \pm z_{1-\alpha/2}\, n^{-1/2} \big[\sigma^2\Gamma_2^{-1}\big]_{22}^{1/2}
= \hat\phi_2 \pm 1.96\,\frac{0.661438}{10\sqrt{10}} = \hat\phi_2 \pm 0.0409963.
\]
3. (a) Consider the MA(1) process
\[
X_t = \theta W_{t-1} + W_t, \qquad W_t \sim WN(0, \sigma^2).
\]
Use the projection theorem to find the value at lag 2 of the corresponding PACF, i.e. find
$\phi_{2,2}$ in $\hat{X}_{3|2} = \phi_{2,1} X_2 + \phi_{2,2} X_1$.
(b) Consider the stationary AR(1) process
\[
X_t = \phi X_{t-1} + W_t, \qquad W_t \sim WN(0, \sigma^2).
\]
Assume you observed $X_1$ and $X_3$, and you would like to estimate the missing value $X_2$. Find
the best linear predictor of $X_2$ given $X_1$ and $X_3$.
Solution. (a) By the projection theorem, the prediction error is orthogonal to the predictors:
\[
\mathrm{Cov}(X_3 - \hat{X}_{3|2}, X_i) = 0, \qquad i = 1, 2.
\]
Hence
\[
\mathrm{Cov}(X_3 - \phi_{2,1}X_2 - \phi_{2,2}X_1, X_1) = \gamma(2) - \phi_{2,1}\gamma(1) - \phi_{2,2}\gamma(0) = 0,
\]
and
\[
\mathrm{Cov}(X_3 - \phi_{2,1}X_2 - \phi_{2,2}X_1, X_2) = \gamma(1) - \phi_{2,1}\gamma(0) - \phi_{2,2}\gamma(1) = 0.
\]
Thus, since $\gamma(2) = 0$ for the MA(1) process, we have to solve the equations
\[
\phi_{2,1}\gamma(1) + \phi_{2,2}\gamma(0) = 0,
\]
\[
(1 - \phi_{2,2})\gamma(1) - \phi_{2,1}\gamma(0) = 0.
\]
Solving this system of equations with $\gamma(0) = (1+\theta^2)\sigma^2$ and $\gamma(1) = \theta\sigma^2$, we find
\[
\phi_{2,2} = \frac{-\theta^2}{\theta^4 + \theta^2 + 1}.
\]
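The closed form for $\phi_{2,2}$ can be checked by solving the two orthogonality equations numerically. A sketch with an example value of $\theta$ (chosen here for illustration; the problem leaves $\theta$ generic):

```python
import numpy as np

theta, sigma2 = 0.5, 1.0   # illustrative values, not from the problem

# MA(1) autocovariances
g0 = (1 + theta ** 2) * sigma2   # gamma(0)
g1 = theta * sigma2              # gamma(1); gamma(2) = 0

# Orthogonality equations for the projection coefficients (phi_21, phi_22):
#   phi_21*g0 + phi_22*g1 = g1   (error orthogonal to X_2)
#   phi_21*g1 + phi_22*g0 = 0    (error orthogonal to X_1, using gamma(2) = 0)
A = np.array([[g0, g1], [g1, g0]])
b = np.array([g1, 0.0])
phi21, phi22 = np.linalg.solve(A, b)

closed_form = -theta ** 2 / (theta ** 4 + theta ** 2 + 1)
print(np.isclose(phi22, closed_form))   # True
```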
(b) We are interested in $\hat{X}_{2|1,3} = \phi_{2,1}X_1 + \phi_{2,3}X_3$. Following the same idea as in part (a), it
follows that
\[
\begin{pmatrix} 1 & \phi^2 \\ \phi^2 & 1 \end{pmatrix}
\begin{pmatrix} \phi_{2,1} \\ \phi_{2,3} \end{pmatrix}
= \begin{pmatrix} \phi \\ \phi \end{pmatrix}
\quad\Longrightarrow\quad
\phi_{2,1} = \phi_{2,3} = \frac{\phi}{1+\phi^2},
\]
hence:
\[
\hat{X}_{2|1,3} = \frac{\phi}{1+\phi^2}\,(X_1 + X_3).
\]
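As a numerical sanity check of the predictor coefficients, one can solve the normal equations for an illustrative value of $\phi$ (not specified in the problem):

```python
import numpy as np

phi = 0.6   # example AR(1) coefficient with |phi| < 1

# Normal equations, divided through by gamma(0):
A = np.array([[1.0, phi ** 2], [phi ** 2, 1.0]])   # correlations of (X_1, X_3)
b = np.array([phi, phi])                            # correlations of X_2 with X_1, X_3
a1, a3 = np.linalg.solve(A, b)

# both coefficients should equal phi / (1 + phi^2)
print(np.isclose(a1, phi / (1 + phi ** 2)), np.isclose(a3, a1))   # True True
```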
Solution. (a) Following the results from question 2, one arrives at the difference equation
\[
\gamma_h = \phi_2\gamma_{h-2} + \phi_1\gamma_{h-1}, \qquad \text{with} \quad
\gamma_0 = \frac{\sigma^2(1-\phi_2)}{\big((1-\phi_2)^2 - \phi_1^2\big)(\phi_2+1)}, \quad
\gamma_1 = \frac{\sigma^2\phi_1}{\big((1-\phi_2)^2 - \phi_1^2\big)(\phi_2+1)}.
\]
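The difference equation can be iterated and compared against the closed-form $\gamma(2)$ from question 2. A quick check with the question 2 parameter values:

```python
phi1, phi2, sigma2 = 1.5, -0.75, 1.0   # values from question 2

# starting values gamma_0, gamma_1 from the closed forms above
D = (1 + phi2) * ((1 - phi2) ** 2 - phi1 ** 2)
g0 = sigma2 * (1 - phi2) / D
g1 = sigma2 * phi1 / D

# iterate gamma_h = phi2*gamma_{h-2} + phi1*gamma_{h-1}
gammas = [g0, g1]
for h in range(2, 6):
    gammas.append(phi2 * gammas[-2] + phi1 * gammas[-1])

# closed form for gamma(2) from question 2
g2 = sigma2 * (phi1 ** 2 - phi2 ** 2 + phi2) / D
print(abs(gammas[2] - g2) < 1e-12)   # True
```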
(b) Note that
\[
\sqrt{n}\,(\hat\phi_p - \phi_p) \Rightarrow N(0, \sigma^2\Gamma_p^{-1}), \qquad \text{as } n \to \infty,
\]
and that with probability $1-\alpha$, the $j$-th element of $\phi_p$, $\phi_{p,j}$, is in the interval
\[
\hat\phi_{p,j} \pm z_{1-\alpha/2}\, n^{-1/2} \big[\hat\sigma^2\hat\Gamma_p^{-1}\big]_{jj}^{1/2},
\]
where $z_{1-\alpha/2}$ is the $1-\alpha/2$ quantile of the standard normal. Moreover, notice that
\[
\mathbb{V}\big\{\Gamma_p^{1/2}(\hat\phi_p - \phi_p)\big\}
= \Gamma_p^{1/2}\,\mathbb{V}\big\{\hat\phi_p\big\}\,\Gamma_p^{1/2}
= \frac{\sigma^2}{n}\, I.
\]
Thus,
\[
v = \frac{\sqrt{n}}{\sigma}\,\Gamma_p^{1/2}(\hat\phi_p - \phi_p) \Rightarrow N(0, I)
\quad\Longrightarrow\quad
v^\top v \Rightarrow \chi^2(p).
\]
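The standardization step $\Gamma_p^{1/2}\,\mathbb{V}\{\hat\phi_p\}\,\Gamma_p^{1/2} = (\sigma^2/n)\,I$ can be verified for any concrete positive definite $\Gamma_p$. A sketch with hypothetical values:

```python
import numpy as np

# a concrete symmetric positive definite Gamma_p (hypothetical values)
Gamma = np.array([[2.0, 0.5], [0.5, 1.0]])
sigma2, n = 1.0, 500

V = (sigma2 / n) * np.linalg.inv(Gamma)   # asymptotic covariance of phi_hat

# symmetric square root of Gamma via eigendecomposition
w, Q = np.linalg.eigh(Gamma)
S = Q @ np.diag(np.sqrt(w)) @ Q.T         # S @ S == Gamma

standardized = S @ V @ S                  # should equal (sigma2/n) * I
print(np.allclose(standardized, (sigma2 / n) * np.eye(2)))   # True
```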