is stationary. [2]
Consider this process and find a general expression for the likelihood function, written in the form:
L = (2πσ²)^(-(n-1)/2) exp( -(1/(2σ²)) Σ(i=2..n) (xi - a xi-1)² )
in terms of the constants a and σ². [3]
(ii) Show that the maximum likelihood estimate of a can also be regarded as a least squares estimate. [2]
(iii) Find the maximum likelihood estimates of a and σ². [4]
(iv) Derive the Yule-Walker equations for the model and hence derive estimates of a and σ² based on observed values of the autocovariance function. [5]
(v) Comment on the difference between the estimates of a in parts (iii) and (iv). [1]
[Total 15]
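For part (iv), assuming the underlying model is the AR(1) process Xt = a Xt-1 + et (consistent with parts (ii) and (iii)), the Yule-Walker equations can be sketched as follows:

```latex
% Multiply X_t = a X_{t-1} + e_t by X_{t-k} and take expectations:
\begin{align*}
\gamma_0 &= a\gamma_1 + \sigma^2, \\
\gamma_k &= a\gamma_{k-1}, \qquad k \ge 1.
\end{align*}
% Replacing each \gamma_k by the sample autocovariance \hat\gamma_k gives
\begin{align*}
\hat a = \frac{\hat\gamma_1}{\hat\gamma_0}, \qquad
\hat\sigma^2 = \hat\gamma_0 - \hat a\,\hat\gamma_1.
\end{align*}
```

These moment-based estimates are what part (v) asks you to compare with the maximum likelihood estimates.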
(i) State the three main stages in the Box-Jenkins approach to fitting an ARIMA
time series model. [3]
(ii) Explain, with reasons, which ARIMA time series would fit the observed data
in the charts below. [2]
Now consider the time series model given by
Xt = a1 Xt-1 + a2 Xt-2 + β1 et-1 + et
where et is a white noise process with variance σ².
(iii) Derive the Yule-Walker equations for this model. [6]
(iv) Explain whether the partial auto-correlation function for this model can ever give a zero value. [2]
[Total 13]
A sequence of 100 observations was made from a time series and the following values of the sample auto-covariance function (SACF) were observed:
Lag SACF
1 0.68
2 0.55
3 0.30
4 0.06
The sample mean and variance of the same observations are 1.35 and 0.9 respectively.
(i) Calculate the first two values of the partial autocorrelation function, φ1 and φ2. [1]
(ii) Estimate the parameters (including σ²) of the following models, which are to be fitted to the observed data and can be assumed to be stationary.
(a) Yt = a0 + a1 Yt-1 + et
(b) Yt = a0 + a1 Yt-1 + a2 Yt-2 + et
In each case et is a white noise process with variance σ². [12]
(iii) Explain whether the assumption of stationarity is necessary for the estimation for each of the models in part (ii). [2]
(iv) Explain whether each of the models in part (ii) satisfies the Markov property. [2]
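Parts (i) and (ii) can be checked numerically. A minimal sketch, assuming the method-of-moments (Yule-Walker) estimators; the variable names are ours:

```python
# Sample variance (gamma_0) followed by the SACF values at lags 1-4.
gamma = [0.9, 0.68, 0.55, 0.30, 0.06]
mean = 1.35
rho = [g / gamma[0] for g in gamma]          # sample autocorrelations

# Part (i): first two partial autocorrelations.
phi1 = rho[1]
phi2 = (rho[2] - rho[1] ** 2) / (1 - rho[1] ** 2)

# Part (ii)(a): AR(1)  Y_t = a0 + a1 Y_{t-1} + e_t.
a1 = rho[1]
a0 = mean * (1 - a1)
sigma2_ar1 = gamma[0] * (1 - a1 ** 2)

# Part (ii)(b): AR(2)  Y_t = b0 + b1 Y_{t-1} + b2 Y_{t-2} + e_t.
# Solve the 2x2 Yule-Walker system: rho1 = b1 + b2*rho1, rho2 = b1*rho1 + b2.
det = 1 - rho[1] ** 2
b1 = rho[1] * (1 - rho[2]) / det
b2 = (rho[2] - rho[1] ** 2) / det
b0 = mean * (1 - b1 - b2)
sigma2_ar2 = gamma[0] * (1 - b1 * rho[1] - b2 * rho[2])

print(phi1, phi2)              # ~0.756 and ~0.094
print(a1, a0, sigma2_ar1)
print(b1, b2, b0, sigma2_ar2)
```

Note that the lag-2 partial autocorrelation φ2 coincides with the AR(2) coefficient b2, which is the link part (i) is testing.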
(i) List the main steps in the Box-Jenkins approach to fitting an ARIMA time series to observed data. [3]
Observations x1, x2, …, x200 are made from a stationary time series and the following summary statistics are calculated:
Σ(i=1..200) xi = 83.7
Σ(i=1..200) (xi - x̄)² = 35.4
Σ(i=2..200) (xi - x̄)(xi-1 - x̄) = 28.4
Σ(i=3..200) (xi - x̄)(xi-2 - x̄) = 17.1
(ii) Calculate the values of the sample auto-covariances γ̂0, γ̂1 and γ̂2. [3]
(iii) Calculate the first two values of the partial autocorrelation function, φ1 and φ2. [3]
The following model is proposed to fit the observed data:
Xt - μ = a1(Xt-1 - μ) + et
where et is a white noise process with variance σ².
(iv) Estimate the parameters μ, a1 and σ² in the proposed model. [5]
After fitting the model in part (iv) the 200 observed residual values êt were calculated. The number of turning points in the residual series was 110.
(v) Carry out a statistical test at the 5% significance level to test the hypothesis that êt is generated from a white noise process. [4]
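A sketch of the turning-point test for part (v), assuming the usual normal approximation with E[T] = 2(n-2)/3 and Var[T] = (16n-29)/90 under the white-noise hypothesis:

```python
import math

# Turning-point test of the residual series against white noise.
n, T = 200, 110                      # sample size and observed turning points

mean_T = 2 * (n - 2) / 3             # expected turning points: 132
var_T = (16 * n - 29) / 90           # variance under the null: ~35.23
z = (T - mean_T) / math.sqrt(var_T)  # approximate N(0,1) test statistic

print(mean_T, var_T, z)
# |z| is about 3.71 > 1.96, so reject the white-noise hypothesis at the 5% level.
```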
The following time series model is used for the monthly inflation rate (Yt) in a particular country:
Yt = 0.4Yt-1 + 0.2Yt-2 + Zt + 0.025
where {Zt} is a sequence of uncorrelated, identically distributed random variables whose distributions are normal with mean zero.
(i) Derive the values of p, d and q, when this model is considered as an
ARIMA(p, d, q) model. [3]
(ii) Determine whether {Yt} is a stationary process. [2]
(iii) Assuming an infinite history, calculate the expected value of the rate of inflation over this period. [1]
(iv) Calculate the autocorrelation function of {Yt}. [5]
(v) Explain how the equivalent infinite-order moving average representation of
{Yt} may be derived. [2]
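Parts (iii) and (iv) can be checked numerically. A minimal sketch, assuming the standard AR(2) results: stationary mean μ = c/(1 - a1 - a2) and the Yule-Walker recursion ρk = a1 ρk-1 + a2 ρk-2 for k ≥ 2, with ρ1 = a1/(1 - a2):

```python
# AR(2) inflation model: Y_t = 0.4 Y_{t-1} + 0.2 Y_{t-2} + Z_t + 0.025.
a1, a2, c = 0.4, 0.2, 0.025

# Part (iii): stationary mean of the inflation rate.
mu = c / (1 - a1 - a2)               # = 0.0625

# Part (iv): autocorrelations from the Yule-Walker recursion.
rho = [1.0, a1 / (1 - a2)]           # rho_0 = 1, rho_1 = 0.5
for k in range(2, 6):
    rho.append(a1 * rho[k - 1] + a2 * rho[k - 2])

print(mu)
print(rho[:3])                       # [1.0, 0.5, 0.4]
```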
(i) Derive the autocovariance and autocorrelation functions of the AR(1) process
Xt = αXt-1 + et
where |α| < 1 and the et form a white noise process. [4]
(ii) The time series Zt is believed to follow an ARIMA(1, d, 0) process for some value of d. The time series Zt(k) is obtained by differencing k times, and the sample autocorrelations, {ri : i = 1, …, 10}, are shown in the table below for various values of k.
k = 0 k = 1 k = 2 k = 3 k = 4 k = 5
r1 100% 100% 83% 3% 45% 64%
r2 100% 100% 66% 12% 5% 13%
r3 100% 100% 54% 11% 4% 3%
r4 100% 99% 45% 1% 6% 4%
r5 100% 99% 37% 3% 4% 5%
r6 100% 99% 30% 12% 12% 12%
r7 99% 98% 27% 3% 7% 9%
r8 99% 98% 24% 3% 0% 4%
r9 99% 97% 19% 3% 5% 6%
r10 99% 97% 13% 7% 5% 4%
Suggest, with reasons, appropriate values for d and the parameter α in the underlying AR(1) process. [4]
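A rough numerical check of the k = 2 column above (a sketch; reading α̂ from r1 and testing for the geometric decay rk ≈ α^k of an AR(1) ACF is our choice of method):

```python
# Sample ACF after differencing twice (the k = 2 column of the table).
r = [0.83, 0.66, 0.54, 0.45, 0.37, 0.30, 0.27, 0.24, 0.19, 0.13]

# For a stationary AR(1) with parameter alpha, rho_k = alpha**k,
# so a simple moment estimate is alpha = r_1.
alpha = r[0]

# Compare the observed ACF with the implied geometric decay alpha**k.
errors = [abs(r[k] - alpha ** (k + 1)) for k in range(len(r))]
print(alpha, max(errors))
```

The k = 0 and k = 1 columns stay near 100% (non-stationary), while the k = 2 column decays roughly geometrically, pointing to d = 2 with α around 0.8.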
State the Markov property and explain briefly whether the following processes are Markov:
AR(4);
ARMA(1, 1).
A modeller has attempted to fit an ARMA(p, q) model to a set of data using the Box-Jenkins methodology. The plot of residuals based on this proposed fit is shown below.
(i) Under the assumptions of the model, the residuals should form a white noise
process.
(a) By inspection of the chart, suggest two reasons to suspect that the
residuals do not form a white noise process.
(b) Define what is meant by a turning point.
(c) Perform a significance test on the number of turning points in the data
above. (There are 100 points in the data and 59 turning points.)
[6]
(ii) Following your suggestions, the original fitted model is discarded and replaced by the reparameterised model:
Xn+2 = 5 + 0.9(Xn+1 - 5) + en+2 + 0.5en.
Given the following observations:
X99 = 2, X100 = 7, ê99 = 0.7, ê100 = 1.4
use the Box-Jenkins methodology to calculate the forward estimates X̂100(1), X̂100(2) and X̂100(3). [4]
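The three forecasts can be computed recursively by replacing future white-noise terms with their expectation, zero (a sketch; note that the MA term in the model Xn+2 = 5 + 0.9(Xn+1 - 5) + en+2 + 0.5en acts at lag 2):

```python
# Forward estimates for X_{n+2} = 5 + 0.9(X_{n+1} - 5) + e_{n+2} + 0.5 e_n.
x100, e99, e100 = 7.0, 0.7, 1.4

x101 = 5 + 0.9 * (x100 - 5) + 0.5 * e99    # e_101 -> 0, e_99 observed
x102 = 5 + 0.9 * (x101 - 5) + 0.5 * e100   # e_102 -> 0, e_100 observed
x103 = 5 + 0.9 * (x102 - 5)                # all white-noise terms unknown -> 0

# Forecasts: 7.15, 7.635 and 7.3715.
print(x101, x102, x103)
```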
The time series Xt is assumed to be stationary and to follow an ARMA(2, 1) process defined by:
Xt = 1 + (8/15)Xt-1 - (1/15)Xt-2 + Zt - (1/7)Zt-1
where Zt are independent N(0, 1) random variables.
(i) Determine the roots of the characteristic polynomial, and explain how their
values relate to the stationarity of the process. [2]
(ii) (a) Find the autocorrelation function for lags 0, 1 and 2.
(b) Derive the autocorrelation at lag k in the form
ρk = A/c^k + B/d^k. [12]
(iii) Determine the mean and variance of Xt. [3]
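For part (i), taking the AR part of the model as 1 - (8/15)z + (1/15)z² (our reading of the equation above), the roots of the characteristic polynomial can be verified numerically:

```python
import math

# Characteristic equation of the AR part: 1 - (8/15) z + (1/15) z**2 = 0,
# i.e. z**2 - 8 z + 15 = 0 after multiplying through by 15.
a, b, c = 1.0, -8.0, 15.0
disc = math.sqrt(b * b - 4 * a * c)
roots = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])

# Both roots exceed 1 in modulus, so the process is stationary.
print(roots)   # [3.0, 5.0]
```

With these roots, the lag-k autocorrelation in part (ii)(b) takes the stated form with c = 3 and d = 5.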
(i) Express, in terms of β1 and β4, the roots of the characteristic polynomial of the MA part, and give conditions for invertibility of the model. [2]
(ii) Derive the autocorrelation function (ACF) for Yt. [5]
For our particular data the sample ACF is:
Lag ACF
1 0.73
2 0.14
3 0.37
4 0.59
5 0.24
6 0.12
7 0.07
(iii) Explain whether these results confirm the initial belief that the model could be appropriate for these data. [3]
[Total 10]
(i) Describe the difference between strictly stationary processes and weakly
stationary processes. [2]
(ii) Explain why weakly stationary multivariate normal processes are also strictly stationary. [1]
(iii) Show that the following bivariate time series process, (Xn, Yn)^T, is weakly stationary:
Xn = 0.5Xn-1 + 0.3Yn-1 + e^x_n
Yn = 0.1Xn-1 + 0.8Yn-1 + e^y_n
where e^x_n and e^y_n are two independent white noise processes. [5]
(iv) Determine the positive values of c for which the process
Xn = (0.5 + c)Xn-1 + 0.3Yn-1 + e^x_n
Yn = 0.1Xn-1 + (0.8 + c)Yn-1 + e^y_n
is stationary. [6]
[Total 14]
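Parts (iii) and (iv) reduce to an eigenvalue condition on the coefficient matrix A = [[0.5, 0.3], [0.1, 0.8]]: the process is weakly stationary when both eigenvalues of A lie inside the unit circle, and adding c to the diagonal shifts each eigenvalue by c. A numerical sketch:

```python
import math

# Eigenvalues of A = [[0.5, 0.3], [0.1, 0.8]] from the characteristic
# polynomial  lambda**2 - trace*lambda + det = 0.
trace = 0.5 + 0.8
det = 0.5 * 0.8 - 0.3 * 0.1
disc = math.sqrt(trace ** 2 - 4 * det)
lam_lo = (trace - disc) / 2
lam_hi = (trace + disc) / 2

# Part (iii): both eigenvalues inside the unit circle -> weakly stationary.
assert 0 < lam_lo < lam_hi < 1

# Part (iv): A + cI has eigenvalues lam + c, so stationarity requires
# lam_hi + c < 1, i.e. 0 < c < 1 - lam_hi.
c_max = 1 - lam_hi
print(lam_lo, lam_hi, c_max)   # roughly 0.421, 0.879 and 0.121
```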
The following data are observed from n = 500 realisations of a time series:
Σ(i=1..n) xi = 13153.32,  Σ(i=1..n) (xi - x̄)² = 3153.67  and  Σ(i=1..n-1) (xi - x̄)(xi+1 - x̄) = 2176.03.
(i) Estimate, using the data above, the parameters μ, a1 and σ from the model
Xt - μ = a1(Xt-1 - μ) + et
where et is a white noise process with variance σ². [7]
(ii) After fitting the model with the parameters found in part (i), it was calculated that the number of turning points of the residual series êt is 280.
Perform a statistical test to check whether there is evidence that êt is not generated from a white noise process. [3]
[Total 10]
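Part (i) can be checked with the standard moment estimators (a sketch; x̄ = Σxi/n, â1 = γ̂1/γ̂0 and σ̂² = γ̂0(1 - â1²) are the assumed formulae):

```python
import math

# Summary statistics from the 500 observations.
n = 500
sum_x = 13153.32     # sum of x_i
ss = 3153.67         # sum of (x_i - xbar)**2
cross = 2176.03      # sum of (x_i - xbar)(x_{i+1} - xbar)

mu_hat = sum_x / n                       # estimate of mu
a1_hat = cross / ss                      # lag-1 sample autocorrelation
gamma0 = ss / n                          # sample variance, gamma_0
sigma2_hat = gamma0 * (1 - a1_hat ** 2)  # white-noise variance

print(mu_hat, a1_hat, math.sqrt(sigma2_hat))
```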
The following two models have been suggested for representing some quarterly data with underlying seasonality.
Model 1: Yt = aYt-4 + et
Model 2: Yt = et-4 + et
where et is a white noise process in each case.
(i) Determine the auto-correlation function for each model. [4]
The observed quarterly data is used to calculate the sample auto-correlation.
(ii) State the features of the sample auto-correlation that would lead you to prefer Model 1. [1]
[Total 5]
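A sketch of the two ACFs for part (i), assuming |a| < 1 so that Model 1 is stationary:

```latex
% Model 1: Y_t = a Y_{t-4} + e_t  (seasonal AR at lag 4)
\rho_k = \begin{cases} a^{k/4} & k \equiv 0 \pmod 4,\\
                       0       & \text{otherwise}. \end{cases}
% Model 2: Y_t = e_{t-4} + e_t  (seasonal MA at lag 4)
\gamma_0 = 2\sigma^2, \qquad \gamma_4 = \sigma^2, \qquad
\rho_4 = \tfrac{1}{2}, \qquad \rho_k = 0 \text{ for } k \neq 0, 4.
```

Model 1's ACF decays geometrically over lags 4, 8, 12, …, whereas Model 2's cuts off completely after lag 4; that contrast is the feature part (ii) asks about.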