
Observations y1, y2, ..., yn are made from a random walk process given by:


Y0 = 0 and Yt = a + Yt-1 + et for t > 0
where et is a white noise process with variance σ².
(i) Derive expressions for E(Yt) and Var(Yt) and explain why the process is not
stationary. [3]
(ii) Show that γt,s = Cov(Yt, Yt-s) for s < t is linear in s. [2]
(iii) Explain how you would use the observed data to estimate the parameters a
and σ. [3]
(iv) Derive expressions for the one-step and two-step forecasts for Yn+1 and Yn+2.
[2]
[Total 10]
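As a quick numerical check on part (i), the moments E(Yt) = at and Var(Yt) = tσ² can be verified by simulation. This is an illustrative sketch only; the values a = 0.5, σ = 1 and t = 50 are assumptions, not part of the question.

```python
import numpy as np

# Monte Carlo check of the random walk moments (a = 0.5, sigma = 1
# and t = 50 are illustrative assumptions, not from the question).
rng = np.random.default_rng(42)
a, sigma, t, n_paths = 0.5, 1.0, 50, 20000

# Y_t = a*t + sum of t white-noise terms, since Y_0 = 0
e = rng.normal(0.0, sigma, size=(n_paths, t))
y_t = a * t + e.sum(axis=1)

mean_y = y_t.mean()   # theory: E(Y_t) = a*t = 25
var_y = y_t.var()     # theory: Var(Y_t) = t*sigma^2 = 50
```

The t-dependence of the variance is exactly why the process fails to be stationary.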

A time series model is specified by

Yt = 2aYt-1 - a²Yt-2 + et

where et is a white noise process with variance σ².
(i) Determine the values of a for which the process is stationary. [2]
(ii) Derive the auto-covariances γ0 and γ1 for this process and find a general
recursive expression for γk for k ≥ 2. [10]
(iii) Show that the auto-covariance function can be written in the form:
γk = Aa^k + kBa^k
for some values of A and B which you should specify in terms of the constants a
and σ². [5]
[Total 17]

7 Consider the time series


Yt = 0.7 + 0.4Yt-1 + 0.12Yt-2 + et
where et is a white noise process with variance σ².
(i) Identify the model as an ARIMA(p,d,q) process. [1]
(ii) Determine whether Yt is a stationary process. [2]
(iii) Calculate E(Yt). [2]
(iv) Calculate the auto-correlations ρ1, ρ2, ρ3 and ρ4. [4]
[Total 9]
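A minimal numerical sketch for parts (iii) and (iv), assuming the model as stated: the stationary mean follows from taking expectations, and the autocorrelations satisfy the AR(2) Yule-Walker recursion.

```python
# Numerical check for question 7, assuming
# Yt = 0.7 + 0.4 Y_{t-1} + 0.12 Y_{t-2} + e_t.
a1, a2 = 0.4, 0.12

# Stationary mean: E(Y) = 0.7 / (1 - a1 - a2)
mean_y = 0.7 / (1 - a1 - a2)

# Yule-Walker: rho_1 = a1 / (1 - a2); rho_k = a1 rho_{k-1} + a2 rho_{k-2}
rho = [1.0, a1 / (1 - a2)]
for _ in range(3):
    rho.append(a1 * rho[-1] + a2 * rho[-2])
# rho[1] .. rho[4] are the four autocorrelations asked for
```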

8 Consider the time series


Yt = 0.1 + 0.4Yt-1 + 0.9et-1 + et
where et is a white noise process with variance σ².
(i) Identify the model as an ARIMA(p,d,q) process. [1]
(ii) Determine whether Yt is:
(a) a stationary process
(b) an invertible process
[2]
(iii) Calculate E(Yt) and find the auto-covariance function for Yt. [6]
(iv) Determine the MA(∞) representation for Yt. [4]
[Total 13]
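The weights of the infinite moving average representation asked for in part (iv) can be sketched numerically: for an ARMA(1,1) with AR coefficient φ and MA coefficient θ, the weights are ψ0 = 1, ψ1 = φ + θ and ψj = φψj-1 thereafter.

```python
# Sketch of the MA(infinity) weights for question 8,
# Yt = 0.1 + 0.4 Y_{t-1} + 0.9 e_{t-1} + e_t.
phi, theta = 0.4, 0.9

# psi_0 = 1, psi_1 = phi + theta, psi_j = phi * psi_{j-1} for j >= 2
psi = [1.0, phi + theta]
for _ in range(8):
    psi.append(phi * psi[-1])
# closed form: psi_j = (phi + theta) * phi**(j-1) for j >= 1
```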

Consider the time series model


(1 - aB)³ Xt = et
where B is the backwards shift operator and et is a white noise process with
variance σ².
(i) Determine for which values of a the process is stationary. [2]
Now assume that a = 0.4.
(ii) (a) Write down the Yule-Walker equations.
(b) Calculate the first two values of the auto-correlation function ρ1 and
ρ2. [7]
(iii) Describe the behaviour of ρk and the partial autocorrelation function φk as
k → ∞. [3]
[Total 12]
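A numerical sketch for part (ii): expanding (1 - 0.4B)³ gives the AR(3) form Xt = 1.2Xt-1 - 0.48Xt-2 + 0.064Xt-3 + et, and the first two Yule-Walker equations form a 2x2 linear system in ρ1 and ρ2.

```python
import numpy as np

# Yule-Walker sketch with a = 0.4: (1 - 0.4B)^3 expands to
# Xt = 1.2 X_{t-1} - 0.48 X_{t-2} + 0.064 X_{t-3} + et.
p1, p2, p3 = 1.2, -0.48, 0.064

# rho1 = p1 + p2*rho1 + p3*rho2 ; rho2 = p1*rho1 + p2 + p3*rho1
A = np.array([[1 - p2, -p3],
              [-(p1 + p3), 1.0]])
b = np.array([p1, p2])
rho1, rho2 = np.linalg.solve(A, b)
```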

In order to model a particular seasonal data set an actuary is considering using a
model of the form

(1 - B³)(1 - (a + β)B + aβB²) Xt = et

where B is the backward shift operator and et is a white noise process with
variance σ².
(i) Show that for a suitable choice of s the seasonal difference series
Yt = Xt - Xt-s is stationary for a range of values of a and β which you should
specify. [3]
After appropriate seasonal differencing the following sample autocorrelation values
for the series Yt are observed: ρ1 = 0.2 and ρ2 = 0.7.
(ii) Estimate the parameters a and β based on this information. [7]
[HINT: let X = a + β, Y = aβ and find a quadratic equation with roots
a and β.]
(iii) Forecast the next two observations x101 and x102 based on the parameters
estimated in part (ii) and the observed values x1, x2, ..., x100 of Xt. [4]
[Total 14]
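A numerical sketch for part (ii), assuming the seasonally differenced series Yt follows the AR(2) (1 - (a + β)B + aβB²)Yt = et: the Yule-Walker equations give a + β and aβ from the sample ACF, and the hinted quadratic recovers the two parameters.

```python
import math

# Yule-Walker sketch with sample ACF r1 = 0.2, r2 = 0.7 for an AR(2)
# whose coefficients are (a + β) and -aβ.
r1, r2 = 0.2, 0.7

phi1 = r1 * (1 - r2) / (1 - r1 ** 2)   # equals a + β
phi2 = (r2 - r1 ** 2) / (1 - r1 ** 2)  # equals -aβ
s_sum, s_prod = phi1, -phi2            # a + β and aβ

# roots of z² - (a + β)z + aβ = 0 recover a and β (the hint)
disc = math.sqrt(s_sum ** 2 - 4 * s_prod)
alpha = (s_sum + disc) / 2
beta = (s_sum - disc) / 2
```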

An actuary is considering the time series model defined by


Xt = aXt-1 + et
where et is a sequence of independent Normally distributed random variables with
mean 0 and variance σ². The series begins with the fixed value X0 = 0.
(i) Show that the conditional distribution of Xt given Xt-1 is Normal and hence
show that the likelihood of making observations x1, x2, ..., xn from this model
is:

L = Π (i = 1 to n) (1 / √(2πσ²)) exp(-(xi - a xi-1)² / (2σ²)). [3]
(ii) Show that the maximum likelihood estimate of a can also be regarded as a
least squares estimate. [2]
(iii) Find the maximum likelihood estimates of a and σ². [4]
(iv) Derive the Yule-Walker equations for the model and hence derive estimates of
a and σ² based on observed values of the autocovariance function. [5]
(v) Comment on the difference between the estimates of a in parts (iii) and (iv).
[1]
[Total 15]
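A sketch for parts (ii) and (iii): maximising the likelihood over a is equivalent to minimising Σ(xi - a xi-1)², which gives â = Σ xi xi-1 / Σ xi-1², and the ML estimate of σ² is the mean squared residual. The simulated data (true a = 0.6, σ = 1) is an assumption for illustration only.

```python
import numpy as np

# Least-squares / ML estimate of a: simulate an AR(1) with X0 = 0
# (a = 0.6 and sigma = 1 are illustrative assumptions).
rng = np.random.default_rng(0)
n, a_true = 5000, 0.6
x = np.zeros(n + 1)
for t in range(1, n + 1):
    x[t] = a_true * x[t - 1] + rng.normal()

# a_hat = sum x_i x_{i-1} / sum x_{i-1}^2  (least squares = ML here)
a_hat = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
# sigma^2_hat = mean squared residual
sigma2_hat = np.mean((x[1:] - a_hat * x[:-1]) ** 2)
```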

(i) State the three main stages in the Box-Jenkins approach to fitting an ARIMA
time series model. [3]
(ii) Explain, with reasons, which ARIMA time series would fit the observed data
in the charts below. [2]
Now consider the time series model given by
Xt = a1Xt-1 + a2Xt-2 + β1et-1 + et
where et is a white noise process with variance σ².
(iii) Derive the Yule-Walker equations for this model. [6]
(iv) Explain whether the partial auto-correlation function for this model can ever
give a zero value. [2]
[Total 13]

A sequence of 100 observations was made from a time series and the following
values of the sample auto-covariance function (SACF) were observed:
Lag SACF
1 0.68
2 0.55
3 0.30
4 0.06
The sample mean and variance of the same observations are 1.35 and 0.9 respectiv
ely.
(i) Calculate the first two values of the partial correlation function φ1 and φ2.
[1]
(ii) Estimate the parameters (including σ²) of the following models, which are to
be fitted to the observed data and can be assumed to be stationary.
(a) Yt = a0 + a1 Yt-1 + et
(b) Yt = a0 + a1 Yt-1 + a2 Yt-2 + et
In each case et is a white noise process with variance σ². [12]
(iii) Explain whether the assumption of stationarity is necessary for the
estimation for each of the models in part (ii). [2]
(iv) Explain whether each of the models in part (ii) satisfies the Markov property.
[2]
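A numerical sketch for parts (i) and (ii), using the sample figures from the question (γ0 = 0.9, γ1 = 0.68, γ2 = 0.55, mean 1.35): the AR(1) and AR(2) parameters follow from the Yule-Walker equations, and φ2 equals the fitted AR(2) second coefficient.

```python
# Yule-Walker estimates from the sample values in the question.
g0, g1, g2, xbar = 0.9, 0.68, 0.55, 1.35
r1, r2 = g1 / g0, g2 / g0   # sample autocorrelations; phi_1 = r1

# (a) AR(1): a1 = r1, a0 = xbar(1 - a1), sigma^2 = g0(1 - a1^2)
a1_ar1 = r1
a0_ar1 = xbar * (1 - a1_ar1)
s2_ar1 = g0 * (1 - a1_ar1 ** 2)

# (b) AR(2): solve the 2x2 Yule-Walker system; phi_2 equals a2 here
a2_ar2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
a1_ar2 = r1 * (1 - r2) / (1 - r1 ** 2)
a0_ar2 = xbar * (1 - a1_ar2 - a2_ar2)
s2_ar2 = g0 * (1 - a1_ar2 * r1 - a2_ar2 * r2)
```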

(i) List the main steps in the Box-Jenkins approach to fitting an ARIMA time
series to observed data. [3]
Observations x1, x2, ..., x200 are made from a stationary time series and the
following summary statistics are calculated:
Σ (i = 1 to 200) xi = 83.7,  Σ (i = 1 to 200) (xi - x̄)² = 35.4,
Σ (i = 2 to 200) (xi - x̄)(xi-1 - x̄) = 28.4,
Σ (i = 3 to 200) (xi - x̄)(xi-2 - x̄) = 17.1
(ii) Calculate the values of the sample auto-covariances γ0, γ1 and γ2. [3]
(iii) Calculate the first two values of the partial correlation function φ1 and φ2.
[3]
The following model is proposed to fit the observed data:
Xt - μ = a1(Xt-1 - μ) + et
where et is a white noise process with variance σ².
(iv) Estimate the parameters μ, a1 and σ² in the proposed model. [5]
After fitting the model in part (iv) the 200 observed residual values êt were
calculated. The number of turning points in the residual series was 110.
(v) Carry out a statistical test at the 95% significance level to test the
hypothesis that êt is generated from a white noise process. [4]
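The turning-points test in part (v) can be sketched numerically: under white noise, the number of turning points T in a series of length n has E(T) = 2(n - 2)/3 and Var(T) = (16n - 29)/90, and T is approximately Normal for large n.

```python
import math

# Turning-points test: n = 200 residuals, 110 observed turning points.
n, t_obs = 200, 110
mean_t = 2 * (n - 2) / 3          # = 132 under white noise
var_t = (16 * n - 29) / 90
z = (t_obs - mean_t) / math.sqrt(var_t)

# two-sided 5% test: reject the white noise hypothesis if |z| > 1.96
reject = abs(z) > 1.96
```

Here z is strongly negative (too few turning points), so the white noise hypothesis is rejected.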

Yt, t = 1, 2, 3, ..., is a time series defined by

Yt - 0.8Yt-1 = Zt + 0.2Zt-1

where Zt, t = 0, 1, ..., is a sequence of independent zero-mean variables with
common variance σ².
Derive the autocorrelation ρk, k = 0, 1, 2, ... [6]

Y1, Y2, ..., Yn are independent random variables, and

Yi ~ Poisson, mean μi.

The fitted values for a particular model are denoted by μ̂i. Derive the form of the
scaled deviance. [5]

The following time series model is used for the monthly inflation rate (Yt) in a
particular country:

Yt = 0.4Yt-1 + 0.2Yt-2 + Zt + 0.025

where {Zt} is a sequence of uncorrelated identically distributed random variables
whose distributions are normal with mean zero.
(i) Derive the values of p, d and q, when this model is considered as an
ARIMA(p, d, q) model. [3]
(ii) Determine whether {Yt} is a stationary process. [2]
(iii) Assuming an infinite history, calculate the expected value of the rate of
inflation over this period. [1]
(iv) Calculate the autocorrelation function of {Yt}. [5]
(v) Explain how the equivalent infinite-order moving average representation of
{Yt} may be derived. [2]

(i) Derive the autocovariance and autocorrelation functions of the AR(1) process
Xt = aXt-1 + et
where |a| < 1 and the et form a white noise process. [4]
(ii) The time series Zt is believed to follow an ARIMA(1, d, 0) process for some
value of d. The time series Zt(k) is obtained by differencing k times and the
sample autocorrelations, {ri : i = 1, ..., 10}, are shown in the table below for
various values of k.
k = 0 k = 1 k = 2 k = 3 k = 4 k = 5
r1 100% 100% 83% 3% 45% 64%
r2 100% 100% 66% 12% 5% 13%
r3 100% 100% 54% 11% 4% 3%
r4 100% 99% 45% 1% 6% 4%
r5 100% 99% 37% 3% 4% 5%
r6 100% 99% 30% 12% 12% 12%
r7 99% 98% 27% 3% 7% 9%
r8 99% 98% 24% 3% 0% 4%
r9 99% 97% 19% 3% 5% 6%
r10 99% 97% 13% 7% 5% 4%
Suggest, with reasons, appropriate values for d and the parameter a in the
underlying AR(1) process. [4]

State the Markov property and explain briefly whether the following processes are
Markov:
AR(4);
ARMA (1, 1).

A modeller has attempted to fit an ARMA(p,q) model to a set of data using the
Box-Jenkins methodology. The plot of residuals based on this proposed fit is shown
below.
(i) Under the assumptions of the model, the residuals should form a white noise
process.
(a) By inspection of the chart, suggest two reasons to suspect that the
residuals do not form a white noise process.
(b) Define what is meant by a turning point.
(c) Perform a significance test on the number of turning points in the data
above. (There are 100 points in the data and 59 turning points.)
[6]
(ii) On your suggestion, the original fitted model is discarded and reparameterised
to:
Xn+2 = 5 + 0.9(Xn+1 - 5) + en+2 + 0.5en.
Given the following observations:
X99 = 2, X100 = 7
ê99 = 0.7, ê100 = 1.4
use the Box-Jenkins methodology to calculate the forward estimates
X100(1), X100(2) and X100(3). [4]
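A numerical sketch of the forecasts, taking the model at time t as Xt = 5 + 0.9(Xt-1 - 5) + et + 0.5et-2: future errors are replaced by zero, so the known residuals at times 99 and 100 feed the first two steps.

```python
# Box-Jenkins forward estimates, assuming X100 = 7, e99 = 0.7, e100 = 1.4.
x100, e99, e100 = 7.0, 0.7, 1.4

f1 = 5 + 0.9 * (x100 - 5) + 0.5 * e99   # X_100(1): e_101 set to 0
f2 = 5 + 0.9 * (f1 - 5) + 0.5 * e100    # X_100(2): e_102 set to 0
f3 = 5 + 0.9 * (f2 - 5)                 # X_100(3): no known residuals left
```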

The time series Xt is assumed to be stationary and to follow an ARMA(2,1) process
defined by:

Xt = 1 + (8/15)Xt-1 - (1/15)Xt-2 + Zt + (1/7)Zt-1

where Zt are independent N(0,1) random variables.
(i) Determine the roots of the characteristic polynomial, and explain how their
values relate to the stationarity of the process. [2]
(ii) (a) Find the autocorrelation function for lags 0, 1 and 2.
(b) Derive the autocorrelation at lag k in the form
ρk = A/c^k + B/d^k
[12]
(iii) Determine the mean and variance of Xt. [3]
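A root check sketch for part (i), assuming the AR characteristic polynomial is 1 - (8/15)z + (1/15)z²: both roots should lie strictly outside the unit circle for stationarity, and they are the constants c and d in the ρk form above.

```python
import numpy as np

# Roots of (1/15)z^2 - (8/15)z + 1 = 0, i.e. z^2 - 8z + 15 = 0.
roots = np.sort(np.roots([1 / 15, -8 / 15, 1]).real)
# both roots outside the unit circle => the AR part is stationary
stationary = all(abs(r) > 1 for r in roots)
```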

Consider the following model applied to some quarterly data:


Yt = et + β1et-1 + β4et-4 + β1β4et-5
where et is a white noise process with mean zero and variance σ².

(i) Express in terms of β1 and β4 the roots of the characteristic polynomial of the
MA part, and give conditions for invertibility of the model. [2]
(ii) Derive the autocorrelation function (ACF) for Yt. [5]
For our particular data the sample ACF is:
Lag ACF
1 0.73
2 0.14
3 0.37
4 0.59
5 0.24
6 0.12
7 0.07
(iii) Explain whether these results confirm the initial belief that the model
could be appropriate for these data. [3]
[Total 10]

(i) Describe the difference between strictly stationary processes and weakly
stationary processes. [2]
(ii) Explain why weakly stationary multivariate normal processes are also strictly
stationary. [1]
(iii) Show that the following bivariate time series process, (Xn, Yn)ᵀ, is weakly
stationary:
Xn = 0.5Xn-1 + 0.3Yn-1 + e^x_n
Yn = 0.1Xn-1 + 0.8Yn-1 + e^y_n
where e^x_n and e^y_n are two independent white noise processes. [5]
(iv) Determine the positive values of c for which the process
Xn = (0.5 + c)Xn-1 + 0.3Yn-1 + e^x_n
Yn = 0.1Xn-1 + (0.8 + c)Yn-1 + e^y_n
is stationary. [6]
[Total 14]
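A numerical sketch for parts (iii) and (iv): a VAR(1) is stationary when all eigenvalues of the coefficient matrix lie inside the unit circle, and adding c to both diagonal entries shifts each eigenvalue by exactly c (since A + cI has eigenvalues λ + c).

```python
import numpy as np

# Stationarity check for the bivariate process above.
A = np.array([[0.5, 0.3],
              [0.1, 0.8]])
eigvals = np.linalg.eigvals(A)
spectral_radius = max(abs(eigvals))   # must be < 1 for stationarity

# part (iv): stationary while spectral_radius + c < 1
c_max = 1 - spectral_radius
```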

Consider the ARCH(1) process


Xt = μ + et √(a0 + a1(Xt-1 - μ)²)

where et are independent normal random variables with variance 1 and mean 0.
Show that, for s = 1, 2, ..., t-1, Xt and Xt-s are:
(i) uncorrelated. [5]
(ii) not independent. [3]
[Total 8]

From a sample of 50 consecutive observations from a stationary process, the table
below gives values for the sample autocorrelation function (ACF) and the sample
partial autocorrelation function (PACF):
Lag ACF PACF
1 0.854 0.854
2 0.820 0.371
3 0.762 0.085
The sample variance of the observations is 1.253.
(i) Suggest an appropriate model, based on this information, giving your
reasoning. [2]
(ii) Consider the AR(1) model
Yt = a1Yt-1 + et,
where et is a white noise error term with mean zero and variance σ².
Calculate method of moments (Yule-Walker) estimates for the parameters
a1 and σ² on the basis of the observed sample. [4]
(iii) Consider the AR(2) model
Yt = a1Yt-1 + a2Yt-2 + et,
where et is a white noise error term with mean zero and variance σ².
Calculate method of moments (Yule-Walker) estimates for the parameters
a1, a2 and σ² on the basis of the observed sample. [7]
(iv) List two statistical tests that you should apply to the residuals after
fitting a model to time series data. [2]
[Total 15]
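A numerical sketch for parts (ii) and (iii), using the sample values r1 = 0.854, r2 = 0.820 and sample variance 1.253: the same Yule-Walker algebra as before gives the AR(1) and AR(2) estimates.

```python
# Yule-Walker estimates from the 50-observation sample.
r1, r2, g0 = 0.854, 0.820, 1.253

# AR(1): a1 = r1, sigma^2 = g0 (1 - a1^2)
a1_ar1 = r1
s2_ar1 = g0 * (1 - r1 ** 2)

# AR(2): solve the two Yule-Walker equations for (a1, a2)
a2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
a1 = r1 * (1 - r2) / (1 - r1 ** 2)
s2_ar2 = g0 * (1 - a1 * r1 - a2 * r2)
```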

Let Yt be a stationary time series with autocovariance function γY(s).


(i) Show that the new series Xt = a + bt + Yt, where a and b are fixed non-zero
constants, is not stationary. [2]
(ii) Express the autocovariance function of ∇Xt = Xt - Xt-1 in terms of γY(s) and
show that this new series is stationary. [7]
(iii) Show that if Yt is a moving average process of order 1, then the series ∇Xt
is not invertible and has variance larger than that of Yt. [6]
[Total 15]

Consider the stationary autoregressive process of order 1 given by


Yt = 2aYt-1 + Zt, |a| < 0.5
where Zt denotes white noise with mean zero and variance σ².
Express Yt in the form Yt = Σ (j = 0 to ∞) aj Zt-j and hence or otherwise find an
expression for the variance of Yt in terms of a and σ. [4]
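A quick numerical check of the variance: with Yt = Σ (2a)^j Zt-j, the variance is σ² Σ (2a)^(2j) = σ² / (1 - 4a²). The values a = 0.3 and σ² = 1 below are illustrative assumptions.

```python
# Compare the truncated geometric sum against the closed form
# Var(Yt) = sigma^2 / (1 - 4 a^2), with assumed a = 0.3, sigma^2 = 1.
a, sigma2 = 0.3, 1.0
series = sigma2 * sum((2 * a) ** (2 * j) for j in range(200))
closed = sigma2 / (1 - 4 * a ** 2)
```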

The following data is observed from n = 500 realisations from a time series:

Σ (i = 1 to n) xi = 13153.32,  Σ (i = 1 to n) (xi - x̄)² = 3153.67  and
Σ (i = 1 to n-1) (xi - x̄)(xi+1 - x̄) = 2176.03.
(i) Estimate, using the data above, the parameters μ, a1 and σ² from the model
Xt - μ = a1(Xt-1 - μ) + et
where et is a white noise process with variance σ². [7]
(ii) After fitting the model with the parameters found in (i), it was calculated
that the number of turning points of the residuals series êt is 280.
Perform a statistical test to check whether there is evidence that êt is not
generated from a white noise process. [3]
[Total 10]
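An estimation sketch for part (i), plugging the summary statistics straight into the method-of-moments formulas: μ̂ is the sample mean, â1 is the lag-1 sample autocorrelation, and σ̂² = γ̂0(1 - â1²).

```python
# Method-of-moments estimates from the n = 500 summary statistics.
n = 500
sum_x, s_xx, s_x1 = 13153.32, 3153.67, 2176.03

mu_hat = sum_x / n                  # sample mean
a1_hat = s_x1 / s_xx                # lag-1 sample autocorrelation
g0_hat = s_xx / n                   # sample variance (gamma_0)
s2_hat = g0_hat * (1 - a1_hat ** 2) # white noise variance estimate
```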

The following two models have been suggested for representing some quarterly data
with underlying seasonality.
Model 1 Yt = aYt-4 + et
Model 2 Yt = βet-4 + et
where et is a white noise process in each case.
(i) Determine the auto-correlation function for each model. [4]
The observed quarterly data is used to calculate the sample auto-correlation.
(ii) State the features of the sample auto-correlation that would lead you to
prefer Model 1. [1]
[Total 5]
