
Forecasting (prediction) limits

Example: Linear deterministic trend estimated by least squares

$$Y_t = \beta_0 + \beta_1 t + e_t$$

$$\hat{Y}_{t+l} = E(Y_{t+l} \mid Y_1, \ldots, Y_t) = b_0 + b_1 \cdot (t+l)$$

where $b_0 = b_0(Y_1, \ldots, Y_t)$ and $b_1 = b_1(Y_1, \ldots, Y_t)$ are the least-squares estimates.

$$e_{t+l} = Y_{t+l} - \hat{Y}_{t+l}$$

$$\mathrm{Var}(e_{t+l}) = \{ Y_{t+l} \text{ and } \hat{Y}_{t+l} \text{ independent} \} = \mathrm{Var}(Y_{t+l}) + \mathrm{Var}(\hat{Y}_{t+l})$$

From regression analysis theory:

$$\mathrm{Var}(e_{t+l}) = \sigma_e^2 + \sigma_e^2 \left( \frac{1}{t} + \frac{\left( (t+l) - \frac{t+1}{2} \right)^2}{\sum_{s=1}^{t} \left( s - \frac{t+1}{2} \right)^2} \right)$$

Note! The average of the numbers $1, 2, \ldots, t$ is $\frac{t+1}{2}$, so

$$\mathrm{Var}(e_{t+l}) = \sigma_e^2 \left( 1 + \frac{1}{t} + \frac{\left( (t+l) - \frac{t+1}{2} \right)^2}{\sum_{s=1}^{t} \left( s - \frac{t+1}{2} \right)^2} \right)$$
Hence, calculated prediction limits for Yt+l become

$$\hat{Y}_{t+l} \pm c \cdot s_e \cdot \sqrt{1 + \frac{1}{t} + \frac{\left( (t+l) - \frac{t+1}{2} \right)^2}{\sum_{s=1}^{t} \left( s - \frac{t+1}{2} \right)^2}}$$

where $c$ is a quantile of a proper sampling distribution, emerging from the use of $s_e^2$ as an estimator of $\sigma_e^2$ and from the requested coverage of the limits.
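A quick numeric check of the formula (in Python, although the lecture itself uses R): the centred sum of squares over $s = 1, \ldots, t$ has the closed form $t(t^2-1)/12$, and the half-width of the limits grows with the lead time $l$. The values of $t$, $l$ and $s_e$ below are arbitrary illustration choices.

```python
import math

t, l, s_e = 20, 3, 1.5                      # sample size, lead time, residual s.e.
tbar = (t + 1) / 2                          # average of 1, 2, ..., t
ssq = sum((s - tbar) ** 2 for s in range(1, t + 1))

assert ssq == t * (t**2 - 1) / 12           # closed form for the centred sum

# Half-width of the limits: c * s_e * sqrt(1 + 1/t + (t+l-tbar)^2 / ssq)
factor = math.sqrt(1 + 1 / t + (t + l - tbar) ** 2 / ssq)
half_width = 1.96 * s_e * factor            # c = z_{alpha/2} for ~95% coverage
print(round(factor, 4), round(half_width, 4))
```

Note how the extra term under the square root makes the limits widen as the forecast origin recedes, which motivates the large-$t$ simplification below.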

For $t$ large it suffices to use the standard normal distribution, and a good approximation is also obtained even if the term

$$\frac{1}{t} + \frac{\left( (t+l) - \frac{t+1}{2} \right)^2}{\sum_{s=1}^{t} \left( s - \frac{t+1}{2} \right)^2}$$

is omitted under the square root:

$$\hat{Y}_{t+l} \pm z_{\alpha/2} \cdot s_e, \qquad \Pr\left( N(0,1) > z_{\alpha/2} \right) = \alpha/2$$
ARIMA-models


$$\hat{Y}_{t+l} \pm z_{\alpha/2} \cdot \hat{\sigma}_e \cdot \sqrt{\sum_{j=0}^{l-1} \hat{\psi}_j^2}$$

where $\hat{\psi}_0, \ldots, \hat{\psi}_{l-1}$ are functions of the parameter estimates $\hat{\phi}_1, \ldots, \hat{\phi}_p$ and $\hat{\theta}_1, \ldots, \hat{\theta}_q$.
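For a concrete case of the $\psi$-weights, a stationary AR(1) has $\psi_j = \phi^j$, so the multiplier under the square root has the closed form $(1 - \phi^{2l})/(1 - \phi^2)$. A small Python sketch (the value $\phi = 0.6$ is an arbitrary assumption for illustration):

```python
import math

phi, sigma_e, l, z = 0.6, 1.0, 5, 1.96      # assumed AR(1) parameter and settings

psi = [phi**j for j in range(l)]            # psi_0, ..., psi_{l-1} for an AR(1)
mult = sum(p**2 for p in psi)               # variance multiplier under the root
assert abs(mult - (1 - phi**(2*l)) / (1 - phi**2)) < 1e-12

half_width = z * sigma_e * math.sqrt(mult)  # half-width of the prediction limits
print(round(mult, 6), round(half_width, 4))
```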

Using R
ts <- arima(x, ...)   for fitting models

plot.Arima(ts, ...)   for plotting fitted models with 95% prediction limits

See the documentation for plot.Arima. However, the generic command plot can be used.

forecast.Arima   Install and load package forecast. Gives more flexibility with respect to prediction limits.
Seasonal ARIMA models
Example: beersales data

A clear seasonal pattern and also a trend, possibly a quadratic trend


Residuals from detrended data

beerq<-lm(beersales~time(beersales)+I(time(beersales)^2))
plot(y=rstudent(beerq),x=as.vector(time(beersales)),type="b",
pch=as.vector(season(beersales)),xlab="Time")

Seasonal pattern, but possibly no long-term trend left


SAC and SPAC of the residuals: spikes at or close to seasonal lags (or half-seasonal lags).
Modelling the autocorrelation at seasonal lags

Pure seasonal variation:

$$Y_t = \Phi_1 Y_{t-12} + e_t \qquad \text{Seasonal AR(1)}_{12}\text{-model}$$

Stationary if $|\Phi_1| < 1$ (roots of the characteristic equation $1 - \Phi_1 x^{12} = 0$ outside the unit circle).

$$\rho_k = \begin{cases} \Phi_1^{k/12} & k = 0, 12, 24, 36, \ldots \\ 0 & \text{otherwise} \end{cases}$$
$$Y_t = e_t - \Theta_1 e_{t-12} \qquad \text{Seasonal MA(1)}_{12}\text{-model}$$

Invertible if $|\Theta_1| < 1$ (roots of the characteristic equation $1 - \Theta_1 x^{12} = 0$ outside the unit circle).

$$\rho_k = \begin{cases} 1 & k = 0 \\ -\dfrac{\Theta_1}{1 + \Theta_1^2} & k = 12 \\ 0 & \text{otherwise} \end{cases}$$
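The seasonal MA(1) autocorrelations can be verified numerically: write $Y_t = e_t - \Theta_1 e_{t-12}$ as an MA filter with coefficients $(1, 0, \ldots, 0, -\Theta_1)$ and compute $\gamma_k = \sigma_e^2 \sum_j c_j c_{j+k}$. A Python check with the arbitrary value $\Theta_1 = 0.7$:

```python
Theta = 0.7                                  # assumed illustration value
c = [1.0] + [0.0]*11 + [-Theta]              # MA coefficients at lags 0..12

def gamma(k, c=c):
    # autocovariance at lag k (unit innovation variance)
    return sum(c[j] * c[j + k] for j in range(len(c) - k))

rho = [gamma(k) / gamma(0) for k in range(25)]
assert rho[0] == 1.0
assert abs(rho[12] - (-Theta / (1 + Theta**2))) < 1e-12   # spike at the seasonal lag
assert all(rho[k] == 0.0 for k in range(1, 25) if k != 12)
print(round(rho[12], 4))
```

Exactly one non-zero spike (at the seasonal lag) is what the SAC of a pure seasonal MA(1) should show.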
Non-seasonal and seasonal variation:

AR(p, P)_s or ARMA(p,0)(P,0)_s:

$$Y_t = \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + \Phi_1 Y_{t-s} + \cdots + \Phi_P Y_{t-Ps} + e_t$$

However, we cannot discard the possibility that the non-seasonal and the seasonal variation interact. Better to use multiplicative seasonal AR models:

$$\left( 1 - \phi_1 B - \cdots - \phi_p B^p \right) \left( 1 - \Phi_1 B^s - \cdots - \Phi_P B^{Ps} \right) Y_t = e_t$$

Example:

$$(1 - 0.3B)(1 - 0.2B^{12}) Y_t = e_t$$

$$\left( 1 - 0.3B - 0.2B^{12} + 0.3 \cdot 0.2 \, B^{13} \right) Y_t = e_t$$

$$Y_t = 0.3 Y_{t-1} + 0.2 Y_{t-12} - 0.06 Y_{t-13} + e_t$$
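The expansion can be checked by plain polynomial multiplication in the backshift operator $B$ (coefficients only), confirming the cross term $0.3 \cdot 0.2 = 0.06$ at $B^{13}$:

```python
a = [1.0, -0.3]                              # 1 - 0.3B
b = [1.0] + [0.0]*11 + [-0.2]                # 1 - 0.2B^12

prod = [0.0] * (len(a) + len(b) - 1)
for i, ai in enumerate(a):                   # convolution of coefficient lists
    for j, bj in enumerate(b):
        prod[i + j] += ai * bj

assert abs(prod[1] - (-0.3)) < 1e-12         # coefficient of B
assert abs(prod[12] - (-0.2)) < 1e-12        # coefficient of B^12
assert abs(prod[13] - 0.06) < 1e-9           # cross term at B^13
print(round(prod[13], 6))
```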
Multiplicative MA(q, Q)_s or ARMA(0,q)(0,Q)_s:

$$Y_t = \left( 1 - \theta_1 B - \cdots - \theta_q B^q \right) \left( 1 - \Theta_1 B^s - \cdots - \Theta_Q B^{Qs} \right) e_t$$

Mixed models:

$$\left( 1 - \phi_1 B - \cdots - \phi_p B^p \right)\left( 1 - \Phi_1 B^s - \cdots - \Phi_P B^{Ps} \right) Y_t = \left( 1 - \theta_1 B - \cdots - \theta_q B^q \right)\left( 1 - \Theta_1 B^s - \cdots - \Theta_Q B^{Qs} \right) e_t$$

Many terms! Condensed expression:

$$\phi(B)\,\Phi(B^s)\, Y_t = \theta(B)\,\Theta(B^s)\, e_t \qquad \text{ARMA}(p, q)(P, Q)_s$$

where

$$\phi(B) = 1 - \sum_{i=1}^{p} \phi_i B^i; \quad \Phi(B^s) = 1 - \sum_{i=1}^{P} \Phi_i B^{si}; \quad \theta(B) = 1 - \sum_{j=1}^{q} \theta_j B^j; \quad \Theta(B^s) = 1 - \sum_{j=1}^{Q} \Theta_j B^{sj}$$
Non-stationary Seasonal ARIMA models

Non-stationary at non-seasonal level:

Model $d$th order regular differences:

$$\nabla^d Y_t = (1 - B)^d Y_t, \qquad \nabla Y_t = Y_t - Y_{t-1} = (1 - B) Y_t$$

Non-stationary at seasonal level:

Seasonal non-stationarity is harder to detect from a plotted time series. The seasonal variation is not stable.

Model $D$th order seasonal differences:

$$\nabla_s^D Y_t = (1 - B^s)^D Y_t$$

Example: first-order monthly differences

$$\nabla_{12} Y_t = (1 - B^{12}) Y_t = Y_t - Y_{t-12}$$

can follow a stable seasonal pattern.


The general Seasonal ARIMA model

$$\phi(B)\,\Phi(B^s)\,(1 - B)^d (1 - B^s)^D\, Y_t = \theta(B)\,\Theta(B^s)\, e_t \qquad \text{ARIMA}(p, d, q)(P, D, Q)_s$$

It does not matter whether regular or seasonal differences are taken first.
Model specification, fitting and diagnostic checking

Example: beersales data

Clearly non-stationary at non-seasonal level, i.e. there is a long-term trend.
Investigate SAC and SPAC of original data

Many substantial spikes, both at non-seasonal and at seasonal level. Calls for differencing at both levels.
Try first-order seasonal differences first. Here: monthly data

$$W_t = (1 - B^{12}) Y_t = Y_t - Y_{t-12}$$

beer_sdiff1 <- diff(beersales,lag=12)

Look at SAC and SPAC again

Better, but now we need to try regular differences


Take first order differences in seasonally differenced data


$$U_t = (1 - B)(1 - B^{12}) Y_t = (1 - B) W_t = W_t - W_{t-1} = Y_t - Y_{t-12} - \left( Y_{t-1} - Y_{t-13} \right)$$

beer_sdiff1rdiff1 <- diff(beer_sdiff1,lag=1)
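A quick Python check (the lecture itself uses R) that composing the seasonal and the regular difference on an arbitrary series gives $U_t = Y_t - Y_{t-12} - Y_{t-1} + Y_{t-13}$:

```python
import random

random.seed(1)
y = [random.gauss(0, 1) for _ in range(40)]              # arbitrary series

w = [y[t] - y[t-12] for t in range(12, len(y))]          # seasonal difference
u = [w[i] - w[i-1] for i in range(1, len(w))]            # then a regular difference

u_direct = [y[t] - y[t-12] - y[t-1] + y[t-13] for t in range(13, len(y))]
assert all(abs(a - b) < 1e-12 for a, b in zip(u, u_direct))
print(len(u))
```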

Look at SAC and SPAC again

The SAC starts to look good, but the SPAC does not.


Take second order differences in seasonally differenced data

Since we suspected a non-linear long-term trend

$$V_t = (1 - B)^2 (1 - B^{12}) Y_t = (1 - B) U_t = U_t - U_{t-1} = W_t - W_{t-1} - \left( W_{t-1} - W_{t-2} \right) = W_t - 2 W_{t-1} + W_{t-2}$$

$$= Y_t - Y_{t-12} - 2 \left( Y_{t-1} - Y_{t-13} \right) + \left( Y_{t-2} - Y_{t-14} \right)$$
beer_sdiff1rdiff2 <- diff(diff(beer_sdiff1,lag=1),lag=1)
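The second-order identity (what `diff(diff(beer_sdiff1, lag=1), lag=1)` computes in R) can be checked the same way in Python on an arbitrary series:

```python
import random

random.seed(2)
y = [random.gauss(0, 1) for _ in range(40)]              # arbitrary series

w = [y[t] - y[t-12] for t in range(12, len(y))]          # seasonal difference
u = [w[i] - w[i-1] for i in range(1, len(w))]            # first regular difference
v = [u[i] - u[i-1] for i in range(1, len(u))]            # second regular difference

# V_t = (Y_t - Y_{t-12}) - 2(Y_{t-1} - Y_{t-13}) + (Y_{t-2} - Y_{t-14})
v_direct = [(y[t] - y[t-12]) - 2*(y[t-1] - y[t-13]) + (y[t-2] - y[t-14])
            for t in range(14, len(y))]
assert all(abs(a - b) < 1e-12 for a, b in zip(v, v_direct))
print(len(v))
```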

Judging from the non-seasonal and the seasonal parts of the SAC and SPAC, this could be an ARMA(2,0)(0,1)_12 or an ARMA(1,1)(0,1)_12.

These models for the original data become ARIMA(2,2,0)(0,1,1)_12 and ARIMA(1,2,1)(0,1,1)_12.

model1 <- arima(beersales, order=c(2,2,0),
                seasonal=list(order=c(0,1,1), period=12))

Series: beersales
ARIMA(2,2,0)(0,1,1)[12]

Coefficients:
ar1 ar2 sma1
-1.0257 -0.6200 -0.7092
s.e. 0.0596 0.0599 0.0755

sigma^2 estimated as 0.6095: log likelihood=-216.34


AIC=438.69 AICc=438.92 BIC=451.42
Diagnostic checking can be carried out in a condensed way with the function tsdiag. The Ljung-Box test can specifically be obtained from the function Box.test.

tsdiag(model1)
(Plots: standardized residuals; SPAC of the standardized residuals; p-values of the Ljung-Box test with K = 24.)

Box.test(residuals(model1), lag = 12, type = "Ljung-Box", fitdf = 3)

Here lag is K (how many lags are included) and fitdf is p + q + P + Q (how many degrees of freedom are withdrawn from K).

Box-Ljung test

data: residuals(model1)
X-squared = 30.1752, df = 9, p-value = 0.0004096

For seasonal data with season length s the L-B test is usually calculated for
K = s, 2s, 3s and 4s
Box.test(residuals(model1), lag = 24, type = "Ljung-Box", fitdf = 3)

Box-Ljung test

data: residuals(model1)
X-squared = 57.9673, df = 21, p-value = 2.581e-05
Box.test(residuals(model1), lag = 36, type = "Ljung-Box",
fitdf = 3)

Box-Ljung test

data: residuals(model1)
X-squared = 76.7444, df = 33, p-value = 2.431e-05

Box.test(residuals(model1), lag = 48, type = "Ljung-Box", fitdf = 3)

Box-Ljung test

data: residuals(model1)
X-squared = 92.9916, df = 45, p-value = 3.436e-05
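As a small sanity check of the degrees of freedom reported in the four test outputs, df = K − fitdf with fitdf = p + q + P + Q = 3 for the ARIMA(2,2,0)(0,1,1)_12 model:

```python
fitdf = 2 + 0 + 0 + 1                        # p + q + P + Q for model1
dfs = [K - fitdf for K in (12, 24, 36, 48)]  # K = s, 2s, 3s, 4s with s = 12
assert dfs == [9, 21, 33, 45]                # matches df in the R output
print(dfs)
```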
Hence, the residuals from the first model are not satisfactory.

model2 <- arima(beersales, order=c(1,2,1),
                seasonal=list(order=c(0,1,1), period=12))
print(model2)

Series: beersales
ARIMA(1,2,1)(0,1,1)[12]

Coefficients:
ar1 ma1 sma1
-0.4470 -0.9998 -0.6352
s.e. 0.0678 0.0176 0.0930

sigma^2 estimated as 0.4575: log likelihood=-192.86


AIC=391.72 AICc=391.96 BIC=404.45

Better fit! But is it good?


tsdiag(model2)

Not good! We should maybe try second-order seasonal differencing too.


Time series regression models

The classical set-up uses deterministic trend functions and seasonal indices

$$Y_t = m_t + S_t + e_t$$
Examples:

$$Y_t = \beta_0 + \beta_1 t + \sum_{j=2}^{12} \beta_{s,j} \, x_j(t) + e_t \qquad \text{linear trend in monthly data}$$

where

$$x_j(t) = \begin{cases} 1 & \text{if } t \text{ is in month } j \\ 0 & \text{otherwise} \end{cases}$$

$$Y_t = \beta_0 + \beta_1 t + \beta_2 t^2 + e_t \qquad \text{quadratic trend, no seasonal variation}$$
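The first example (linear trend plus monthly indices) can be sketched numerically. This is a Python illustration (the lecture uses R) with arbitrary made-up parameter values, showing that least squares recovers the trend coefficients and the seasonal indices exactly from noise-free data:

```python
import numpy as np

n, beta0, beta1 = 120, 10.0, 0.05            # assumed illustration values
seas = np.array([0.0, 1.5, -0.8, 0.3, 2.0, -1.0,
                 0.7, 0.2, -0.5, 1.1, -1.3, 0.9])   # month effects, January = 0

t = np.arange(1, n + 1)
month = (t - 1) % 12                          # 0 = January, ..., 11 = December
y = beta0 + beta1 * t + seas[month]           # noise-free, so recovery is exact

# Design matrix: intercept, linear trend, dummies x_j(t) for months j = 2..12
X = np.column_stack([np.ones(n), t] +
                    [(month == j).astype(float) for j in range(1, 12)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

assert abs(coef[0] - beta0) < 1e-6 and abs(coef[1] - beta1) < 1e-6
assert np.allclose(coef[2:], seas[1:])        # recovered seasonal indices
print(np.round(coef[:2], 3))
```

January is taken as the baseline month, matching the sum starting at j = 2 in the formula above.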

The classical set-up can be extended by allowing for autocorrelated error terms (instead of white noise). Usually an AR(1) or AR(2) suffices. However, the trend and seasonal terms are still assumed deterministic.
Dynamic time series regression models

To extend the classical set-up with explanatory variables comprising other time
series we need another way of modelling.

Note that a stationary ARMA-model

$$Y_t = \theta_0 + \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + e_t - \theta_1 e_{t-1} - \cdots - \theta_q e_{t-q}$$

$$\left( 1 - \phi_1 B - \cdots - \phi_p B^p \right) Y_t = \theta_0 + \left( 1 - \theta_1 B - \cdots - \theta_q B^q \right) e_t$$

$$\phi(B)\, Y_t = \theta_0 + \theta(B)\, e_t$$

can also be written

$$Y_t = \beta_0 + \frac{\theta(B)}{\phi(B)}\, e_t, \qquad \beta_0 = \frac{\theta_0}{\phi(B)}$$
The general dynamic regression model for a response time series $Y_t$ with one covariate time series $X_t$ can be written

$$Y_t = \beta_0 + \frac{\omega(B)\, B^b}{\delta(B)}\, X_t + \frac{\theta(B)}{\phi(B)}\, e_t$$

Special case 1:

$X_t$ relates to some event that has occurred at a certain time point (e.g. 9/11). It can then either be a step function

$$X_t = S_t^{(T)} = \begin{cases} 1 & t \geq T \\ 0 & t < T \end{cases}$$

or a pulse function

$$X_t = P_t^{(T)} = \begin{cases} 1 & t = T \\ 0 & t \neq T \end{cases}$$
Step functions would imply a permanent change in the level of $Y_t$. Such a change can further be constant or gradually increasing (depending on $\omega(B)$ and $\delta(B)$). It can also be delayed (depending on $b$).

Pulse functions would imply a temporary change in the level of $Y_t$. Such a change may be present just at the specific time point or gradually decreasing (depending on $\omega(B)$ and $\delta(B)$).
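The two behaviours can be illustrated by passing a step and a pulse at time $T$ through a first-order transfer function $\omega B^b / (1 - \delta B)$. A Python sketch; $\omega$, $\delta$, $b$ and $T$ are arbitrary illustration values:

```python
n, T, omega, delta, b = 60, 20, 2.0, 0.6, 1  # assumed illustration values

step  = [1.0 if t >= T else 0.0 for t in range(n)]
pulse = [1.0 if t == T else 0.0 for t in range(n)]

def response(x):
    # effect_t = delta * effect_{t-1} + omega * x_{t-b}
    eff = [0.0] * n
    for t in range(n):
        prev = eff[t-1] if t > 0 else 0.0
        lagged = x[t-b] if t >= b else 0.0
        eff[t] = delta * prev + omega * lagged
    return eff

s_eff, p_eff = response(step), response(pulse)
# step: gradual, permanent shift towards omega / (1 - delta)
assert abs(s_eff[-1] - omega / (1 - delta)) < 1e-6
# pulse: delayed jump at T + b, then geometric decay back towards 0
assert p_eff[T + b] == omega and abs(p_eff[-1]) < 1e-6
print(round(s_eff[-1], 3), round(p_eff[-1], 6))
```

The step response levels off at $\omega/(1-\delta)$ (a permanent change), while the pulse response dies out (a temporary change), with the delay $b$ shifting both effects forward in time.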

Step and pulse functions are used to model the effects of a particular event, a so-called intervention.
Intervention models

For $X_t$ being a regular time series (i.e. varying with time) the models are called transfer function models.
