
Time Series Models
DJS 2/20/15

Topics
- Stochastic processes
- Stationarity
- White noise
- Random walk
- Moving average processes
- Autoregressive processes
- More general processes

Stochastic Processes

Stochastic processes
- Time series are an example of a stochastic or random process
- A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic laws'
- Mathematically, a stochastic process is an indexed collection of random variables {X_t : t ∈ T}

Stochastic processes
- We are concerned only with processes indexed by time, either continuous time or discrete time processes, such as
  {X_t : t ∈ (−∞, ∞)} or {X_t : t ∈ [0, ∞)}
  or
  {X_t : t ∈ {0, 1, 2, ...}}, i.e. X_0, X_1, X_2, ...

Inference
- We usually base our inference on a single observation or realization of the process over some period of time, say [0, T] (a continuous interval of time) or at a sequence of time points {0, 1, 2, ..., T}

Specification of a process
- To describe a stochastic process fully, we must specify all the finite-dimensional distributions, i.e. the joint distribution of the random variables X_{t_1}, X_{t_2}, X_{t_3}, ..., X_{t_n} for any finite set of times t_1, ..., t_n

Specification of a process
- A simpler approach is to specify only the moments; this is sufficient if all the joint distributions are normal
- The mean and variance functions are given by
  μ_t = E(X_t)  and  σ_t² = Var(X_t)

Autocovariance
- Because the random variables comprising the process are not independent, we must also specify their covariance
  γ_{t1,t2} = Cov(X_{t1}, X_{t2})

Stationarity

Stationarity
- Inference is easiest when a process is stationary: its distribution does not change over time
- This is strict stationarity
- A process is weakly stationary if its mean and autocovariance functions do not change over time

Weak stationarity
- The autocovariance depends only on the time difference, or lag, between the two time points involved:
  μ_t = μ  and  σ_t² = σ²  for all t
  γ_{t1,t2} = Cov(X_{t1}, X_{t2}) = Cov(X_{t1+τ}, X_{t2+τ})
  so γ_{t1,t2} depends only on the lag t2 − t1

Autocorrelation
- It is useful to standardize the autocovariance function (acvf)
- Consider the stationary case only
- Use the autocorrelation function (acf)
  ρ_τ = γ_τ / γ_0

Autocorrelation
- More than one process can have the same acf
- Properties are:
  ρ_0 = 1
  ρ_τ = ρ_{−τ}
  |ρ_τ| ≤ 1

White Noise

White noise
- This is a purely random process: a sequence of independent and identically distributed random variables
- It has constant mean and variance, and
  γ_k = Cov(Z_t, Z_{t+k}) = 0,  k ≠ 0
- Also
  ρ_k = 1 for k = 0,  ρ_k = 0 for k ≠ 0
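These properties are easy to check numerically. The sketch below (my own illustration, not from the slides; the `sample_acf` helper is a hypothetical name) simulates a long i.i.d. Gaussian sequence and confirms that the sample autocorrelations vanish at every non-zero lag.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(0.0, 1.0, size=100_000)    # white noise

def sample_acf(x, k):
    """Sample autocorrelation at lag k (divisor N in both sums)."""
    xc = x - x.mean()
    n = len(xc)
    return (xc[: n - k] * xc[k:]).sum() / (xc * xc).sum()

print(sample_acf(z, 0))                   # exactly 1 by construction
for k in (1, 5, 20):
    print(k, round(sample_acf(z, k), 4))  # all close to 0
```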

Random Walk

Random walk
- Start with {Z_t} being white noise or purely random
- {X_t} is a random walk if
  X_0 = 0
  X_t = X_{t−1} + Z_t = Σ_{k=1}^{t} Z_k

Random walk
- The random walk is not stationary:
  E(X_t) = tμ,  Var(X_t) = tσ²
- First differences are stationary:
  ∇X_t = X_t − X_{t−1} = Z_t
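A short simulation illustrates both points: the variance of a random walk grows linearly with t, and first differences recover the white noise input. This is an illustrative NumPy sketch, not part of the original slides.

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.normal(size=(2000, 500))          # 2000 independent noise sequences
x = np.cumsum(z, axis=1)                  # X_t = Z_1 + ... + Z_t

# Var(X_t) = t * sigma^2 grows linearly: the walk is not stationary.
print(x[:, 49].var(), x[:, 499].var())    # roughly 50 and 500

# First differences recover the white noise input.
d = np.diff(x, axis=1)
print(np.allclose(d, z[:, 1:]))           # True
```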

Moving Average Processes

Moving average processes
- Start with {Z_t} being white noise or purely random, with mean zero and s.d. σ_Z
- {X_t} is a moving average process of order q (written MA(q)) if for some constants θ_0, θ_1, ..., θ_q we have
  X_t = θ_0 Z_t + θ_1 Z_{t−1} + ... + θ_q Z_{t−q}
- Usually θ_0 = 1

Moving average processes
- The mean and variance are given by
  E(X_t) = 0,  Var(X_t) = σ_Z² Σ_{k=0}^{q} θ_k²
- The process is weakly stationary because the mean is constant and the covariance does not depend on t

Moving average processes
- If the Z_t's are normal then so is the process, and it is then strictly stationary
- The autocorrelation is
  ρ_k = 1                                              for k = 0
  ρ_k = Σ_{i=0}^{q−k} θ_i θ_{i+k} / Σ_{i=0}^{q} θ_i²   for k = 1, ..., q
  ρ_k = 0                                              for k > q
  ρ_{−k} = ρ_k
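As a numerical check of this formula (an illustrative sketch; the θ values and the `ma_acf` helper are my own, not from the slides), we can compare the theoretical MA(2) autocorrelations with sample autocorrelations from a long simulated series:

```python
import numpy as np

theta = np.array([1.0, 0.6, 0.3])   # theta_0..theta_2, illustrative values
q = len(theta) - 1

def ma_acf(theta, k):
    """Theoretical MA(q) autocorrelation at lag k >= 0."""
    q = len(theta) - 1
    if k > q:
        return 0.0
    return float(theta[: q - k + 1] @ theta[k:]) / float(theta @ theta)

rng = np.random.default_rng(2)
n = 200_000
z = rng.normal(size=n + q)
# X_t = theta_0 Z_t + theta_1 Z_{t-1} + ... + theta_q Z_{t-q}
x = sum(theta[j] * z[q - j : q - j + n] for j in range(q + 1))

xc = x - x.mean()
rs = [(xc[: n - k] * xc[k:]).sum() / (xc * xc).sum() for k in range(q + 2)]
for k in range(q + 2):
    print(k, round(ma_acf(theta, k), 3), round(rs[k], 3))  # close agreement
```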

Moving average processes
- Note the autocorrelation cuts off at lag q
- For the MA(1) process with θ_0 = 1:
  ρ_k = 1                  for k = 0
  ρ_k = θ_1 / (1 + θ_1²)   for k = ±1
  ρ_k = 0                  otherwise

Moving average processes
- In order to ensure there is a unique MA process for a given acf, we impose the condition of invertibility
- This ensures that when the process is written in series form, the series converges
- For the MA(1) process X_t = Z_t + θZ_{t−1}, the condition is |θ| < 1

Moving average processes
- For general processes introduce the backward shift operator B:
  B^j X_t = X_{t−j}
- Then the MA(q) process is given by
  X_t = (θ_0 + θ_1 B + θ_2 B² + ... + θ_q B^q) Z_t = θ(B) Z_t

Moving average processes
- The general condition for invertibility is that all the roots of the equation θ(B) = 0 lie outside the unit circle (have modulus greater than one)

Autoregressive Processes

Autoregressive processes
- Assume {Z_t} is purely random with mean zero and s.d. σ_Z
- Then the autoregressive process of order p, or AR(p) process, is
  X_t = α_1 X_{t−1} + α_2 X_{t−2} + ... + α_p X_{t−p} + Z_t

Autoregressive processes
- The first order autoregression is
  X_t = αX_{t−1} + Z_t
- Provided |α| < 1 it may be written as an infinite order MA process
- Using the backshift operator we have
  (1 − αB)X_t = Z_t

Autoregressive processes
- From the previous equation we have
  X_t = Z_t / (1 − αB)
      = (1 + αB + α²B² + ...)Z_t
      = Z_t + αZ_{t−1} + α²Z_{t−2} + ...

Autoregressive processes
- Then E(X_t) = 0, and if |α| < 1,
  Var(X_t) = σ_X² = σ_Z² / (1 − α²)
  γ_k = α^k σ_Z² / (1 − α²)
  ρ_k = α^k,  k ≥ 0
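These AR(1) moments can be verified by simulation (an illustrative sketch, not from the slides; α = 0.7 is my own choice of parameter):

```python
import numpy as np

alpha, sigma = 0.7, 1.0                  # illustrative parameter values
rng = np.random.default_rng(3)
n = 200_000
z = rng.normal(0.0, sigma, size=n)

x = np.empty(n)
x[0] = z[0]
for t in range(1, n):                    # X_t = alpha * X_{t-1} + Z_t
    x[t] = alpha * x[t - 1] + z[t]

x = x[1000:]                             # discard burn-in transient

# Var(X_t) = sigma_Z^2 / (1 - alpha^2)
print(x.var(), sigma**2 / (1 - alpha**2))   # both near 1.96

# rho_1 = alpha
xc = x - x.mean()
r1 = (xc[:-1] * xc[1:]).sum() / (xc * xc).sum()
print(r1)                                # near 0.7
```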

Autoregressive processes
- The AR(p) process can be written as
  (1 − α_1B − α_2B² − ... − α_pB^p)X_t = Z_t
- or
  X_t = Z_t / (1 − α_1B − α_2B² − ... − α_pB^p) = f(B)Z_t

Autoregressive processes
- This is for
  f(B) = (1 − α_1B − ... − α_pB^p)^{−1} = (1 + β_1B + β_2B² + ...)
  for some β_1, β_2, ...
- This gives X_t as an infinite MA process, so it has mean zero

Autoregressive processes
- Conditions are needed to ensure that the various series converge, and hence that the variance exists and the autocovariance can be defined
- Essentially these are requirements that the β_i become small quickly enough for large i

Autoregressive processes
- The β_i may not be able to be found, however
- The alternative is to work with the α_i
- The acf is expressible in terms of the roots π_i, i = 1, 2, ..., p of the auxiliary equation
  y^p − α_1 y^{p−1} − ... − α_p = 0

Autoregressive processes
- Then a necessary and sufficient condition for stationarity is that for every i, |π_i| < 1
- An equivalent way of expressing this is that the roots of the equation
  φ(B) = 1 − α_1B − ... − α_pB^p = 0
  must lie outside the unit circle
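The root condition is easy to check numerically with `np.roots` (an illustrative helper of my own, not from the slides):

```python
import numpy as np

def ar_is_stationary(alphas):
    """alphas = [a1, ..., ap]; checks the roots of 1 - a1 B - ... - ap B^p."""
    # np.roots wants coefficients from the highest power of B down
    coeffs = [-a for a in alphas[::-1]] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))    # all outside unit circle?

print(ar_is_stationary([0.5]))           # AR(1), root at B = 2 -> True
print(ar_is_stationary([1.0]))           # random walk, root at B = 1 -> False
print(ar_is_stationary([0.5, 0.3]))      # both roots outside -> True
print(ar_is_stationary([1.5, -0.3]))     # one root inside -> False
```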

ARMA processes
- Combine AR and MA processes
- An ARMA process of order (p, q) is given by
  X_t = α_1X_{t−1} + ... + α_pX_{t−p} + Z_t + θ_1Z_{t−1} + ... + θ_qZ_{t−q}

ARMA processes
- Alternative expressions are possible using the backshift operator:
  φ(B)X_t = θ(B)Z_t
  where
  φ(B) = 1 − α_1B − ... − α_pB^p
  θ(B) = 1 + θ_1B + ... + θ_qB^q

ARMA processes
- An ARMA process can be written in pure MA or pure AR forms, the operators being possibly of infinite order:
  X_t = ψ(B)Z_t
  π(B)X_t = Z_t
- Usually the mixed form requires fewer parameters

ARIMA processes
- General autoregressive integrated moving average processes are called ARIMA processes
- When differenced, say d times, the process is an ARMA process
- Call the differenced process W_t; then W_t is an ARMA process and
  W_t = ∇^d X_t = (1 − B)^d X_t

ARIMA processes
- Alternatively specify the process as
  φ(B)W_t = θ(B)Z_t
  or
  φ(B)(1 − B)^d X_t = θ(B)Z_t
- This is an ARIMA process of order (p, d, q)
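The differencing operator (1 − B)^d is exactly repeated differencing, which `np.diff` implements directly; a small illustrative sketch (the data values are arbitrary):

```python
import numpy as np

x = np.array([3.0, 5.0, 4.0, 8.0, 9.0])

w1 = np.diff(x)          # (1 - B) X_t = X_t - X_{t-1}
w2 = np.diff(x, n=2)     # (1 - B)^2 X_t, i.e. differencing twice

print(w1)                # [ 2. -1.  4.  1.]
print(w2)                # [-3.  5. -3.]
print(np.allclose(w2, np.diff(w1)))   # True: (1-B)^2 = (1-B) applied twice
```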

ARIMA processes
- The model for X_t is non-stationary because the AR operator on the left hand side has d roots on the unit circle
- d is often 1
- The random walk is ARIMA(0,1,0)
- Can include seasonal terms; see later

Non-zero mean
- We have assumed that the mean is zero in the ARIMA models
- There are two alternatives:
  - mean correct all the W_t terms in the model
  - incorporate a constant term in the model

The Box-Jenkins Approach

Topics
- Outline of the approach
- Sample autocorrelation & partial autocorrelation
- Fitting ARIMA models
- Diagnostic checking
- Example
- Further ideas

Outline of the Box-Jenkins Approach

Box-Jenkins approach
- The approach is an iterative one involving
  - model identification
  - model fitting
  - model checking
- If the model checking reveals that there are problems, the process is repeated

Models
- Models to be fitted are from the ARIMA class of models (or SARIMA class if the data are seasonal)
- The major tools in the identification process are the (sample) autocorrelation function and partial autocorrelation function

Autocorrelation
- Use the sample autocovariance and sample variance to estimate the autocorrelation
- The obvious estimator of the autocovariance is
  c_k = Σ_{t=1}^{N−k} (X_t − X̄)(X_{t+k} − X̄) / N
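This estimator is a direct transcription into code (an illustrative helper; the data values are arbitrary):

```python
import numpy as np

def sample_autocov(x, k):
    """c_k = sum_{t=1}^{N-k} (X_t - Xbar)(X_{t+k} - Xbar) / N."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return (xc[: n - k] * xc[k:]).sum() / n

x = [2.0, 4.0, 3.0, 5.0, 6.0]            # Xbar = 4
print(sample_autocov(x, 0))              # the sample variance (divisor N): 2.0
print(sample_autocov(x, 1))              # 0.2
```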

Autocovariances
- The sample autocovariances are not unbiased estimates of the autocovariances; the bias is of order 1/N
- Sample autocovariances are correlated, so may display smooth ripples at long lags which are not in the actual autocovariances

Autocovariances
- Can use a different divisor (N − k instead of N) to decrease bias, but this may increase mean square error
- Can use jackknifing to reduce bias (to order 1/N²): divide the sample in half and estimate using the whole and both halves

Autocorrelation
- More difficult to obtain properties of the sample autocorrelation
- Generally still biased
- When the process is white noise:
  E(r_k) ≈ −1/N
  Var(r_k) ≈ 1/N
- Correlations are approximately normal for N large

Autocorrelation
- Gives a rough test of whether an autocorrelation is non-zero
- If |r_k| > 2/√N, suspect the autocorrelation at that lag is non-zero
- Note that when examining many autocorrelations the chance of falsely identifying a non-zero one increases
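The multiple-testing caveat is easy to see in simulation (an illustrative sketch, not from the slides): even for genuine white noise, scanning 20 lags against the ±2/√N limits will flag roughly one lag by chance.

```python
import numpy as np

rng = np.random.default_rng(4)
z = rng.normal(size=400)                  # white noise, N = 400
n = len(z)
bound = 2 / np.sqrt(n)                    # rough 95% limits under white noise

zc = z - z.mean()
flagged = []
for k in range(1, 21):
    r_k = (zc[: n - k] * zc[k:]).sum() / (zc * zc).sum()
    if abs(r_k) > bound:
        flagged.append(k)

# Testing 20 lags at roughly the 5% level, about one false positive
# is expected even though the series really is white noise.
print(bound, flagged)
```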

Partial autocorrelation
- Broadly speaking, the partial autocorrelation is the correlation between X_t and X_{t+k} with the effect of the intervening variables removed
- Sample partial autocorrelations are found from sample autocorrelations by solving a set of equations known as the Yule-Walker equations
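As a sketch of that computation (my own illustration of the Yule-Walker route; packages typically use the equivalent Durbin-Levinson recursion), the lag-k partial autocorrelation is the last coefficient obtained when the order-k Yule-Walker system is solved:

```python
import numpy as np

def pacf_from_acf(rho):
    """rho[0..K] with rho[0] = 1; returns the partial acf at lags 1..K."""
    out = []
    for k in range(1, len(rho)):
        # Order-k Yule-Walker system R a = r, with R Toeplitz in rho
        R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
        r = np.array(rho[1 : k + 1])
        a = np.linalg.solve(R, r)
        out.append(a[-1])                 # last coefficient = pacf at lag k
    return np.array(out)

# Check on an AR(1): rho_k = alpha^k, so the pacf should be alpha
# at lag 1 and zero at all higher lags.
alpha = 0.6
pvals = pacf_from_acf([alpha ** k for k in range(5)])
print(np.round(pvals, 4))                 # lag-1 value is alpha; rest ~ 0
```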

Model identification
- Plot the autocorrelations and partial autocorrelations for the series
- Use these to try to identify an appropriate model
- Consider stationary series first

Stationary series
- For an MA(q) process the autocorrelation is zero at lags greater than q; partial autocorrelations tail off in exponential fashion
- For an AR(p) process the partial autocorrelation is zero at lags greater than p; autocorrelations tail off in exponential fashion

Stationary series
- For mixed ARMA processes, both the acf and pacf will have large values up to q and p respectively, then tail off in an exponential fashion
- See graphs in M&W, pp. 136-137
- The approach used is to try fitting a model and then examine the residuals

Non-stationary series
- The existence of non-stationarity is indicated by an acf which is large at long lags
- Induce stationarity by differencing
- Differencing once is generally sufficient; twice may be needed
- Overdifferencing introduces autocorrelation and should be avoided

Estimation

Estimation
- We will always fit the model using Minitab
- AR models may be fitted by least squares, or by solving the Yule-Walker equations
- MA models require an iterative procedure
- ARMA models are like MA models

Minitab
- Minitab uses an iterative least squares approach to fitting ARMA models
- Standard errors can be calculated for the parameter estimates, so confidence intervals and tests of significance can be carried out

Diagnostic Checking

Diagnostic checking
- Based on residuals
- Residuals should be normally distributed, have zero mean, be uncorrelated, and should have minimum variance or dispersion

Procedures
- Plot residuals against time
- Draw histogram
- Obtain normal scores plot
- Plot acf and pacf of residuals
- Plot residuals against fitted values
- Note that residuals are not uncorrelated, but are approximately so at long lags

Procedures
- Portmanteau test
- Overfitting

Portmanteau test
- Box and Pierce proposed a statistic which tests the magnitudes of the residual autocorrelations as a group
- Their test was to compare Q below with the chi-square with K − p − q d.f. when fitting an ARMA(p, q) model:
  Q = N Σ_{k=1}^{K} r_k²

Portmanteau test
- Ljung and Box found that the test was not good unless n was very large
- Instead use the modified Box-Pierce or Ljung-Box-Pierce statistic; reject the model if Q* is too large:
  Q* = N(N + 2) Σ_{k=1}^{K} r_k² / (N − k)
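The Q* statistic is simple to compute directly. In the sketch below (my own helper, not from the slides), white-noise residuals give a modest value comparable to a chi-square with about K degrees of freedom, while strongly autocorrelated "residuals" give a very large one.

```python
import numpy as np

def ljung_box(residuals, K):
    """Modified Box-Pierce (Ljung-Box) statistic over lags 1..K."""
    e = np.asarray(residuals, dtype=float)
    n = len(e)
    ec = e - e.mean()
    denom = (ec * ec).sum()
    q = 0.0
    for k in range(1, K + 1):
        r_k = (ec[: n - k] * ec[k:]).sum() / denom
        q += r_k ** 2 / (n - k)
    return n * (n + 2) * q

rng = np.random.default_rng(5)
q_white = ljung_box(rng.normal(size=500), 20)           # white-noise residuals
q_bad = ljung_box(np.cumsum(rng.normal(size=500)), 20)  # autocorrelated

print(q_white)   # modest, comparable to a chi-square with ~20 d.f.
print(q_bad)     # far larger: such a model would be rejected
```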

Overfitting
- Suppose we think an AR(2) model is appropriate. We fit an AR(3) model
- The estimate of the additional parameter should not be significantly different from zero
- The other parameters should not change much
- This is an example of overfitting

Further Ideas

Other identification tools
- Chatfield (1979), JRSS A, among others, has suggested the use of the inverse autocorrelation to assist with identification of a suitable model
- Abraham & Ledolter (1984), Biometrika, show that although this cuts off after lag p for the AR(p) model, it is less effective than the partial autocorrelation for detecting the order
AIC
- The Akaike Information Criterion is minus twice the maximized log likelihood plus twice the number of parameters
- The number of parameters in the formula penalizes models with too many parameters
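A minimal sketch of how AIC trades fit against parameter count, assuming Gaussian errors so that −2 times the maximized log likelihood reduces to n·log(RSS/n) up to an additive constant. The fitting routine, data, and `ar_fit_aic` helper are all illustrative, not from the slides; all fits use the same effective sample so their RSS values are directly comparable.

```python
import numpy as np

def ar_fit_aic(x, p, pmax):
    """Least-squares AR(p) fit; Gaussian AIC ~ n*log(RSS/n) + 2p.
    Fits use observations t = pmax..n-1 so different orders are comparable."""
    y = x[pmax:]
    X = np.column_stack([x[pmax - j : len(x) - j] for j in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = ((y - X @ coef) ** 2).sum()
    n = len(y)
    return n * np.log(rss / n) + 2 * p, rss

rng = np.random.default_rng(7)
n = 2000
z = rng.normal(size=n)
x = np.empty(n)
x[0] = z[0]
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + z[t]          # true model is AR(1)

aic1, rss1 = ar_fit_aic(x, 1, 3)
aic3, rss3 = ar_fit_aic(x, 3, 3)
print(aic1, aic3)       # extra parameters always lower RSS, but the
print(rss3 <= rss1)     # 2p penalty usually makes AIC favor AR(1) here
```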

Parsimony
- One principle generally accepted is that models should be parsimonious: having as few parameters as possible
- Note that any ARMA model can be represented as a pure AR or pure MA model, but the number of parameters may be infinite

Parsimony
- AR models are easier to fit, so there is a temptation to fit a less parsimonious AR model when a mixed ARMA model is appropriate
- Ledolter & Abraham (1981), Technometrics, show that fitting unnecessary extra parameters, or an AR model when an MA model is appropriate, results in loss of forecast accuracy

Exponential smoothing
- Most exponential smoothing techniques are equivalent to fitting an ARIMA model of some sort
- Winters' multiplicative seasonal smoothing has no ARIMA equivalent
- Winters' additive seasonal smoothing has a very non-parsimonious ARIMA equivalent

Exponential smoothing
- For example, simple exponential smoothing is the optimal method of fitting the ARIMA(0, 1, 1) process
- Optimality is obtained by taking the smoothing parameter to be 1 − θ when the model is
  (1 − B)X_t = (1 − θB)Z_t
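This equivalence can be checked numerically (an illustrative sketch; the parameter values are my own): simulate (1 − B)X_t = (1 − θB)Z_t, apply simple exponential smoothing with smoothing parameter α = 1 − θ, and confirm that after a burn-in the one-step forecast errors recover the innovations Z_t, which is what "optimal" means here.

```python
import numpy as np

theta = 0.6
alpha = 1 - theta                         # smoothing parameter alpha = 1 - theta
rng = np.random.default_rng(6)
n = 5000
z = rng.normal(size=n)

# Simulate an ARIMA(0,1,1) process: (1 - B) X_t = (1 - theta B) Z_t
x = np.empty(n)
x[0] = z[0]
for t in range(1, n):
    x[t] = x[t - 1] + z[t] - theta * z[t - 1]

# Simple exponential smoothing; S_t is the one-step forecast of X_{t+1}
s = np.empty(n)
s[0] = x[0]
for t in range(1, n):
    s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]

# After burn-in the forecast errors X_t - S_{t-1} match Z_t
err = x[100:] - s[99:-1]
print(np.max(np.abs(err - z[100:])))      # tiny
```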
