Time Series Models
DJS 2/20/15
Topics
- Stochastic processes
- Stationarity
- White noise
- Random walk
- Moving average processes
- Autoregressive processes
- More general processes
Stochastic Processes
Stochastic processes
- Time series are an example of a stochastic process
- We are concerned only with discrete-time processes
Inference
- We usually base our inference on a single observation, or realization, of the process over some period of time, say [0, T] (a continuous interval of time) or at a sequence of time points {0, 1, 2, ..., T}
Specification of a process
- To describe a stochastic process completely, we must specify the joint distribution of the random variables comprising it
Specification of a process
- A simpler approach is to specify only the means and variances of the individual random variables
Autocovariance
- Because the random variables comprising the process are not independent, we must also specify their covariances:

  γ_{t1,t2} = Cov(X_{t1}, X_{t2})
Stationarity
Stationarity
- Inference is easiest when a process is stationary: its distribution does not change over time
- This is strict stationarity
- A process is weakly stationary if its mean and autocovariance functions do not change over time
Weak stationarity
- The mean is constant, μ_t = μ, and the autocovariance depends only on the time difference, or lag, between the two time points involved:

  γ_{t1,t2} = Cov(X_{t1}, X_{t2}) = γ_{t2−t1}
Autocorrelation
- It is useful to standardize the autocovariance by the variance, giving the autocorrelation function (acf):

  ρ_k = γ_k / γ_0
Autocorrelation
- More than one process can have the same acf
- Properties are:

  ρ_0 = 1,  ρ_{−k} = ρ_k,  |ρ_k| ≤ 1
White Noise
White noise
- This is a purely random process: a sequence of mutually independent, identically distributed random variables
Random Walk
Random walk
- Start with {Z_t} being white noise, or purely random, with mean μ and variance σ²
- The random walk is

  X_t = X_{t−1} + Z_t = Σ_{k=0}^{t} Z_k
Random walk
- The random walk is not stationary:

  E(X_t) = tμ,  Var(X_t) = tσ²
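The growing variance can be seen directly by simulation. The sketch below is a minimal pure-Python illustration (the helper name `random_walk` and all the constants are my choices, not from the slides): the variance of X_t across independent realizations grows roughly like tσ².

```python
import random

def random_walk(n, mu=0.0, sigma=1.0, seed=0):
    """Simulate X_t = X_{t-1} + Z_t, with Z_t ~ N(mu, sigma^2) and X_0 = 0."""
    rng = random.Random(seed)
    x, path = 0.0, []
    for _ in range(n):
        x += rng.gauss(mu, sigma)
        path.append(x)
    return path

# Var(X_t) = t * sigma^2 grows with t, so the process cannot be
# stationary.  Estimate Var(X_50) and Var(X_200) from 2000
# independent realizations (mu = 0, so E(X_t) = 0).
reps = 2000
var_50 = sum(random_walk(50, seed=s)[-1] ** 2 for s in range(reps)) / reps
var_200 = sum(random_walk(200, seed=s)[-1] ** 2 for s in range(reps)) / reps
```

With σ = 1 the two estimates should be close to 50 and 200, a four-fold increase matching the four-fold increase in t.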
Moving Average Processes
Moving average processes
- Start with {Z_t} being white noise, or purely random, with mean zero and variance σ_Z²
- The moving average process of order q, MA(q), is

  X_t = β_0 Z_t + β_1 Z_{t−1} + ... + β_q Z_{t−q}
Moving average processes
- The mean and variance are given by

  E(X_t) = 0,  Var(X_t) = σ_Z² Σ_{k=0}^{q} β_k²
Moving average processes
- If the Z_t's are normal then so is the process, and it is completely determined by the coefficients β_k, k = 0, ..., q
Moving average processes
- Note that the autocorrelation function cuts off at lag q: ρ_k = 0 for k > q
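The cut-off is easy to verify by simulation. Below is a minimal pure-Python sketch (the helper `sample_acf` and the choice β = 0.8 are my own illustration, not from the slides) of an MA(1) series whose sample acf is close to β/(1+β²) at lag 1 and near zero at all later lags.

```python
import random
import statistics

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag (divisor-N convention)."""
    n = len(x)
    m = statistics.fmean(x)
    c0 = sum((v - m) ** 2 for v in x) / n
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / (n * c0)
            for k in range(1, max_lag + 1)]

# MA(1): X_t = Z_t + beta * Z_{t-1}.  Theory gives
# rho_1 = beta / (1 + beta^2) and rho_k = 0 for k > 1,
# so the sample acf should "cut off" after lag q = 1.
rng = random.Random(42)
beta, n = 0.8, 20000
z = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]
x = [z[t] + beta * z[t - 1] for t in range(1, n + 1)]
r = sample_acf(x, 5)
rho1_theory = beta / (1 + beta ** 2)  # = 0.8 / 1.64, about 0.488
```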
Moving average processes
- In order to ensure there is a unique MA process corresponding to a given acf, we impose an invertibility condition
Moving average processes
- For general processes, introduce the backward shift operator B:

  B^j X_t = X_{t−j}
Moving average processes
- The general condition for invertibility is that all the roots of β(B) = 0, regarded as a polynomial in B, lie outside the unit circle
Autoregressive Processes
Autoregressive processes
- Assume {Z_t} is purely random, with mean zero and variance σ_Z²
Autoregressive processes
- The first order autoregression, AR(1), is

  X_t = α X_{t−1} + Z_t
Autoregressive processes
- From the previous equation we have

  X_t = Z_t / (1 − αB)
      = (1 + αB + α²B² + ...) Z_t
      = Z_t + α Z_{t−1} + α² Z_{t−2} + ...
Autoregressive processes
- Then E(X_t) = 0, and if |α| < 1,

  Var(X_t) = σ_X² = σ_Z² / (1 − α²)
  γ_k = α^k σ_Z² / (1 − α²)
  ρ_k = α^k
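These results can be checked numerically. The following pure-Python sketch (the function names, α = 0.7, and the burn-in length are my choices, not from the slides) simulates an AR(1) series and compares the sample variance and acf with σ_Z²/(1−α²) and α^k.

```python
import random
import statistics

def simulate_ar1(alpha, n, burn_in=500, seed=0):
    """Simulate X_t = alpha * X_{t-1} + Z_t with standard normal Z_t,
    discarding burn_in initial values so the start at X_0 = 0 is forgotten."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n + burn_in):
        x = alpha * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out[burn_in:]

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag (divisor-N convention)."""
    n = len(x)
    m = statistics.fmean(x)
    c0 = sum((v - m) ** 2 for v in x) / n
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / (n * c0)
            for k in range(1, max_lag + 1)]

alpha = 0.7
x = simulate_ar1(alpha, 20000, seed=7)
r = sample_acf(x, 3)
var_x = statistics.pvariance(x)
# Theory: Var(X_t) = 1 / (1 - alpha^2) and rho_k = alpha^k, i.e. a
# geometric decay (0.7, 0.49, 0.343, ...) rather than an MA-style cut-off.
```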
Autoregressive processes
- The AR(p) process can be written as

  (1 − α_1 B − α_2 B² − ... − α_p B^p) X_t = Z_t

  or

  X_t = Z_t / (1 − α_1 B − α_2 B² − ... − α_p B^p) = f(B) Z_t
Autoregressive processes
- Here

  f(B) = (1 − α_1 B − ... − α_p B^p)^{−1} = (1 + β_1 B + β_2 B² + ...)

  for some β_1, β_2, ...
- This gives X_t as an infinite MA process, so it has mean zero
Autoregressive processes
- Conditions are needed to ensure that this expansion converges, i.e. that the process is stationary
Autoregressive processes
- The β_i may not be easy to find, however
- Consider instead the auxiliary equation

  y^p − α_1 y^{p−1} − ... − α_p = 0
Autoregressive processes
- Then a necessary and sufficient condition for stationarity is that all the roots of the auxiliary equation are less than one in absolute value
ARMA processes
- Combine AR and MA processes to give the mixed ARMA(p, q) process
ARMA processes
- Alternative expressions are

  φ(B) = 1 − α_1 B − ... − α_p B^p
  θ(B) = 1 + β_1 B + ... + β_q B^q

  so that the ARMA model is φ(B) X_t = θ(B) Z_t
ARMA processes
- An ARMA process can be written in pure MA or pure AR forms, the operators being possibly of infinite order:

  X_t = ψ(B) Z_t
  π(B) X_t = Z_t
ARIMA processes
- General autoregressive integrated moving average processes: difference the series d times, W_t = (1 − B)^d X_t, and model W_t as a stationary ARMA process
ARIMA processes
- Alternatively, specify the process as

  φ(B) W_t = θ(B) Z_t

  or

  φ(B) (1 − B)^d X_t = θ(B) Z_t

- This is an ARIMA process of order (p, d, q)
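Differencing is simple to implement. The sketch below (pure Python; the helper `difference` and the toy simulation are mine, not from the slides) applies (1 − B) to a simulated random walk, the simplest ARIMA(0,1,0) process, and recovers the white-noise steps exactly.

```python
import random

def difference(x, d=1):
    """Apply (1 - B)^d to a series: take first differences d times."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

# A random walk is ARIMA(0,1,0): X_t itself is non-stationary, but
# W_t = (1 - B) X_t recovers the white-noise increments Z_t.
rng = random.Random(3)
z = [rng.gauss(0.0, 1.0) for _ in range(5000)]
x, total = [], 0.0
for step in z:
    total += step
    x.append(total)                 # X_t = Z_1 + ... + Z_t
w = difference(x, d=1)              # W_t = X_t - X_{t-1} = Z_t
```

Each differencing pass shortens the series by one observation, which is why an ARIMA(p, d, q) fit effectively works with N − d values.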
ARIMA processes
- The model for X_t is non-stationary when d ≥ 1; it is the differenced series W_t that is stationary
Non-zero mean
- We have assumed that the mean of the process is zero; a non-zero mean μ is handled by replacing X_t with X_t − μ in the model
The Box-Jenkins Approach
Topics
- Outline of the approach
Outline of the Box-Jenkins Approach
Box-Jenkins approach
- The approach is an iterative one involving:
  - model identification
  - model fitting
  - model checking
- If the model checking reveals problems, the process is repeated
Models
- Models to be fitted are from the ARIMA class of models (or the SARIMA class if the series is seasonal)
Autocorrelation
- Use the sample autocovariances

  c_k = (1/N) Σ_{t=1}^{N−k} (X_t − X̄)(X_{t+k} − X̄)

  and the sample autocorrelations r_k = c_k / c_0
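A direct transcription of the estimator (pure Python; the function names and the short toy series are mine, not from the slides):

```python
import statistics

def sample_autocovariance(x, k):
    """c_k with the divisor-N convention used on the slide."""
    n = len(x)
    xbar = statistics.fmean(x)
    return sum((x[t] - xbar) * (x[t + k] - xbar) for t in range(n - k)) / n

def sample_autocorrelation(x, k):
    """r_k = c_k / c_0."""
    return sample_autocovariance(x, k) / sample_autocovariance(x, 0)

# Toy series with mean 6: c_0 = 6.0 and r_1 = 0.5 (hand-checkable).
x = [2.0, 4.0, 6.0, 8.0, 10.0, 8.0, 6.0, 4.0]
r1 = sample_autocorrelation(x, 1)   # -> 0.5
```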
Autocovariances
- The sample autocovariances are biased estimators of the γ_k
Autocovariances
- A different divisor (N − k instead of N) can be used, which alters the bias
Autocorrelation
- It is more difficult to obtain the properties of the sample autocorrelations
- They are generally still biased
- When the process is white noise:

  E(r_k) ≈ −1/N,  Var(r_k) ≈ 1/N

- The correlations are approximately normal for N large
Autocorrelation
- This gives a rough test of whether an autocorrelation is non-zero
- If |r_k| > 2/√N, suspect the autocorrelation at that lag is non-zero
- Note that when examining many autocorrelations, the chance of falsely identifying a non-zero one increases
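The ±2/√N band can be illustrated by simulation. In the pure-Python sketch below (the helper name and all constants are my choices), most, but not necessarily all, of the first 20 sample autocorrelations of a white-noise series fall inside the band, which is exactly the multiple-comparison caveat on the slide.

```python
import math
import random
import statistics

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag (divisor-N convention)."""
    n = len(x)
    m = statistics.fmean(x)
    c0 = sum((v - m) ** 2 for v in x) / n
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / (n * c0)
            for k in range(1, max_lag + 1)]

# For white noise roughly 95% of the r_k should lie inside +/- 2/sqrt(N),
# so an occasional excursion outside the band is expected even when the
# true autocorrelations are all zero.
random.seed(11)
n = 2000
z = [random.gauss(0.0, 1.0) for _ in range(n)]
bound = 2.0 / math.sqrt(n)
r = sample_acf(z, 20)
inside = sum(1 for v in r if abs(v) <= bound)
```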
Partial autocorrelation
- Broadly speaking, the partial autocorrelation at lag k is the correlation between X_t and X_{t+k} with the effect of the intermediate observations removed
Model identification
- Plot the autocorrelations and partial autocorrelations, and compare them with the theoretical patterns of candidate models
Stationary series
- For a MA(q) process the acf cuts off at lag q
Stationary series
- For mixed ARMA processes, both the acf and the pacf die out gradually rather than cutting off
Non-stationary series
- The existence of non-stationarity is suggested by a sample acf that dies out very slowly; difference the series until it appears stationary
Estimation
Estimation
- We will always fit the model using Minitab
- AR models may be fitted by least squares, or by solving the Yule-Walker equations
- MA models require an iterative procedure
- ARMA models are like MA models
Minitab
- Minitab uses an iterative least squares procedure to fit these models
Diagnostic Checking
Diagnostic checking
- Based on residuals
- Residuals should be Normally distributed, have zero mean, be uncorrelated, and have minimum variance or dispersion
Procedures
- Plot residuals against time
- Draw a histogram
- Obtain a normal scores plot
- Plot the acf and pacf of the residuals
- Plot residuals against fitted values
- Note that the residuals are not uncorrelated, but are approximately so at long lags
Procedures
- Portmanteau test
- Overfitting
Portmanteau test
- Box and Pierce proposed a statistic based on the first K residual autocorrelations:

  Q = N Σ_{k=1}^{K} r_k²
Portmanteau test
- Ljung and Box found that the test performs poorly unless N is very large
- Instead use the modified Box-Pierce, or Ljung-Box, statistic:

  Q* = N(N+2) Σ_{k=1}^{K} r_k² / (N − k)

- Reject the model if Q* is too large
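The statistic is straightforward to compute from the residual autocorrelations. A minimal pure-Python sketch (the function names and the simulated residuals are my own illustration, not Minitab's implementation):

```python
import random
import statistics

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag (divisor-N convention)."""
    n = len(x)
    m = statistics.fmean(x)
    c0 = sum((v - m) ** 2 for v in x) / n
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / (n * c0)
            for k in range(1, max_lag + 1)]

def ljung_box(residuals, K):
    """Q* = N(N+2) * sum_{k=1..K} r_k^2 / (N - k)."""
    n = len(residuals)
    r = sample_acf(residuals, K)
    return n * (n + 2) * sum(r[k - 1] ** 2 / (n - k) for k in range(1, K + 1))

# Under the null of white-noise residuals, Q* is roughly chi-squared with
# K degrees of freedom (reduced by the number of fitted parameters), so a
# value far above K casts doubt on the fitted model.
random.seed(5)
resid = [random.gauss(0.0, 1.0) for _ in range(500)]
q_star = ljung_box(resid, K=10)
```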
Overfitting
- Suppose we think an AR(2) model is appropriate; fit a more general model, say AR(3), and check that the additional parameter is not significantly different from zero
Further Ideas
Other identification tools
- Chatfield (1979), JRSS A, among others, discusses additional identification tools
AIC
- The Akaike Information Criterion trades goodness of fit against the number of parameters:

  AIC = −2 log(maximized likelihood) + 2 × (number of parameters)

- Choose the model that minimizes AIC
Parsimony
- One principle generally accepted is that of parsimony: models should have as few parameters as possible
Parsimony
- AR models are easier to fit, so there is a temptation to fit a high-order AR model rather than a more parsimonious mixed model
Exponential smoothing
- Most exponential smoothing methods correspond to special cases of ARIMA models
Exponential smoothing
- For example, simple exponential smoothing gives optimal forecasts for an ARIMA(0, 1, 1) process
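As a concrete illustration, here is a minimal pure-Python sketch of simple exponential smoothing (the function name, the smoothing constant α = 0.5, and the toy series are mine, not from the slides): each one-step-ahead forecast is the current smoothed level, updated after every observation.

```python
def simple_exponential_smoothing(x, alpha, level0=None):
    """One-step-ahead forecasting with update
    level = alpha * x_t + (1 - alpha) * level,
    where alpha is the smoothing constant, between 0 and 1."""
    level = x[0] if level0 is None else level0
    forecasts = []
    for obs in x:
        forecasts.append(level)      # forecast made before seeing obs
        level = alpha * obs + (1 - alpha) * level
    return forecasts, level

series = [10.0, 12.0, 11.0, 13.0]
forecasts, next_forecast = simple_exponential_smoothing(series, alpha=0.5)
# forecasts == [10.0, 10.0, 11.0, 11.0]; the forecast of the next
# (unseen) observation is next_forecast == 12.0
```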