Prof U Dinesh Kumar, IIMB
Forecasting feeds planning at several levels:

- Business forecasting -> corporate strategy
- Aggregate forecasting -> aggregate production planning
- Demand forecasting at SKU level -> master production planning
- Item forecasting -> materials requirement planning and capacity requirement planning
Forecasting methods

- Qualitative techniques: expert opinion (or astrologers).
- Quantitative techniques: time series techniques such as exponential smoothing.
- Causal models: use information about relationships between system elements (e.g., regression).
Components of a time series:

- Trend component
- Cyclical component
- Seasonal component: a regular pattern of up-and-down movements, due to weather, customs, festivals, etc.
- Irregular component
Seasonal vs Cyclical

When a cyclical pattern in the data has a period of less than one year, it is referred to as seasonal variation. When the pattern has a period of more than one year, we refer to it as cyclical variation.
Irregular Component

Erratic fluctuations due to random variation or unforeseen events; also called white noise.
[Figure: demand plotted against time for four patterns: (a) trend, (b) cycle (more than one year between repetitions), (c) seasonal pattern (less than one year between repetitions), (d) trend with seasonal pattern. The irregular component appears as random movement.]
Decomposition of a time series:

Additive model:       Y_t = T_t + S_t + C_t + R_t
Multiplicative model: Y_t = T_t x S_t x C_t x R_t

where T_t is the trend, S_t the seasonal, C_t the cyclical, and R_t the irregular component. In the multiplicative model, Y_t / S_t is called the deseasonalized data.
Time series techniques:

- Moving average
- Exponential smoothing
- Auto-regression models (AR models)
- ARIMA (auto-regressive integrated moving average) models
Moving Average

The forecast for period t+1 (F_{t+1}) is given by the average of the n most recent observations:

F_{t+1} = (1/n) * Σ_{i=t-n+1}^{t} Y_i

where F_{t+1} is the forecast for period t+1 and Y_i is the data value for time period i.
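The moving-average forecast can be sketched in a few lines of Python (the function name and sample demand values are illustrative, not from the slides):

```python
def moving_average_forecast(y, n):
    """F_{t+1}: average of the n most recent observations in y."""
    if len(y) < n:
        raise ValueError("need at least n observations")
    return sum(y[-n:]) / n

# Illustrative daily demand; a 7-period moving average uses the last week.
demand = [41, 30, 40, 40, 40, 40, 40]
next_forecast = moving_average_forecast(demand, 7)
```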
[Figure: actual demand vs. forecast from a seven-period moving average, F_{t+1} = (1/7) * Σ_{i=t-6}^{t} Y_i.]
All Rights Reserved, Indian Institute of Management Bangalore
Forecast error: E_t = Y_t - F_t

MAE  = (1/n) * Σ_{t=1}^{n} |E_t|
MAPE = (1/n) * Σ_{t=1}^{n} |E_t / Y_t|
MSE  = (1/n) * Σ_{t=1}^{n} E_t^2
RMSE = sqrt( (1/n) * Σ_{t=1}^{n} E_t^2 )
For the seven-period moving-average forecast: MAPE = 0.1068 (10.68%), RMSE = 5.8199.
Exponential Smoothing

Smoothing equation (expanded form):

F_t = α Y_{t-1} + α(1-α) Y_{t-2} + α(1-α)^2 Y_{t-3} + ...
[Figure: daily demand from 18-09-2014 to 05-02-2015 with the fitted exponential smoothing forecast F_{t+1} = 0.8098 * Y_t + (1 - 0.8098) * F_t.]
With α = 0.8098: MAPE = 0.0906 (9.06%), RMSE = 5.3806.
Choice of α

For smooth data, try a high value of α: the forecast is responsive to the most recent data. For noisy data, try a low α: the forecast is more stable and less responsive.

The optimal α minimizes the mean squared error:

Min_α  Σ_{t=1}^{n} (Y_t - F_t)^2 / n
Recursive form: F_t = α Y_{t-1} + (1 - α) F_{t-1}
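The recursive form translates directly to code. A minimal sketch (function name illustrative; F_1 = Y_1 is one common initialization, not specified in the slides):

```python
def ses_forecasts(y, alpha):
    """Simple exponential smoothing via F_t = alpha*Y_{t-1} + (1-alpha)*F_{t-1}.
    Returns [F_1, ..., F_{n+1}], initializing F_1 = Y_1."""
    f = [y[0]]
    for obs in y:
        # Next forecast blends the newest observation with the old forecast.
        f.append(alpha * obs + (1 - alpha) * f[-1])
    return f
```

With α = 1 this reduces to the naive forecast (F_{t+1} = Y_t); with small α the forecast changes slowly.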
Holt's Method

Holt's method can be used to forecast when there is a linear trend present in the data. The method requires separate smoothing constants for the level (intercept) and the slope (trend).

Holt's equations:

(i)  Level:  L_t = α Y_t + (1 - α)(L_{t-1} + T_{t-1})
(ii) Trend:  T_t = β (L_t - L_{t-1}) + (1 - β) T_{t-1}

Forecast equation: F_{t+1} = L_t + T_t
The initial trend T_1 can be set in several ways, for example T_1 = Y_2 - Y_1, or regression can be used to obtain initial values.
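Holt's two updates can be sketched as follows (function name illustrative), using the initialization L_1 = Y_1, T_1 = Y_2 - Y_1 from above:

```python
def holt_forecast(y, alpha, beta):
    """One-step-ahead forecast F_{n+1} = L_n + T_n from Holt's equations,
    initialized with L_1 = Y_1 and T_1 = Y_2 - Y_1."""
    level, trend = y[0], y[1] - y[0]
    for obs in y[1:]:
        prev_level = level
        # Level update blends the observation with the trend-extended level.
        level = alpha * obs + (1 - alpha) * (level + trend)
        # Trend update smooths the change in level.
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + trend
```

For a perfectly linear series, the forecast extends the line one step ahead.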
[Figure: actual demand vs. Holt's method forecast; demand on the vertical axis (0-60), time period on the horizontal axis (0-140).]
With α = 0.8098 and β = 0.05: MAPE = 0.0930 (9.30%), RMSE = 5.5052.
Forecasting Accuracy

The forecast error is the difference between the actual value and the forecast value for the corresponding period:

E_t = Y_t - F_t

where E_t is the forecast error at period t, Y_t is the actual value at time period t, and F_t is the forecast for time period t.
MAE  = (1/n) * Σ_{t=1}^{n} |E_t|
MAPE = (1/n) * Σ_{t=1}^{n} |E_t / Y_t|
MSE  = (1/n) * Σ_{t=1}^{n} E_t^2
RMSE = sqrt( (1/n) * Σ_{t=1}^{n} E_t^2 )
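These four metrics are straightforward to compute together; a minimal sketch (function name illustrative):

```python
import math

def accuracy_metrics(actual, forecast):
    """MAE, MAPE, MSE and RMSE from paired actual/forecast values,
    using E_t = Y_t - F_t."""
    errors = [y - f for y, f in zip(actual, forecast)]
    n = len(errors)
    mae = sum(abs(e) for e in errors) / n
    mape = sum(abs(e / y) for e, y in zip(errors, actual)) / n
    mse = sum(e * e for e in errors) / n
    return mae, mape, mse, math.sqrt(mse)
```

Note that MAPE is undefined when any actual value Y_t is zero.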
Theil's coefficient:

U = sqrt( Σ_{t=1}^{n-1} (Y_{t+1} - F_{t+1})^2 / Σ_{t=1}^{n-1} (Y_{t+1} - Y_t)^2 )

The value U measures the relative forecasting power of the method against the naive technique. If U < 1, the technique is better than the naive forecast; if U > 1, the technique is no better than the naive forecast.
Theil's U for the fitted models: 1.221; exponential smoothing: 1.704; 1.0310.
Theil's Coefficients

U1 = sqrt( Σ_{t=1}^{n} (Y_t - F_t)^2 ) / [ sqrt( Σ_{t=1}^{n} Y_t^2 ) + sqrt( Σ_{t=1}^{n} F_t^2 ) ]

U2 = sqrt( Σ_{t=1}^{n-1} ( (F_{t+1} - Y_{t+1}) / Y_t )^2 / Σ_{t=1}^{n-1} ( (Y_{t+1} - Y_t) / Y_t )^2 )
U1 is bounded between 0 and 1, with values closer to zero indicating greater accuracy. If U2 = 1, there is no difference between the naive forecast and the forecasting technique; if U2 < 1, the technique is better than the naive forecast; if U2 > 1, the technique is no better than the naive forecast.
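U2 can be computed directly from its definition; a minimal sketch (function name illustrative), where `forecast[t]` is the forecast for period t:

```python
import math

def theil_u2(actual, forecast):
    """Theil's U2: forecast errors relative to the naive (no-change)
    forecast, each scaled by the current actual value."""
    num = sum(((f1 - y1) / y0) ** 2
              for y0, y1, f1 in zip(actual[:-1], actual[1:], forecast[1:]))
    den = sum(((y1 - y0) / y0) ** 2
              for y0, y1 in zip(actual[:-1], actual[1:]))
    return math.sqrt(num / den)
```

Feeding in the naive forecast itself (F_{t+1} = Y_t) yields U2 = 1 exactly, and a perfect forecast yields U2 = 0, matching the interpretation above.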
Seasonal Effect

The seasonal effect is measured, and the data seasonally adjusted, using a seasonal index computed by either:
- the method of simple averages, or
- the ratio-to-moving-average method.

The seasonal index for day i (or month i) is the ratio of the average for day i (or month i) to the average of the daily (or monthly) averages, multiplied by 100.
Daily demand and seasonality indices (method of simple averages):

Day        | Week 1 (5-11 Oct) | Week 2 (12-18 Oct) | Week 3 (19-25 Oct) | Week 4 (26 Oct-1 Nov) | Daily Average | Seasonality Index
Sunday     | 41 | 40 | 40 | 41 | 40.50 | 97.34%
Monday     | 30 | 44 | 43 | 41 | 39.50 | 94.94%
Tuesday    | 40 | 49 | 41 | 40 | 42.50 | 102.15%
Wednesday  | 40 | 50 | 46 | 43 | 44.75 | 107.55%
Thursday   | 40 | 45 | 41 | 46 | 43.00 | 103.34%
Friday     | 40 | 40 | 40 | 45 | 41.25 | 99.14%
Saturday   | 40 | 42 | 32 | 45 | 39.75 | 95.54%

Average of daily averages = 41.61. The indices total 700, i.e., the number of seasons x 100.
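The method of simple averages can be sketched as follows (function name illustrative; the data are the values from the table above):

```python
def seasonal_indices(rows):
    """Method of simple averages: index_i = (average of season i /
    average of the seasonal averages) * 100."""
    averages = [sum(r) / len(r) for r in rows]
    grand = sum(averages) / len(averages)
    return [100 * a / grand for a in averages]

# Rows = days Sunday..Saturday, columns = weeks 1..4.
demand = [
    [41, 40, 40, 41],  # Sunday
    [30, 44, 43, 41],  # Monday
    [40, 49, 41, 40],  # Tuesday
    [40, 50, 46, 43],  # Wednesday
    [40, 45, 41, 46],  # Thursday
    [40, 40, 40, 45],  # Friday
    [40, 42, 32, 45],  # Saturday
]
indices = seasonal_indices(demand)
```

By construction the indices always sum to the number of seasons x 100, the check stated in the slide.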
Deseasonalized Data

Date       | Demand | Seasonality Index | Deseasonalized Demand
10/5/2014  | 41 | 97.34%  | 42.12
10/6/2014  | 30 | 94.94%  | 31.60
10/7/2014  | 40 | 102.15% | 39.16
10/8/2014  | 40 | 107.55% | 37.19
10/9/2014  | 40 | 103.34% | 38.70
10/10/2014 | 40 | 99.14%  | 40.35
10/11/2014 | 40 | 95.54%  | 41.87
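Each entry in the last column is the demand divided by the seasonal index; a one-line sketch (function name illustrative):

```python
def deseasonalize(demand, index_pct):
    """Deseasonalized demand = actual demand / (seasonal index / 100)."""
    return demand / (index_pct / 100)
```

For example, deseasonalize(41, 97.34) reproduces the 42.12 in the first row of the table.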
Trend

SUMMARY OUTPUT (regression of deseasonalized demand on day)

Regression Statistics:
Multiple R: 0.268288; R Square: 0.071978; Adjusted R Square: 0.036285; Standard Error: 3.715395; Observations: 28.

ANOVA:
            df | SS       | MS       | F        | Significance F
Regression   1 | 27.83729 | 27.83729 | 2.016587 | 0.167469
Residual    26 | 358.9082 | 13.80416 |          |
Total       27 | 386.7455 |          |          |

           | t Stat   | P-value  | Lower 95% | Upper 95%
Intercept  | 27.59786 | 8.7E-21  | 36.85166  | 42.78296
Day        | 1.420066 | 0.167469 | -0.05524  | 0.30211
Forecast

F_t = T_t x S_t
F_29 = T_29 x S_29
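The two steps can be combined in code: fit a least-squares trend line to the deseasonalized data, then multiply the trend forecast by the seasonal index (function names are illustrative):

```python
def fit_trend(y):
    """Least-squares line T_t = a + b*t fitted to data y (t = 1..n)."""
    n = len(y)
    t_vals = list(range(1, n + 1))
    t_bar = sum(t_vals) / n
    y_bar = sum(y) / n
    b = (sum((t - t_bar) * (yi - y_bar) for t, yi in zip(t_vals, y))
         / sum((t - t_bar) ** 2 for t in t_vals))
    a = y_bar - b * t_bar
    return a, b

def forecast_with_seasonality(a, b, t, index_pct):
    """F_t = T_t x S_t, with T_t = a + b*t and S_t the seasonal index."""
    return (a + b * t) * index_pct / 100
```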
Auto-Correlation

The lag-k autocorrelation is

r_k = Σ_{t=k+1}^{n} (Y_{t-k} - Ȳ)(Y_t - Ȳ) / Σ_{t=1}^{n} (Y_t - Ȳ)^2
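The lag-k autocorrelation translates directly to code; a minimal sketch (function name illustrative):

```python
def autocorrelation(y, k):
    """Lag-k autocorrelation r_k: cross-products of deviations at lag k,
    divided by the total sum of squared deviations."""
    n = len(y)
    ybar = sum(y) / n
    num = sum((y[t - k] - ybar) * (y[t] - ybar) for t in range(k, n))
    den = sum((v - ybar) ** 2 for v in y)
    return num / den
```

At lag 0 the numerator equals the denominator, so r_0 = 1 always.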
Partial Auto-Correlation

For any lag k, reject H0 (that the partial autocorrelation is zero) if |p_k| > 1.96/sqrt(n), where n is the number of observations.
Stationarity

A time series is stationary if its mean, variance, and autocorrelation structure do not change over time.

[Figure: examples of non-stationary and stationary series.]
White Noise
Auto-Regression

Model: Y_t = β Y_{t-1} + ε_t

OLS estimate: β̂ = Σ_{t=2}^{n} Y_t Y_{t-1} / Σ_{t=2}^{n} Y_{t-1}^2
AR(1) Process

Y_t = φ Y_{t-1} + ε_t
Y_t = φ (φ Y_{t-2} + ε_{t-1}) + ε_t

Substituting back repeatedly to the initial value X_0:

Y_t = φ^t X_0 + Σ_{i=0}^{t-1} φ^i ε_{t-i}

Assume {ε_t} is purely random with mean zero and constant standard deviation (white noise). Then the autoregressive process of order p, or AR(p) process, is

Y_t = β_0 + β_1 Y_{t-1} + β_2 Y_{t-2} + ... + β_p Y_{t-p} + ε_t
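The no-intercept OLS estimator given earlier can be sketched in a few lines (function name illustrative):

```python
def ar1_ols(y):
    """OLS estimate of beta in Y_t = beta*Y_{t-1} + e_t (no intercept):
    beta_hat = sum(Y_t * Y_{t-1}) / sum(Y_{t-1}^2) over t = 2..n."""
    num = sum(y1 * y0 for y0, y1 in zip(y[:-1], y[1:]))
    den = sum(v * v for v in y[:-1])
    return num / den
```

For a series that exactly doubles each period the estimate is 2, and for a constant series it is 1.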
Identifying AR models from the ACF and PACF:

Model | ACF                  | PACF
AR(1) | Decays exponentially | Cuts off after lag 1
AR(p) | Decays               | Cuts off after lag p

[PACF plot: the first two partial autocorrelations differ significantly from zero, suggesting an AR(2) model.]
AR(2) model: MAPE = 8.8%.
Moving Average (MA) Models

Y_t = X_t - μ (the series is mean-centered)

Y_t = β_0 + β_1 ε_{t-1} + β_2 ε_{t-2} + ... + β_q ε_{t-q} + ε_t

MA(q) models each future observation as a function of the q previous errors.
Identifying MA models from the ACF and PACF:

Model | ACF                  | PACF
MA(1) | Cuts off after lag 1 | Decays exponentially
MA(q) | Cuts off after lag q | Decays
[SPSS output and residual plot for the fitted model.]
ARMA(p,q) Model

An ARMA(p,q) model combines an AR(p) model and an MA(q) model. ARMA(p,q) models are not easy to identify; we usually start with a pure AR or MA process. The following rule of thumb may be used:

Process   | ACF       | PACF
ARMA(p,q) | Tails off | Tails off
Model selection by BIC, where m is the number of parameters estimated in the model and n is the number of observations:

Model     | BIC
AR(2)     | 3.295
MA(1)     | 3.301
ARMA(2,1) | 3.343

The AR(2) model has the lowest BIC and is therefore preferred.
ARIMA

An ARIMA model has three components:
- Auto-regression (p)
- Integration, i.e., differencing (d)
- Moving average (q)

written as ARIMA(p, d, q).
Differencing

An ARIMA(p,1,q) process models the first difference X_t = Y_t - Y_{t-1}:

X_t = β_0 + β_1 X_{t-1} + β_2 X_{t-2} + ... + β_p X_{t-p} + α_1 ε_{t-1} + α_2 ε_{t-2} + ... + α_q ε_{t-q} + ε_t
Ljung-Box Test

Q_m = n(n+2) Σ_{k=1}^{m} r_k^2 / (n - k)

where r_k is the lag-k autocorrelation of the residuals.
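The statistic can be computed directly from the residual autocorrelations; a minimal sketch (function name illustrative):

```python
def ljung_box_q(residuals, m):
    """Q_m = n*(n+2) * sum_{k=1..m} r_k^2 / (n-k), with r_k the lag-k
    autocorrelation of the residuals."""
    n = len(residuals)
    mean = sum(residuals) / n
    den = sum((r - mean) ** 2 for r in residuals)
    total = 0.0
    for k in range(1, m + 1):
        # Lag-k autocorrelation of the residual series.
        rk = sum((residuals[t - k] - mean) * (residuals[t] - mean)
                 for t in range(k, n)) / den
        total += rk * rk / (n - k)
    return n * (n + 2) * total
```

A large Q_m relative to the chi-square critical value indicates that the residuals are not white noise.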
Box-Jenkins Methodology

1. Identification: identify the ARIMA model using ACF and PACF plots; this gives the values of p, d, and q.
2. Estimation: estimate the model parameters (using maximum likelihood).
3. Diagnostics: check the residuals for problems, such as the residuals not being white noise.