
Demand Forecasting

Outline:
- Four Fundamental Approaches
- Time Series: General Concepts
- Evaluating Forecasts: How good is it?
- Forecasting Methods (Stationary): Cumulative Mean, Naïve Forecast, Moving Average, Exponential Smoothing
- Forecasting Methods (Trends & Seasonality): OLS Regression, Holt's Method, Exponential Method for Seasonal Data, Winters Model, Other Models

Demand Forecasting
- Forecasting is difficult, especially for the future
- Forecasts are always wrong
- The less aggregated, the lower the accuracy
- The longer the time horizon, the lower the accuracy
- The past is usually a pretty good place to start
- Everything exhibits seasonality of some sort
- A good forecast is not just a number; it should include a range, a description of the distribution, etc.
- Any analytical method should be supplemented by external information
- A forecast for one function in a company might not be useful to another function (Sales to Marketing to Manufacturing to Transportation)

Four Fundamental Approaches

Subjective
- Judgmental: sales force surveys, Delphi techniques, jury of experts
- Experimental: customer surveys, focus group sessions, test marketing

Objective
- Causal / Relational: econometric models, leading indicators, input-output models
- Time Series: a "black box" approach that uses the past to predict the future

Time Series Concepts

1. Time Series: data captured on a regular and recurring basis, used to forecast
2. Stationarity: values hover around a mean
3. Trend: persistent movement in one direction
4. Seasonality: movement periodic to the calendar
5. Cycle: periodic movement not tied to the calendar
6. Pattern + Noise: the predictable and random components of a time series
7. Generating Process: the equation that creates the time series
8. Accuracy and Bias: closeness to actual values vs. a persistent tendency to over- or under-predict
9. Fit versus Forecast: the tradeoff between accuracy on past data and usefulness for prediction
10. Forecast Optimality: the forecast error equals the random noise

Evaluating Forecasts
- Visual review
- Errors
- Error measures: MPE and MAPE
- Tracking signal

Demand Forecasting
Generate the large number of short-term, SKU-level, locally disaggregated demand forecasts required for production, logistics, and sales to operate successfully. Focus on:
- Forecasting product demand
- Mature products (not new product releases)
- Short time horizons (weeks, months, quarters, a year)
- Use of models to assist in the forecast
- Cases where demand of items is independent

Forecasting Terminology

[Chart: historical demand over time, divided into the Initialization, Ex-Post Forecast, and Forecast regions]

Forecasting Terminology
"We are now looking at a future from here, and the future we were looking at in February now includes some of our past, and we can incorporate the past into our forecast. 1993, the first half, which is now the past and was the future when we issued our first forecast, is now over."
Laura D'Andrea Tyson, Head of the President's Council of Economic Advisors, quoted in November 1993 in the Chicago Tribune, explaining why the Administration reduced its projections of economic growth to 2 percent from the 3.1 percent it predicted in February.

Forecasting Problem
Suppose your fraternity/sorority house consumed the following number of cases of beer over the last 6 weekends: 8, 5, 7, 3, 6, 9. How many cases do you think your fraternity/sorority will consume this weekend?

[Chart: cases consumed by week, weeks 1-6]

Forecasting: Simple Moving Average Method

Using a three-period moving average, we would get the following forecast:

F(7) = (3 + 6 + 9) / 3 = 6

[Chart: the three-period moving average forecast plotted with the weekly cases]

Forecasting: Simple Moving Average Method

What if we used a two-period moving average?

F(7) = (6 + 9) / 2 = 7.5

[Chart: the two-period moving average forecast plotted with the weekly cases]

Forecasting: Simple Moving Average Method

The number of periods used in the moving average forecast affects the responsiveness of the forecasting method: the fewer the periods, the more responsive the forecast.

[Chart: 1-, 2-, and 3-period moving average forecasts plotted against the weekly cases]

Forecasting Terminology
Applying this terminology to our problem using the three-period Moving Average forecast:

t     A(t)   F(t)
1     8      -
2     5      -
3     7      -
4     3      6.67
5     6      5
6     9      5.33
7     -      6
8     -      6
9     -      6
10    -      6

Initialization: t = 1-3; Ex-Post Forecast: t = 4-6; Forecast: t = 7-10
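The moving-average arithmetic above can be sketched in a few lines of Python (an illustrative sketch, not part of the original slides):

```python
def moving_average_forecast(history, n):
    """Forecast the next period as the mean of the last n actual values."""
    if len(history) < n:
        raise ValueError("need at least n observations")
    return sum(history[-n:]) / n

demand = [8, 5, 7, 3, 6, 9]                  # cases of beer, weeks 1-6
f7_3pd = moving_average_forecast(demand, 3)  # (3 + 6 + 9) / 3 = 6.0
f7_2pd = moving_average_forecast(demand, 2)  # (6 + 9) / 2 = 7.5
```

Freezing the forecast beyond week 7 (F(8) = F(9) = F(10)) simply reuses the last computed value, since no new actuals arrive.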

Forecasting: Weighted Moving Average Method

Rather than equal weights, it might make sense to use weights which favor more recent consumption values. With the Weighted Moving Average, we have to select weights that are individually greater than zero and less than 1, and that as a group sum to 1:

Valid weights: (.5, .3, .2), (.6, .3, .1), (1/2, 1/3, 1/6)
Invalid weights: (.5, .2, .1), (.6, -.1, .5), (.5, .4, .3, .2)

Forecasting: Weighted Moving Average Method

A Weighted Moving Average forecast with weights of (1/6, 1/3, 1/2), oldest to newest, is performed as follows:

F(7) = (1/6)(3) + (1/3)(6) + (1/2)(9) = 7

How do you make the Weighted Moving Average forecast more responsive?
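The weighted version, with the validity checks the slide describes, might be sketched as follows (illustrative helper, not from the slides):

```python
def weighted_moving_average(history, weights):
    """Weights are ordered oldest-to-newest over the last len(weights) periods,
    must each be positive, and must sum to 1."""
    if any(w <= 0 for w in weights) or abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must be positive and sum to 1")
    recent = history[-len(weights):]
    return sum(w * a for w, a in zip(weights, recent))

demand = [8, 5, 7, 3, 6, 9]
f7 = weighted_moving_average(demand, [1/6, 1/3, 1/2])  # (1/6)(3) + (1/3)(6) + (1/2)(9) = 7.0
```

Making the forecast more responsive means shifting more of the weight onto the most recent periods.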

Forecasting: Exponential Smoothing

Exponential Smoothing is designed to give the benefits of the Weighted Moving Average forecast without the cumbersome problem of specifying weights. In Exponential Smoothing, there is only one parameter, α:

F(t+1) = αA(t) + (1 - α)F(t)

α = smoothing constant (between 0 and 1)

Forecasting: Exponential Smoothing

Initialization, two common choices:

F(2) = A(1)
F(3) = αA(2) + (1 - α)F(2)

or

F(2) = [A(1) + A(2)] / 2
F(3) = αA(2) + (1 - α)F(2)

Forecasting: Exponential Smoothing


Using α = 0.4 and F(2) = [A(1) + A(2)] / 2:

t     A(t)   F(t)
1     8      -
2     5      6.5
3     7      5.9
4     3      6.34
5     6      5
6     9      5.4
7     -      6.84
8     -      6.84
9     -      6.84
10    -      6.84

Initialization: t = 1-2; Ex-Post Forecast: t = 3-6; Forecast: t = 7-10
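The smoothing recursion, initialized with the two-point average, can be sketched as follows (illustrative Python, not from the slides; the results agree with the slide's table to two decimals because the table rounds each step):

```python
def exp_smoothing(actuals, alpha):
    """F(2) = [A(1) + A(2)] / 2, then F(t+1) = alpha*A(t) + (1 - alpha)*F(t).
    Returns a dict mapping period t to forecast F(t)."""
    f = {2: (actuals[0] + actuals[1]) / 2}
    for t in range(2, len(actuals) + 1):
        f[t + 1] = alpha * actuals[t - 1] + (1 - alpha) * f[t]
    return f

f = exp_smoothing([8, 5, 7, 3, 6, 9], alpha=0.4)   # F(2)=6.5, F(3)=5.9, ..., F(7)~6.84
```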

Forecasting: Exponential Smoothing


The weight effectively placed on each past period depends on α (shown for a 7-period history; period 7 is the most recent):

Period   Weight       α=0.1     α=0.3     α=0.5     α=0.7     α=0.9
1        α(1-α)^6     0.05314   0.03529   0.00781   0.00051   0.00000
2        α(1-α)^5     0.05905   0.05042   0.01563   0.00170   0.00001
3        α(1-α)^4     0.06561   0.07203   0.03125   0.00567   0.00009
4        α(1-α)^3     0.0729    0.1029    0.0625    0.0189    0.0009
5        α(1-α)^2     0.081     0.147     0.125     0.063     0.009
6        α(1-α)       0.09      0.21      0.25      0.21      0.09
7        α            0.1       0.3       0.5       0.7       0.9

Forecasting: Exponential Smoothing


[Chart: weight placed on each of the last 7 periods for α = 0.1, 0.3, 0.5, 0.7, and 0.9; larger α concentrates the weight on the most recent periods]

Outliers

t            1     2     3     4     5     6      7      8      9      10
A(t)         8     5     6     3     4     15     -      -      -      -
F(t), α=0.3  -     6.50  6.05  6.04  5.12  4.79   7.85   7.85   7.85   7.85
F(t), α=0.7  -     6.50  5.45  5.84  3.85  3.96   11.69  11.69  11.69  11.69

The actual at t = 6 (15) is an outlier; the larger the smoothing constant, the more strongly the subsequent forecasts chase it.

[Chart: the two smoothed forecasts reacting to the outlier]

Data with Trends

[Chart: a demand series with a persistent upward trend]

Data with Trends

[Chart: A(t) and exponential smoothing forecasts for α = 0.3, 0.5, 0.7, 0.9; the smoothed forecasts lag behind the trending data]

Forecasting: Simple Linear Regression Model


Simple linear regression can be used to forecast data with trends
D = a + bI

where D is the regressed forecast value (the dependent variable), I is the independent variable, a is the intercept of the regression line, and b is its slope.

[Chart: regression line with intercept a and slope b fitted through the data]

Forecasting: Simple Linear Regression Model


[Chart: data points and the fitted regression line; the vertical distance from each point to the line is its error]

In linear regression, the sum of squared errors is minimized.

Forecasting: Simple Linear Regression Model

D = a + bI

b = [n·Σ(I_i D_i) - (Σ I_i)(Σ D_i)] / [n·Σ(I_i^2) - (Σ I_i)^2]

a = (1/n)·Σ D_i - (b/n)·Σ I_i

(all sums run over i = 1, ..., n)
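The slope and intercept formulas translate directly into code (an illustrative sketch, checked against the four points later used to initialize Holt's model):

```python
def ols_fit(I, D):
    """Least-squares intercept a and slope b for D = a + b*I."""
    n = len(I)
    s_i, s_d = sum(I), sum(D)
    b = (n * sum(i * d for i, d in zip(I, D)) - s_i * s_d) / \
        (n * sum(i * i for i in I) - s_i ** 2)
    a = s_d / n - b * s_i / n
    return a, b

a, b = ols_fit([1, 2, 3, 4], [32, 38, 50, 61])   # a = 20.5, b = 9.9
```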

Limitations in Linear Regression Model


[Chart: a long demand history in which the trend shifts over time]

As with the simple moving average model, all data points count equally with simple linear regression.

Forecasting: Holt's Trend Model

To forecast data with trends, we can use an exponential smoothing model with trend, frequently known as Holt's model:

L(t) = αA(t) + (1 - α)F(t)
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1)
F(t+1) = L(t) + T(t)


We could use linear regression to initialize the model

Holt's Trend Model: Initialization

Data: t = 1, ..., 15; A(t) = 32, 38, 50, 61, 52, 63, 72, 53, 99, 92, 121, 153, 183, 179, 224

First, we'll initialize the model with a regression on the first four points:

x     y     x^2   xy
1     32    1     32
2     38    4     76
3     50    9     150
4     61    16    244
Sum: x̄ = 2.5, ȳ = 45.25, Σx^2 = 30, Σxy = 502

b = [Σxy - n·ȳ·x̄] / [Σx^2 - n·x̄^2] = [502 - 4(45.25)(2.5)] / [30 - 4(2.5)^2] = 49.5 / 5 = 9.9

a = ȳ - b·x̄ = 45.25 - (9.9)(2.5) = 20.5

L(4) = 20.5 + 4(9.9) = 60.1
T(4) = 9.9

Holt's Trend Model: Updating (α = 0.3, β = 0.4)

t    A(t)   L(t)   T(t)   F(t)
1    32
2    38
3    50
4    61     60.1   9.9
5    52     64.6   7.74   70
6                         72.34

L(t) = αA(t) + (1 - α)F(t):  L(5) = 0.3(52) + 0.7(70) = 64.6
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1):  T(5) = 0.4[64.6 - 60.1] + 0.6(9.9) = 7.74
F(t+1) = L(t) + T(t):  F(6) = 64.6 + 7.74 = 72.34

Holt's Trend Model: Updating (α = 0.3, β = 0.4)

t    A(t)   L(t)    T(t)   F(t)
4    61     60.1    9.9
5    52     64.60   7.74   70
6    63     69.54   6.62   72.34
7                          76.16

L(6) = 0.3(63) + 0.7(72.34) = 69.54
T(6) = 0.4[69.54 - 64.60] + 0.6(7.74) = 6.62
F(7) = 69.54 + 6.62 = 76.16
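One update step of Holt's recursions can be sketched as follows (illustrative Python, not from the slides):

```python
def holt_update(L_prev, T_prev, A, alpha, beta):
    """One Holt update: returns the new level, new trend, and next-period forecast."""
    F = L_prev + T_prev                       # forecast that was made for this period
    L = alpha * A + (1 - alpha) * F
    T = beta * (L - L_prev) + (1 - beta) * T_prev
    return L, T, L + T

# initialization from the regression: L(4) = 60.1, T(4) = 9.9; then A(5) = 52
L5, T5, F6 = holt_update(60.1, 9.9, 52, alpha=0.3, beta=0.4)
# L(5) = 64.6, T(5) = 7.74, F(6) = 72.34
```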

Holt's Model Results

t    A(t)   L(t)     T(t)    F(t)
1    32     -        -       -
2    38     -        -       -
3    50     -        -       -
4    61     60.1     9.9     -
5    52     64.60    7.74    70
6    63     69.54    6.62    72.34
7    72     74.91    6.12    76.16
8    53     72.62    2.76    81.03
9    99     82.46    5.59    75.38
10   92     89.24    6.06    88.06
11   121    103.01   9.15    95.30
12   153    124.41   14.05   112.16
13   183    151.82   19.39   138.46
14   179    173.55   20.33   171.22
15   224    202.92   23.94   193.88
16   -      -        -       226.86
17   -      -        -       250.80
18   -      -        -       274.74
19   -      -        -       298.68

Initialization: t = 1-4; Ex-Post Forecast: t = 5-15; Forecast: t = 16-19

Holt's Model Results

[Chart: actuals, the Holt's model forecast, and the regression line, with the Initialization, Ex-Post Forecast, and Forecast regions marked]

Forecasting: Seasonal Model (No Trend)

        Spring   Summer   Fall   Winter
2003    16       27       39     22
2004    16       26       43     23
2005    14       29       41     22

[Chart: quarterly sales, Spring 2003 through Winter 2005, showing a repeating seasonal pattern around a stable level]

Seasonal Model Formulas

L(t) = α[A(t) / S(t-p)] + (1 - α)L(t-1)
S(t) = γ[A(t) / L(t)] + (1 - γ)S(t-p)
F(t+1) = L(t) · S(t+1-p)

p is the number of periods in a season:
Quarterly data: p = 4
Monthly data: p = 12
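A single update of the no-trend seasonal model might be sketched like this (illustrative; the parameter values and the rounded factor S = 0.60 are assumptions here, so the numbers differ slightly from the slide's table, which carries unrounded factors):

```python
def seasonal_update(L_prev, S_old, A, alpha, gamma):
    """One update of the no-trend seasonal model.
    S_old is the seasonal factor from one full season (p periods) ago."""
    L = alpha * (A / S_old) + (1 - alpha) * L_prev
    S = gamma * (A / L) + (1 - gamma) * S_old
    return L, S

# initialization: L(8) = 26.5, spring factor S(5) = 0.60; new spring actual A(9) = 14
L9, S9 = seasonal_update(26.5, 0.60, 14, alpha=0.4, gamma=0.3)
```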

Seasonal Model Initialization

Quarter       A(t)   Quarter Average   Seasonal Factor S(t)
2003 Spring   16
2003 Summer   27
2003 Fall     39
2003 Winter   22
2004 Spring   16     16.0              16.0 / 26.5 = 0.60
2004 Summer   26     26.5              26.5 / 26.5 = 1.00
2004 Fall     43     41.0              41.0 / 26.5 = 1.55
2004 Winter   23     22.5              22.5 / 26.5 = 0.85

Average sales per quarter = 26.5

S(5) = 0.60, S(6) = 1.00, S(7) = 1.55, S(8) = 0.85; L(8) = 26.5

Seasonal Model Forecasting

t    Quarter       A(t)   L(t)    S(t)   F(t)
5    2004 Spring   16             0.60
6    2004 Summer   26             1.00
7    2004 Fall     43             1.55
8    2004 Winter   23     26.50   0.85
9    2005 Spring   14     25.18   0.59   16.00
10   2005 Summer   29     26.71   1.03   25.18
11   2005 Fall     41     26.62   1.55   41.32
12   2005 Winter   22     26.34   0.84   22.60
13   2006 Spring                         15.53
14   2006 Summer                         27.02
15   2006 Fall                           40.69
16   2006 Winter                         22.25

(The slide lists smoothing constants 0.3 and 0.4; the table values are consistent with α = 0.4 and γ = 0.3, with unrounded seasonal factors carried through the updates.)

Seasonal Model Forecasting

[Chart: actual quarterly sales and the seasonal model forecast over periods 1-16]

Forecasting: Winters Model for Data with Trend and Seasonal Components

L(t) = α[A(t) / S(t-p)] + (1 - α)[L(t-1) + T(t-1)]
T(t) = β[L(t) - L(t-1)] + (1 - β)T(t-1)
S(t) = γ[A(t) / L(t)] + (1 - γ)S(t-p)
F(t+1) = [L(t) + T(t)] · S(t+1-p)
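One Winters update step in Python (a sketch, not from the slides; the numbers reproduce the worked example that appears later in the lecture, starting from its decomposition-based initialization):

```python
def winters_update(L_prev, T_prev, S_old, A, alpha, beta, gamma):
    """One Winters update; S_old is the seasonal factor from one season ago."""
    L = alpha * (A / S_old) + (1 - alpha) * (L_prev + T_prev)
    T = beta * (L - L_prev) + (1 - beta) * T_prev
    S = gamma * (A / L) + (1 - gamma) * S_old
    return L, T, S

def winters_forecast(L, T, S_next):
    return (L + T) * S_next

# initialization: L(8) = 162.88, T(8) = 7.71, spring factor S = 0.80; A(9) = 152
L9, T9, S9 = winters_update(162.88, 7.71, 0.80, 152, alpha=0.3, beta=0.4, gamma=0.2)
F10 = winters_forecast(L9, T9, 1.35)    # next period uses the summer factor
```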

Seasonal-Trend Model Decomposition


To initialize Winters Model, we will use Decomposition Forecasting, which itself can be used to make forecasts.

Decomposition Forecasting
There are two ways to decompose forecast data with trend and seasonal components:
- Use regression to get the trend, then use the trend line to get the seasonal factors
- Use averaging to get the seasonal factors, deseasonalize the data, then use regression to get the trend


Decomposition Forecasting
The following data contains trend and seasonal components:
Period   Quarter   Sales
1        Spring    90
2        Summer    157
3        Fall      123
4        Winter    93
5        Spring    128
6        Summer    211
7        Fall      163
8        Winter    122

[Chart: quarterly sales over periods 1-8, trending upward with a seasonal pattern]

Decomposition Forecasting
The seasonal factors are obtained by the same method used for the Seasonal Model forecast:
Period   Quarter   Sales   Qtr. Ave.   Seas. Factor
1        Spring    90      109         0.80
2        Summer    157     184         1.35
3        Fall      123     143         1.05
4        Winter    93      107.5       0.79
5        Spring    128
6        Summer    211
7        Fall      163
8        Winter    122

Average sales = 135.9; the seasonal factors average to 1.00.

Decomposition Forecasting
With the seasonal factors, the data can be deseasonalized by dividing each observation by its seasonal factor:

Sales   Seas. Factor   Deseas. Data
90      0.80           112.2
157     1.35           115.9
123     1.05           116.9
93      0.79           117.5
128     0.80           159.6
211     1.35           155.8
163     1.05           154.9
122     0.79           154.2

[Chart: deseasonalized data, roughly linear and trending upward]

Regression on the De-seasonalized data will give the trend

Decomposition Forecasting Regression Results


Period (X)   Deseas. Sales (Y)   X^2   XY
1            112.2               1     112.2
2            115.9               4     231.8
3            116.9               9     350.7
4            117.5               16    470
5            159.6               25    798
6            155.8               36    934.8
7            154.9               49    1084.3
8            154.2               64    1233.6
Sum: x̄ = 4.5, ȳ = 135.9, ΣX^2 = 204, ΣXY = 5215.4

m = [Σxy - n·ȳ·x̄] / [Σx^2 - n·x̄^2] = [5215.4 - 8(135.9)(4.5)] / [204 - 8(4.5)^2] = 324 / 42 = 7.71

b = ȳ - m·x̄ = 135.9 - (7.71)(4.5) = 101.2

Decomposition Forecast
Regression on the deseasonalized data produces the following results:

Slope (m) = 7.71
Intercept (b) = 101.2

Forecasts can be made using the following equation:

F(x) = [mx + b] × (seasonal factor)
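The whole decomposition forecast fits in a few lines (a sketch; it carries the unrounded seasonal factors, which appears to be what the slide's forecast column does, since rounded factors give slightly different numbers):

```python
# Quarterly averages (Spring, Summer, Fall, Winter) from the two years of data
qtr_avg = [109, 184, 143, 107.5]
overall = sum(qtr_avg) / 4                  # 135.875
factors = [q / overall for q in qtr_avg]    # about 0.80, 1.35, 1.05, 0.79

m, b = 7.71, 101.2                          # slope and intercept from the regression

def decomp_forecast(period):
    """[m*period + b] times the seasonal factor for that quarter."""
    return (m * period + b) * factors[(period - 1) % 4]

f1 = decomp_forecast(1)    # about 87.4
f9 = decomp_forecast(9)    # about 136.8
```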

Decomposition Forecasting

Period   Quarter   Sales   Forecast
1        Spring    90      87.4
2        Summer    157     157.9
3        Fall      123     130.8
4        Winter    93      104.5
5        Spring    128     112.1
6        Summer    211     199.7
7        Fall      163     163.3
8        Winter    122     128.8
9        Spring            136.8
10       Summer            241.4
11       Fall              195.7
12       Winter            153.2

[Chart: actual sales and the decomposition forecast over periods 1-12]

Winters Model Initialization


We can use the decomposition forecast to define the following Winters Model parameters:

L(n) = b + m·n
T(n) = m
S(j) = seasonal factor for the corresponding quarter

So from our previous model, we have:

L(8) = 101.2 + 8(7.71) = 162.88
T(8) = 7.71
S(5) = 0.80, S(6) = 1.35, S(7) = 1.05, S(8) = 0.79

Winters Model Example


α = 0.3, β = 0.4, γ = 0.2

Period   Quarter   Sales   L(t)     T(t)    S(t)   F(t)
1        Spring    90
2        Summer    157
3        Fall      123
4        Winter    93
5        Spring    128                      0.80
6        Summer    211                      1.35
7        Fall      162                      1.05
8        Winter    122     162.88   7.71    0.79
9        Spring    152     176.41   10.04   0.81   136.47
10       Summer    303     197.85   14.60   1.39   251.71
11       Fall      232     215.00   15.62   1.06   223.07
12       Winter    171     226.37   13.92   0.78   182.19
13       Spring                                    195.19
14       Summer                                    352.41
15       Fall                                      283.09
16       Winter                                    220.87

Winters Model Example


[Chart: actual sales and the Winters Model forecast over periods 1-16]

Evaluating Forecasts

"Trust, but verify." - Ronald W. Reagan

Computer software gives us the ability to mess up more data on a greater scale more efficiently. While software like SAP can automatically select models and model parameters for a set of data, and usually does so correctly, when the data is important a human should review the model results. One of the best tools is the human eye.

Visual Review
How would you evaluate this forecast?
[Chart: an actual series and a forecast that diverges from it over periods 1-15]

Forecast Evaluation
The forecast is evaluated over the Ex-Post Forecast region; do not include initialization data in the evaluation.

[Chart: demand history divided into Initialization, Ex-Post Forecast, and Forecast regions, with the evaluation covering the Ex-Post Forecast region]

Errors
[Chart: forecast vs. actual data over the Ex-Post Forecast region, t = 20-40]

All error measures compare the forecast model to the actual data for the Ex-Post Forecast region.

Errors Measure
t     F(t)   A(t)   Error F(t) - A(t)   |F(t) - A(t)|
25    95     91     4                   4
26    125    137    -12                 12
27    197    193    4                   4
28    227    199    28                  28
29    230    278    -48                 48
30    274    344    -70                 70
31    274    291    -17                 17
32    255    250    5                   5
33    244    171    73                  73
34    211    152    59                  59
35    114    111    3                   3
36    85     127    -42                 42
Sum                 -13                 365

All error measures are based on the comparison of forecast values to actual values in the Ex-Post Forecast region; do not include data from initialization.

Bias and MAD


Using the error table (n = 12):

Bias = Σ[F(t) - A(t)] / n = -13 / 12 = -1.08

MAD = Σ|F(t) - A(t)| / n = 365 / 12 = 30.42

Bias and MAD


Bias tells us whether we have a tendency to over- or under-forecast. If our forecasts are in the middle of the data, then the errors should be equally positive and negative, and should sum to 0. MAD (Mean Absolute Deviation) is the average error magnitude, ignoring whether the error is positive or negative. Errors are bad, and the closer to zero an error is, the better the forecast is likely to be. Error measures tell how well the method worked in the Ex-Post Forecast region; how well the forecast will work in the future is uncertain.
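Bias and MAD can be computed together over the ex-post region (illustrative Python using the error-table data from the slides):

```python
def bias_and_mad(forecasts, actuals):
    """Bias = mean signed error; MAD = mean absolute error."""
    errors = [f - a for f, a in zip(forecasts, actuals)]
    n = len(errors)
    return sum(errors) / n, sum(abs(e) for e in errors) / n

F = [95, 125, 197, 227, 230, 274, 274, 255, 244, 211, 114, 85]
A = [91, 137, 193, 199, 278, 344, 291, 250, 171, 152, 111, 127]
bias, mad = bias_and_mad(F, A)   # bias = -13/12 ~ -1.08, mad = 365/12 ~ 30.42
```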

Absolute vs. Relative Measures


Forecasts were made for two sets of data. Which forecast was better?

Data Set 1: Bias = 18.72, MAD = 43.99
Data Set 2: Bias = 182, MAD = 912.5

[Charts: Data Set 1 (values up to about 1,000) and Data Set 2 (values up to about 30,000), each with its forecast]

MPE and MAPE


When the numbers in a data set are larger in magnitude, then the error measures are likely to be large as well, even though the fit might not be as good. Mean Percentage Error (MPE) and Mean Absolute Percentage Error (MAPE) are relative forms of the Bias and MAD, respectively. MPE and MAPE can be used to compare forecasts for different sets of data.

MPE and MAPE


Mean Percentage Error (MPE):
MPE = Σ{ [F(t) - A(t)] / A(t) } / n

Mean Absolute Percentage Error (MAPE):
MAPE = Σ{ |F(t) - A(t)| / A(t) } / n
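The relative measures are the same sums with each error scaled by the actual value (illustrative Python, checked against Data Set 1 from the slides):

```python
def mpe_and_mape(forecasts, actuals):
    """Mean Percentage Error and Mean Absolute Percentage Error."""
    pe = [(f - a) / a for f, a in zip(forecasts, actuals)]
    n = len(pe)
    return sum(pe) / n, sum(abs(p) for p in pe) / n

A1 = [177, 275, 363, 91, 194, 376, 659, 146, 219, 514, 875, 130]
F1 = [125.4, 338.85, 493.2, 89.1, 176, 463.05, 658.8, 116.7, 226.6, 587.25, 824.4, 144.3]
mpe, mape = mpe_and_mape(F1, A1)   # mpe ~ 0.037, mape ~ 0.148
```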

MPE and MAPE


Data Set 1

t    A(t)   F(t)     [F(t)-A(t)]/A(t)   |F(t)-A(t)|/A(t)
9    177    125.4    -0.292             0.292
10   275    338.85   0.232              0.232
11   363    493.2    0.359              0.359
12   91     89.1     -0.021             0.021
13   194    176      -0.093             0.093
14   376    463.05   0.232              0.232
15   659    658.8    0.000              0.000
16   146    116.7    -0.201             0.201
17   219    226.6    0.035              0.035
18   514    587.25   0.143              0.143
19   875    824.4    -0.058             0.058
20   130    144.3    0.110              0.110
Sum                  0.446              1.774

MPE = 0.446 / 12 = 0.037
MAPE = 1.774 / 12 = 0.148

MPE and MAPE


Data Set 2

t    A(t)    F(t)    [F(t)-A(t)]/A(t)   |F(t)-A(t)|/A(t)
9    6332    5973    -0.057             0.057
10   12994   15147   0.166              0.166
11   21325   20844   -0.023             0.023
12   3527    3582    0.016              0.016
13   7283    6765    -0.071             0.071
14   14963   17091   0.142              0.142
15   24325   23436   -0.037             0.037
16   4054    4014    -0.010             0.010
17   8173    7557    -0.075             0.075
18   16804   19035   0.133              0.133
19   27458   26028   -0.052             0.052
20   4496    4446    -0.011             0.011
Sum                  0.121              0.792

MPE = 0.121 / 12 = 0.010
MAPE = 0.792 / 12 = 0.066

MPE and MAPE


Comparing the two data sets on the relative measures:

Data Set 1: MPE = 0.446 / 12 = 0.037; MAPE = 1.774 / 12 = 0.148
Data Set 2: MPE = 0.121 / 12 = 0.010; MAPE = 0.792 / 12 = 0.066

By relative error, the forecast for Data Set 2 is the better fit.

[Charts: Data Set 1 and Data Set 2 shown with their forecasts for comparison]

Tracking Signal
What's happened in this situation? How could we detect this in an automatic forecasting environment?

[Chart: actual data rising away from a forecast that fails to keep up, periods 1-15]

Tracking Signal
The tracking signal can be calculated after each actual sales value is recorded. The tracking signal is calculated as:

TS(t) = RSFE / MAD(t) = Σ[F(t) - A(t)] / { Σ|F(t) - A(t)| / n }

The tracking signal is a relative measure, like MPE and MAPE, so it can be compared to a set value (typically 4 or 5 in absolute value) to identify when forecasting parameters and/or models need to be changed.
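Computed incrementally, the tracking signal looks like this (illustrative Python; the forecasts and actuals are periods 3-15 of the lecture's example):

```python
def tracking_signals(forecasts, actuals):
    """Tracking signal after each period: running sum of errors / running MAD."""
    rsfe, abs_sum, signals = 0.0, 0.0, []
    for n, (f, a) in enumerate(zip(forecasts, actuals), start=1):
        err = f - a
        rsfe += err
        abs_sum += abs(err)
        signals.append(rsfe / (abs_sum / n))
    return signals

F = [14.6, 15.8, 14.6, 15.4, 14.6, 17.1, 19.2, 23.2, 27.8, 30.4, 33.5, 38.7, 42.7]
A = [11.4, 18.7, 11.8, 17.2, 12.9, 22.9, 24.0, 32.6, 38.5, 36.6, 40.6, 51.0, 51.9]
ts = tracking_signals(F, A)   # drops past -4 once the forecast falls persistently behind
```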

Tracking Signal
t    A(t)   F(t)   F(t)-A(t)   RSFE    |F(t)-A(t)|   Σ|F(t)-A(t)|   MAD    TS
1    15.1
2    16.8   15.9
3    11.4   14.6   3.2         3.2     3.2           3.2            3.20   1.00
4    18.7   15.8   -2.9        0.3     2.9           6.1            3.05   0.10
5    11.8   14.6   2.8         3.1     2.8           8.9            2.97   1.04
6    17.2   15.4   -1.8        1.3     1.8           10.7           2.68   0.49
7    12.9   14.6   1.7         3.0     1.7           12.4           2.48   1.21
8    22.9   17.1   -5.8        -2.8    5.8           18.2           3.03   -0.92
9    24.0   19.2   -4.8        -7.6    4.8           23.0           3.29   -2.31
10   32.6   23.2   -9.4        -17.0   9.4           32.4           4.05   -4.20
11   38.5   27.8   -10.7       -27.7   10.7          43.1           4.79   -5.78
12   36.6   30.4   -6.2        -33.9   6.2           49.3           4.93   -6.88
13   40.6   33.5   -7.1        -41.0   7.1           56.4           5.13   -8.00
14   51.0   38.7   -12.3       -53.3   12.3          68.7           5.73   -9.31
15   51.9   42.7   -9.2        -62.5   9.2           77.9           5.99   -10.43

Tracking Signal

[Chart: the same series; at t = 11 the tracking signal reaches -5.78, beyond a typical threshold of 4 in absolute value, flagging the forecast for review]
