
Forecasting

Prediction is very difficult,
especially if it's about the future.
Niels Bohr
Objectives
Give the fundamental rules of forecasting

Calculate a forecast using a moving


average, weighted moving average, and
exponential smoothing

Calculate the accuracy of a forecast


What is forecasting?

Forecasting is a tool used for predicting


future demand based on
past demand information.
Forecasting Horizons
Long Term
5+ years into the future
R&D, plant location, product planning
Principally judgment-based
Medium Term
1 season to 2 years
Aggregate planning, capacity planning, sales forecasts
Mixture of quantitative methods and judgment
Short Term
1 day to 1 year, less than 1 season
Demand forecasting, staffing levels, purchasing, inventory
levels
Quantitative methods
Short Term Forecasting:
Needs and Uses
Scheduling existing resources
How many employees do we need and when?
How much product should we make in anticipation of demand?
Acquiring additional resources
When are we going to run out of capacity?
How many more people will we need?
How large will our back-orders be?
Determining what resources are needed
What kind of machines will we require?
Which services are growing in demand? Which are declining?
What kind of people should we be hiring?
Why is forecasting important?

Demand for products and services is usually uncertain.


Forecasting can be used for
Strategic planning (long range planning)
Finance and accounting (budgets and cost controls)
Marketing (future sales, new products)
Production and operations
Forecasting for Strategic Business Planning

Among the decisions that require long-term forecasts:

Capital expansion projects.
Proposals to develop a new product line.
Merger or acquisition opportunities.
Forecasts are usually stated in very general terms.
Useful techniques: Management judgment, economic
growth models, regression.
Forecasting for Sales and Operations
Planning
Provide the basis for plans that are usually stated in
terms of planned sales and output of product
families in dollars or some other aggregate
measure.
Useful techniques: Aggregation of detailed forecast,
customer plans, regression
A common means of producing an aggregate forecast for
S&OP is to sum the forecasts for the individual products
in each product line.
Forecasting for Master Production
Scheduling and Control

The flow of forecast information to the MPS is


frequent and detailed.
Useful techniques: Projection techniques (moving
averages, exponential smoothing)
What is forecasting all about?

Demand for Mercedes E Class: we try to predict the
future by looking back at the past.

(Chart: actual demand (past sales) for Jan through Aug, and the predicted
demand obtained by looking back six months.)
What's Forecasting All About?
From the March 10, 2006 WSJ:

Ahead of the Oscars, an economics professor, at the request of Weekend


Journal, processed data about this year's films nominated for best picture
through his statistical model and predicted with 97.4% certainty that
"Brokeback Mountain" would win. Oops. Last year, the professor tuned
his model until it correctly predicted 18 of the previous 20 best-picture
awards; then it predicted that "The Aviator" would win; "Million Dollar
Baby" won instead.

Sometimes models tuned to prior results don't have great predictive


powers.
Some general characteristics of forecasts

First law of forecasting: Forecasts are always


wrong
Second law of forecasting: Detailed forecasts are
worse than aggregate forecasts.
Forecasts are more accurate for groups or families of items
Third law of forecasting: The further into the
future, the less reliable the forecast will be.
Forecasts are more accurate for shorter time periods

Every forecast should include an error estimate


Forecasts are no substitute for calculated demand.
Key issues in forecasting

1. A forecast is only as good as the information included in


the forecast (past data)
2. History is not a perfect predictor of the future (i.e.: there
is no such thing as a perfect forecast)

REMEMBER: Forecasting is based on the assumption


that the past predicts the future! When forecasting, think
carefully whether or not the past is strongly related to
what you expect to see in the future
Example: Mercedes E-class vs. M-class Sales
Month E-class Sales M-class Sales
Jan 23,345 -
Feb 22,034 -
Mar 21,453 -
Apr 24,897 -
May 23,561 -
Jun 22,684 -
Jul ? ?

Question: Can we predict the new model M-class sales
based on the data in the table?

Answer: Maybe... We need to consider how much the


two markets have in common
What should we consider when looking at
past demand data?

Trends

Seasonality

Cyclical elements

Autocorrelation

Random variation
Some Important Questions

What is the purpose of the forecast?


Which systems will use the forecast?
How important is the past in estimating the future?

Answers will help determine time horizons,


techniques, and level of detail for the forecast.
Forecasting Steps
1. What needs to be forecast?
Level of detail, units of analysis & time horizon required
2. What data is available to evaluate?
Identify needed data & whether it's available
3. Select and test the forecasting model
Cost, ease of use & accuracy
4. Generate the forecast
5. Monitor forecast accuracy over time
Types of Forecasting Models
Qualitative (technological) methods:
Forecasts generated subjectively by the forecaster

Quantitative (statistical) methods:


Forecasts generated through mathematical
modeling
Qualitative Methods
Type               Characteristics                        Strengths                          Weaknesses
Executive          A group of managers meets and          Good for strategic or              One person's opinion can
opinion            comes up with a forecast               new-product forecasting            dominate the forecast
Market             Uses surveys and interviews to         Good determinant of                It can be difficult to
research           identify customer preferences          customer preferences               develop a good questionnaire
Delphi             Seeks to develop a consensus           Excellent for forecasting          Time consuming to develop
method             among a group of experts               long-term product demand and
                                                          technological changes
Statistical Forecasting
Time Series Models:
Assumes the future will follow same patterns as the past

Causal Models:
Explores cause-and-effect relationships
Uses leading indicators to predict the future
E.g. housing starts and appliance sales

Simulation:
Models that can incorporate some randomness and non-
linear effects
Composition of Time Series Data

Data = historic pattern + random variation


Historic pattern may include:
Level (long-term average)
Trend
Seasonality
Cycle
Time Series Patterns
Methods of Forecasting the Level
Naïve Forecasting
Simple Mean
Moving Average
Weighted Moving Average
Exponential Smoothing
Time Series Problem
Determine the forecast for period 11 using:
Naïve forecast
Simple average
3- and 5-period moving average
3-period weighted moving average with weights 0.5, 0.3, and 0.2
Exponential smoothing with alpha = 0.2 and 0.5

Period   Orders
1        122
2        91
3        100
4        77
5        115
6        58
7        75
8        128
9        111
10       88
11       ?
Time Chart of Orders Data
(Plot of the orders for periods 1 through 10.)
Naïve Forecasting

Next period's forecast = last period's actual:

Ft+1 = At
Simple Average (Mean)
Next period's forecast = average of all
historical data

Ft+1 = (At + At-1 + At-2 + ...) / n
Time Series: Moving average
The moving average model uses the last n periods in order to
predict demand in period t+1.
There can be two types of moving average models: simple
moving average and weighted moving average
The moving average model assumption is that the most accurate
prediction of future demand is a simple (linear) combination of
past demand.
Time series: simple moving
average
In the simple moving average models the forecast value is:

Ft+1 = (At + At-1 + ... + At-n+1) / n

t is the current period.


Ft+1 is the forecast for next period
n is the forecasting horizon (how far back we look),
A is the actual sales figure from each period.
Moving Average
Next period's forecast = simple average
of the last N periods

Ft+1 = (At + At-1 + ... + At-N+1) / N
Example: forecasting sales at Kroger
Kroger sells (among other stuff) bottled spring water

Month   Bottles
Jan     1,325
Feb     1,353
Mar     1,305
Apr     1,275
May     1,210
Jun     1,195
Jul     ?

What will the sales be for July?
What if we use a 3-month simple moving average?

FJul = (AJun + AMay + AApr) / 3 = 1,227

What if we use a 5-month simple moving average?

FJul = (AJun + AMay + AApr + AMar + AFeb) / 5 = 1,268
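To make the arithmetic concrete, a minimal Python sketch of the moving-average forecast applied to the bottled-water data above (function and variable names are illustrative, not from the slides):

```python
def moving_average_forecast(history, n):
    """Forecast the next period as the mean of the last n observations."""
    if len(history) < n:
        raise ValueError("need at least n observations")
    return sum(history[-n:]) / n

# Kroger bottled-water sales, Jan through Jun
sales = [1325, 1353, 1305, 1275, 1210, 1195]

print(round(moving_average_forecast(sales, 3)))  # 1227
print(round(moving_average_forecast(sales, 5)))  # 1268
```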
(Chart: actual sales with the 3-month and 5-month moving-average forecasts.)

What do we observe?

The 5-month average smooths the data more;
the 3-month average is more responsive.
The Effect of the Parameter N

A smaller N makes the forecast more


responsive
A larger N makes the forecast more
stable
Time series: weighted moving
average
We may want to give more importance to some of the data
Ft+1 = wt At + wt-1 At-1 + ... + wt-n At-n
wt + wt-1 + ... + wt-n = 1

t is the current period.


Ft+1 is the forecast for next period
n is the forecasting horizon (how far back we look),
A is the actual sales figure from each period.
w is the importance (weight) we give to each period
Why do we need the WMA models?
Because of the ability to give more importance to what
happened recently, without losing the impact of the past.

(Chart: demand for Mercedes E-class, Jan through Aug, showing actual demand (past sales),
the prediction using a 6-month SMA, and the prediction using a 6-month WMA.)

For a 6-month SMA, attributing equal weights to all
past data, we miss the downward trend.
Example: Kroger sales of bottled
water
Month   Bottles
Jan     1,325
Feb     1,353
Mar     1,305
Apr     1,275
May     1,210
Jun     1,195
Jul     ?

What will be the sales for July?
6-month simple moving average

FJul = (AJun + AMay + AApr + AMar + AFeb + AJan) / 6 = 1,277

In other words, because we used equal weights, a slight


downward trend that actually exists is not observed
What if we use a weighted moving
average?
Make the weights for the last three months more than the
first three months

                 6-month SMA   WMA 40% / 60%   WMA 30% / 70%   WMA 20% / 80%
July Forecast    1,277         1,267           1,257           1,247

The higher the importance we give to recent data, the more


we pick up the declining trend in our forecast.
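A sketch of the weighted moving average for the table above. The slides do not spell out the individual weights, so this assumes the 40/60 (30/70, 20/80) split is shared equally within the older three months and within the more recent three months:

```python
def weighted_moving_average(history, weights):
    """Forecast the next period as a weighted sum of the last len(weights) observations.
    weights[0] applies to the oldest observation in the window; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    window = history[-len(weights):]
    return sum(w * a for w, a in zip(weights, window))

sales = [1325, 1353, 1305, 1275, 1210, 1195]   # Jan ... Jun

for old_share, new_share in [(0.4, 0.6), (0.3, 0.7), (0.2, 0.8)]:
    weights = [old_share / 3] * 3 + [new_share / 3] * 3
    print(round(weighted_moving_average(sales, weights)))
# prints 1267, 1257, 1247 -- matching the WMA columns above
```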
How do we choose weights?
1. Depending on the importance that we feel past data has
2. Depending on known seasonality (weights of past data
can also be zero).

WMA is better than SMA


because of the ability to
vary the weights!
Time Series: Exponential Smoothing
(ES)
Main idea: The prediction of the future depends mostly on the
most recent observation, and on the error for the latest
forecast.

Smoothing constant alpha (α): denotes the importance of the past error.
Why use exponential smoothing?

1. Uses less storage space for data


2. Extremely accurate
3. Easy to understand
4. Little calculation complexity
5. There are simple accuracy tests
Exponential smoothing: the method
Assume that we are currently in period t. We calculated the
forecast for the last period (Ft-1) and we know the actual
demand last period (At-1)

Ft = Ft-1 + α (At-1 - Ft-1)
Ft = α At-1 + (1 - α) Ft-1
0 ≤ α ≤ 1
The smoothing constant α expresses how much our forecast
will react to observed differences.
If α is low: there is little reaction to differences.
If α is high: there is a lot of reaction to differences.
Example: bottled water at Kroger
Month   Actual   Forecast (α = 0.2)
Jan     1,325    1,370
Feb     1,353    1,361
Mar     1,305    1,359
Apr     1,275    1,349
May     1,210    1,334
Jun     ?        1,309
Example: bottled water at Kroger

Month   Actual   Forecast (α = 0.8)
Jan     1,325    1,370
Feb     1,353    1,334
Mar     1,305    1,349
Apr     1,275    1,314
May     1,210    1,283
Jun     ?        1,225
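A minimal sketch of the recursion Ft = Ft-1 + α (At-1 - Ft-1) that reproduces both tables above; the January forecast of 1,370 is the given starting value, and the names are illustrative:

```python
def exponential_smoothing(history, alpha, initial_forecast):
    """Return forecasts for periods 1..len(history)+1, starting from the given initial forecast."""
    forecasts = [initial_forecast]
    for actual in history:
        previous = forecasts[-1]
        forecasts.append(previous + alpha * (actual - previous))
    return forecasts

sales = [1325, 1353, 1305, 1275, 1210]          # Jan ... May actuals

print([round(f) for f in exponential_smoothing(sales, 0.2, 1370)])
# [1370, 1361, 1359, 1349, 1334, 1309]  -> June forecast 1,309
print([round(f) for f in exponential_smoothing(sales, 0.8, 1370)])
# [1370, 1334, 1349, 1314, 1283, 1225]  -> June forecast 1,225
```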
Impact of the smoothing constant

(Chart: actual sales vs. exponential smoothing forecasts with α = 0.2 and α = 0.8.)
Time Series Problem Solution

Period  Orders  Naïve     Simple   Moving     Moving     Weighted Moving  Exp. Smoothing  Exp. Smoothing
        (A)     Forecast  Average  Avg (N=3)  Avg (N=5)  Avg (N=3)        (α = 0.2)       (α = 0.5)
1       122                                                               122             122
2       91      122       122                                             122             122
3       100     91        107                                             116             107
4       77      100       104      104                   102              113             104
5       115     77        98       89                    87               106             91
6       58      115       101      97         101        101              108             103
7       75      58        94       83         88         79               98              81
8       128     75        91       83         85         78               93              78
9       111     128       96       87         91         98               100             103
10      88      111       97       105        97         109              102             107
11              88        97       109        92         103              99              98
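A compact sketch that reproduces the period-11 row of the table. It assumes, as the table does, that the exponential smoothing forecast is initialized with the first actual (122); all names are illustrative:

```python
orders = [122, 91, 100, 77, 115, 58, 75, 128, 111, 88]   # periods 1..10

naive = orders[-1]                                             # 88
simple_avg = sum(orders) / len(orders)                         # 96.5 (97 in the table)
ma3 = sum(orders[-3:]) / 3                                     # 109
ma5 = sum(orders[-5:]) / 5                                     # 92
wma3 = 0.5 * orders[-1] + 0.3 * orders[-2] + 0.2 * orders[-3]  # 102.9 (103 in the table)

def exp_smooth(history, alpha):
    forecast = history[0]            # initialize with the first actual
    for actual in history[1:]:
        forecast = forecast + alpha * (actual - forecast)
    return forecast

es_02 = exp_smooth(orders, 0.2)      # about 99.4 (99 in the table)
es_05 = exp_smooth(orders, 0.5)      # about 97.5 (98 in the table)
```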
Trend..
What do you think will happen to a
moving average or exponential smoothing
model when there is a trend in the data?
Impact of trend

(Chart: actual sales vs. forecast over time.)

Regular exponential smoothing will always
lag behind the trend.
Can we include trend analysis in exponential
smoothing?
Exponential smoothing with trend
FIT: Forecast including trend
δ: trend smoothing constant (chosen by the user)

FITt = Ft + Tt
Ft = FITt-1 + α (At-1 - FITt-1)
Tt = Tt-1 + δ (Ft - FITt-1)

The idea is that the two effects are decoupled
(F is the forecast without trend and T is the trend component).
Example: bottled water at Kroger

Month   At      Ft      Tt     FITt      (α = 0.8, δ = 0.5)
Jan     1325    1380    -10    1370
Feb     1353    1334    -28    1306
Mar     1305    1344    -9     1334
Apr     1275    1311    -21    1290
May     1210    1278    -27    1251
Jun     ?       1218    -43    1175
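A sketch of the trend-adjusted recursion that reproduces the table above, using the January values F = 1,380 and T = -10 as the starting point (names are illustrative):

```python
def forecast_including_trend(history, alpha, delta, f0, t0):
    """Return (F, T, FIT) per period, starting from the given initial F0 and T0."""
    f, t = f0, t0
    rows = [(f, t, f + t)]
    for actual in history:
        fit_prev = f + t
        f = fit_prev + alpha * (actual - fit_prev)   # smoothed level
        t = t + delta * (f - fit_prev)               # smoothed trend
        rows.append((f, t, f + t))
    return rows

actuals = [1325, 1353, 1305, 1275, 1210]             # Jan ... May
for f, t, fit in forecast_including_trend(actuals, alpha=0.8, delta=0.5, f0=1380, t0=-10):
    print(round(f), round(t), round(fit))
# Jan: 1380 -10 1370 ... Jun: 1218 -43 1175
```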
Exponential Smoothing with Trend

(Chart: actual sales vs. forecasts with α = 0.2, α = 0.8, and α = 0.8 with δ = 0.5.)
Forecasting For Seasonal Series
Seasonality corresponds to a pattern in the data
that repeats at regular intervals. (See figure next
slide)
For instance, ice cream and air conditioners have peaks
associated with summer. Toys and gift items
experience spikes in demand right before
Christmas.
Multiplicative seasonal factors: c1, c2, . . . , cN where
i = 1 is the first season of the cycle, i = 2 is the second season
of the cycle, etc.
Σ ci = N
ci = 1.25 implies a demand 25% higher than the
baseline
ci = 0.75 implies a demand 25% lower than the baseline
A Seasonal Demand Series
Quick and Dirty Method of Estimating
Seasonal Factors
Compute the sample mean of the entire data
set (should be at least several cycles of data)
Divide each observation by the sample mean
(This gives a factor for each observation)
Average the factors for like seasons
The resulting N numbers will exactly
add to N and correspond to the N
seasonal factors.
Deseasonalizing a Series
To remove seasonality from a series, simply
divide each observation in the series by the
appropriate seasonal factor. The resulting series
will have no seasonality and may then be
predicted using an appropriate method.
Once a forecast is made on the deseasonalized
series, one then multiplies that forecast by the
appropriate seasonal factor to obtain a forecast
for the original series.
Let's do one:
Season  Cycle  Demand    Season  Cycle  Demand
Q1      2001   205       Q1      2002   225
Q2      2001   225       Q2      2002   248
Q3      2001   185       Q3      2002   203
Q4      2001   285       Q4      2002   310

Expected Demand Q1 03 = (205 + 225) / 2 = 215
Expected Demand Q2 03 = (225 + 248) / 2 = 236.5
Expected Demand Q3 03 = (185 + 203) / 2 = 194
Expected Demand Q4 03 = (285 + 310) / 2 = 297.5
Overall average demand: Σ Di / 8 = 235.75
Let's do one:
Seasonal Demand Factors:
Q1: 215 / 235.75 = 0.912
Q2: 236.5 / 235.75 = 1.003
Q3: 194 / 235.75 = 0.823
Q4: 297.5 / 235.75 = 1.262
Sum: 4.000
The seasonal factors must sum to the number
of seasons in the cycle.
If they do not, the factors must be normalized:
ci' = {(number of seasons in cycle) / Σ cj} × ci, where i represents each
season in the cycle.
Deseasonalize the Demand
For each period: Di / cj

Period  Actual Demand  Period Avg.  Seasonal Factor  Deseasonalized Demand (y)
Q1 01   205            215          0.912            224.78
Q2 01   225            236.5        1.003            224.29
Q3 01   185            194          0.823            224.81
Q4 01   285            297.5        1.262            225.84
Q1 02   225            215          0.912            246.72
Q2 02   248            236.5        1.003            247.21
Q3 02   203            194          0.823            246.64
Q4 02   310            297.5        1.262            245.66
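A sketch of the quick-and-dirty seasonal-factor procedure applied to the example above (names are illustrative):

```python
demand = [205, 225, 185, 285,      # 2001, Q1..Q4
          225, 248, 203, 310]      # 2002, Q1..Q4
N = 4                              # seasons per cycle

overall_mean = sum(demand) / len(demand)               # 235.75

# Average the per-observation factors for like seasons.
factors = []
for season in range(N):
    obs = demand[season::N]
    factors.append(sum(d / overall_mean for d in obs) / len(obs))
# factors ~ [0.912, 1.003, 0.823, 1.262]; they sum to N = 4

deseasonalized = [d / factors[i % N] for i, d in enumerate(demand)]
# ~ [224.8, 224.3, 224.8, 225.8, 246.7, 247.2, 246.6, 245.7]
```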
But what about new data?
The same problem prevails as before: updating is
expensive.
As new data becomes available, we must start
over to get seasonal factors, trend, and intercept
estimates.
Isn't there a method to smooth this
seasonalized technique?
Yes, it's called Winters' Method, or triple
exponential smoothing.
The Winters Method for Seasonality
Suggested by Winters (1960)
The basic idea is to estimate a multiplicative
seasonality factor c(t), t = 1, 2, ..., where c(t)
represents the ratio of demand during period t to
the average demand during the season. Therefore,
if there are N periods in the season (for example,
N = 12 if periods are months and the season is a
year), then the sum of the c(t) factors over the
season will always be equal to N. The seasonally
adjusted forecast is computed by multiplying the
forecast from the exponential smoothing with
linear trend model by the appropriate seasonality
factor.
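The slides stop short of the update equations, so purely as an illustration, one common form of the multiplicative Holt-Winters (triple exponential smoothing) update step is sketched below; alpha, beta, gamma and every name here are assumptions, not taken from the slides:

```python
def winters_update(level, trend, seasonals, demand, t, alpha, beta, gamma):
    """One multiplicative Holt-Winters update at period t.
    seasonals is a list of N seasonal factors and is updated in place."""
    N = len(seasonals)                     # number of periods in the season
    i = t % N
    prev_level = level
    # Deseasonalize the new observation before updating the level.
    level = alpha * (demand / seasonals[i]) + (1 - alpha) * (prev_level + trend)
    trend = beta * (level - prev_level) + (1 - beta) * trend
    # Refresh the seasonal factor for this period.
    seasonals[i] = gamma * (demand / level) + (1 - gamma) * seasonals[i]
    # One-period-ahead forecast: smoothed level plus trend, re-seasonalized.
    forecast = (level + trend) * seasonals[(t + 1) % N]
    return level, trend, forecast
```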
Linear regression in forecasting
Linear regression is based on
1. Fitting a straight line to data
2. Explaining the change in one variable through changes
in other variables.

dependent variable = a + b (independent variable)

By using linear regression, we are trying to explore which


independent variables affect the dependent variable
Example: do people drink more when
it's cold?

(Scatter plot: alcohol sales vs. average monthly temperature.)

Which line best fits the data?
The best line is the one that minimizes
the error
The predicted line is

Y = a + bX

So, the error is

ei = yi - Yi

Where: e is the error,
y is the observed value,
Y is the predicted value.
Least Squares Method of Linear
Regression

The goal of LSM is to minimize the sum of squared
errors:

Min Σ ei²
What does that mean?

(Scatter plot: alcohol sales vs. average monthly temperature, with the fitted line.)

So LSM tries to minimize the distance
between the line and the points!
Least Squares Method of Linear
Regression
Then the line is defined by

Y = a + bX

a = ȳ - b x̄

b = (Σ xy - n x̄ ȳ) / (Σ x² - n x̄²)
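A minimal sketch of the least-squares formulas above; the temperature/sales numbers are made up purely for illustration:

```python
def least_squares(x, y):
    """Fit Y = a + bX by minimizing the sum of squared errors."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    b = (sum(xi * yi for xi, yi in zip(x, y)) - n * x_bar * y_bar) / (
        sum(xi * xi for xi in x) - n * x_bar ** 2)
    a = y_bar - b * x_bar
    return a, b

# Hypothetical data: average monthly temperature vs. alcohol sales
temperature = [30, 40, 50, 60, 70, 80]
sales = [120, 115, 105, 100, 90, 85]
a, b = least_squares(temperature, sales)   # b < 0: sales fall as temperature rises
```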
How can we compare across
forecasting models?
We need a metric that provides an estimate of accuracy.

Forecast Error

Errors can be:
1. biased
2. random

Forecast error = difference between actual and forecasted
value (also known as residual)
Measuring Accuracy: MFE
MFE = Mean Forecast Error (Bias)
It is the average error in the observations:

MFE = Σ (At - Ft) / n

1. The more positive or negative the MFE, the worse the
performance; the forecast is biased.
More critical: we can compensate for forecast errors through
inventory, expediting, faster delivery means, and other kinds
of responses.
Measuring Accuracy: MAD
MAD = Mean Absolute Deviation
It is the average absolute error in the observations:

MAD = Σ |At - Ft| / n

1. A higher MAD implies worse performance.
2. MAD also measures deviation (error) from the
expected result (the forecast).
3. If errors are normally distributed, then σ ≈ 1.25 MAD.
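A sketch of the two error measures as defined above (function names are illustrative):

```python
def mfe(actuals, forecasts):
    """Mean Forecast Error (bias): average of (actual - forecast)."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    return sum(errors) / len(errors)

def mad(actuals, forecasts):
    """Mean Absolute Deviation: average of |actual - forecast|."""
    errors = [abs(a - f) for a, f in zip(actuals, forecasts)]
    return sum(errors) / len(errors)
```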
Key Point

Forecasts must be measured for accuracy!

The most common means of doing so is by
measuring either the mean absolute
deviation or the standard deviation of the
forecast error.
Measuring Accuracy: Tracking signal
The tracking signal is a measure of how often our estimates
have been above or below the actual value. It is used to
decide when to re-evaluate the forecasting model.
RSFE = Σ (At - Ft)

TS = RSFE / MAD
Positive tracking signal: most of the time actual values are
above our forecasted values
Negative tracking signal: most of the time actual values are
below our forecasted values

If TS > 4 or < -4, investigate!
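A sketch of the tracking signal as defined above, applied to the Kroger forecasts shown on the following slides (names are illustrative):

```python
def tracking_signal(actuals, forecasts):
    """RSFE / MAD: how consistently the forecast sits above or below the actuals."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    rsfe = sum(errors)
    mad = sum(abs(e) for e in errors) / len(errors)
    return rsfe / mad

actuals = [1325, 1353, 1305, 1275, 1210, 1195]
es_forecast = [1370, 1361, 1359, 1349, 1334, 1309]     # exponential smoothing, alpha = 0.2
fit_forecast = [1370, 1306, 1334, 1290, 1251, 1175]    # forecast including trend

print(round(tracking_signal(actuals, es_forecast), 1))   # -6.0
print(round(tracking_signal(actuals, fit_forecast), 1))  # about -1.9 (the slide rounds to -2.0)
```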


Example: bottled water at Kroger
Month Actual Forecast Month Actual Forecast

Jan 1,325 1,370 Jan 1,325 1370

Feb 1,353 1,361 Feb 1,353 1306

Mar 1,305 1,359 Mar 1,305 1334

Apr 1,275 1,349 Apr 1,275 1290

May 1,210 1,334 May 1,210 1251

Jun 1,195 1,309 Jun 1,195 1175

Left table: Exponential Smoothing (α = 0.2)
Right table: Forecasting with trend (α = 0.8, δ = 0.5)

Question: Which one is better?


Bottled water at Kroger: compare
MAD and TS

                             MAD    TS
Exponential Smoothing        70     -6.0
Forecast Including Trend     33     -2.0

We observe that FIT performs a lot better than ES.

Conclusion: probably there is a trend in the data, which
exponential smoothing cannot capture.
Which Forecasting Method Should You
Use
Gather the historical data of what you want to
forecast
Divide data into initiation set and evaluation set
Use the first set to develop the models
Use the second set to evaluate
Compare the MADs and MFEs of each model
