
Forecasting

Y.-H. Chen, Ph.D.


Production / Operations Management
International College
Ming-Chuan University
Forecasting Outline
 Introduction
 Forecasting Process
 Forecasting Methods
 Forecast Accuracy and Control
 Forecast Method Selection and Usage
Introduction

(Cartoon caption: "I see that you will get an A this semester.")

A forecast
 is a statement about the future,
 is a basis for planning,
 is not limited to forecasting demand,
 requires a skillful blending of art and science,
 assumes that the underlying system will continue to exist in the future, and
 is rarely perfect.
Forecasting Process

Step 1 Determine purpose of forecast
Step 2 Establish a time horizon
Step 3 Select a forecasting technique
Step 4 Gather and analyze data
Step 5 Prepare the forecast
Step 6 Monitor the forecast

The output of steps 1-5 is "the forecast."
Elements of A Good Forecast
 The forecast horizon must cover the time necessary
to implement possible changes.
 The degree of accuracy should be stated.
 The forecast should be reliable; it should work
consistently.
 The forecast should be expressed in meaningful
units.
 The forecast should be in writing.
 The forecasting technique should be simple to understand and use, and its results should be intuitively consistent with historical data.
Additional Properties
 Forecasts for groups of items tend to be more accurate than forecasts for individual items, because forecasting errors among items in a group usually have a canceling effect.
 Forecast accuracy decreases as the time period covered by the forecast increases.

(Diagram: a good forecast is timely, reliable, accurate, and written.)
Forecasting Methods
 Basic Methods
 Judgmental Forecast
 Statistical (Time Series) Forecast
 Trend
 Seasonality
 Cycle
 Association
Basic Forecasting Methods
 Judgmental Forecast
 Statistical (Time Series) Forecast
 Averaging
 Weighted Moving Average
 Exponential Smoothing
Judgmental Forecast
Executive opinions.
Mostly for long-range planning and introduction of new products. The view of one
person may prevail.
Direct customer contact composites.
•Unable to distinguish between what customers would like to do and what they
will actually do.
•Could be overly influenced by recent sales experience: low recent sales could lead to low estimates.
•Conflict of interest: low sales estimates make subsequent sales performance look better.
Consumer survey or point-of-sale (POS) data.
•Expensive and time-consuming.
•Possible existence of irrational patterns.
•Low response rates.
Opinions of managers and staff.
Delphi method (Rand Corp., 1948): Managers and staff complete a series of
questionnaires, each developed from the previous one, to achieve a consensus
forecast.
Technological forecasting. Long-term, one-time forecasts. Data are costly to obtain.
Statistical (Time Series)
Forecast
 It is extremely important to plot data and examine them before doing
any analysis or forecasting. A demand forecast should be based on a time
series of past demand rather than on sales or shipments.

Data patterns:
Trend
A long-term upward or downward movement in data.
Seasonality
Short-term regular variations related to weather, holidays, or other factors.
Cycle
Wavelike variation lasting more than one year.
Irregular Variation
Caused by unusual circumstances, not reflective of typical behavior.
Random variation
Residual variation after all other behaviors are accounted for.
Data Patterns
(Figure: a demand time series illustrating trend, cycles, seasonal variations, and irregular variation.)
Simple Naive Forecast

Y_t = y_{t-1}

•No cost.
•Quick and easy to prepare.
•Easy to understand.
•Can be applied to data with seasonality and trend.
General Naive Forecast

Y_t = y_{t-1} + c_t
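As a quick illustration, here is a minimal Python sketch of both naive forms. It assumes the undefined c_t term is taken as the most recent observed change (one common interpretation); the series and function names are illustrative only.

```python
def simple_naive(history):
    """Simple naive forecast: next period = last observed value."""
    return history[-1]

def general_naive(history):
    """General naive forecast, assuming c_t is the most recent change:
    y[t-1] + (y[t-1] - y[t-2])."""
    return history[-1] + (history[-1] - history[-2])

demand = [42, 40, 43, 40, 41]          # illustrative demand series
print(simple_naive(demand))            # 41
print(general_naive(demand))           # 41 + (41 - 40) = 42
```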
Weighted Moving Average
MA_n = \frac{\sum_{i=1}^{n} y_i}{n}   or   MA_n = \sum_{i=1}^{n} w_i y_i
Moving Average Example
a) Compute a 3-period moving average forecast given demand for shopping carts for the last five periods.

Period   Demand
1        42
2        40
3        43
4        40
5        41

MA_3 = \frac{43 + 40 + 41}{3} = 41.33

b) If the actual demand in period 6 turns out to be 39, what would be the moving average forecast for period 7?

MA_3 = \frac{40 + 41 + 39}{3} = 40.00
Weighted Moving Average
Example
a) Compute a weighted average forecast using a weight of .40 for the most recent period, .30 for the next most recent, .20 for the next, and .10 for the next.

Period   Demand
1        42
2        40
3        43
4        40
5        41

Forecast = .40(41) + .30(40) + .20(43) + .10(40) = 41.0

b) If the actual demand in period 6 turns out to be 39, what would be the weighted moving average forecast for period 7?

Forecast = .40(39) + .30(41) + .20(40) + .10(43) = 40.2
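A small Python sketch reproducing both example calculations; it assumes the weight list is ordered most-recent first, and the names are illustrative.

```python
def moving_average(history, n):
    """Simple n-period moving average of the n most recent values."""
    return sum(history[-n:]) / n

def weighted_moving_average(history, weights):
    """Weighted average of recent values; weights ordered most-recent first."""
    recent = history[::-1]                           # most recent value first
    return sum(w * y for w, y in zip(weights, recent))

demand = [42, 40, 43, 40, 41]
print(round(moving_average(demand, 3), 2))                              # 41.33
print(round(weighted_moving_average(demand, [.40, .30, .20, .10]), 1))  # 41.0

demand.append(39)                                    # actual demand in period 6
print(round(moving_average(demand, 3), 2))                              # 40.0
print(round(weighted_moving_average(demand, [.40, .30, .20, .10]), 1))  # 40.2
```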
Properties of Weighted Moving
Average
 Easy to compute and understand.
 A moving average forecast lags behind and smooths the actual data.
 The number of data points in the
average determines its sensitivity to
each new data point: the fewer the
data points in an average, the more
responsive the average tends to be.
 Weights can be added to values in
the average to make the resulting
average more responsive to some
recent data points. However, finding suitable weights typically requires trial and error.
Exponential Smoothing
 Exponential smoothing is a weighted averaging method based on the previous forecast plus a percentage of the forecast error.

Y_t = y_{t-1} + \alpha (d_{t-1} - y_{t-1})

where y_{t-1} is the previous forecast, d_{t-1} the previous actual value, and \alpha the smoothing constant.
Properties of Exponential
Smoothing
 Commonly used values of
alpha range from 0.05 to 0.50.
Low values are used when
the underlying average tends
to be stable; higher values
are used when the underlying
average is susceptible to
change.
 Moving average or naive
forecast can be used to
generate starting forecast for
exponential smoothing.
Picking A Smoothing Constant

(Figure: monthly demand from April to January plotted against exponential smoothing forecasts F(.05) and F(.2).)
Exponential Smoothing
Example
Use exponential smoothing to develop a series of forecasts for the data below, and compute (actual - forecast) = error for each period.
a) Use a smoothing factor of .10.
b) Use a smoothing factor of .40.
c) Plot the actual data and both sets of forecasts on a single graph.

Period (t)   Actual
1            42
2            40
3            43
4            40
5            41
6            39
7            46
8            44
9            45
10           38
11           40
12           ?
Exponential Smoothing
Example
                      alpha = .10          alpha = .40
Period (t)   Actual   Forecast   Error     Forecast   Error
1            42.00
2            40.00    42.00      -2.00     42.00      -2.00
3            43.00    41.80       1.20     41.20       1.80
4            40.00    41.92      -1.92     41.92      -1.92
5            41.00    41.73      -0.73     41.15      -0.15
6            39.00    41.66      -2.66     41.09      -2.09
7            46.00    41.39       4.61     40.25       5.75
8            44.00    41.85       2.15     42.55       1.45
9            45.00    42.07       2.93     43.13       1.87
10           38.00    42.36      -4.36     43.88      -5.88
11           40.00    41.92      -1.92     41.53      -1.53
12                    41.73                40.92
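The forecast columns above can be reproduced with a short Python sketch; it assumes the first forecast is seeded with the first actual value, as the table implies, and rounds only for display.

```python
def exp_smoothing(actuals, alpha):
    """Return forecasts F_2..F_{n+1}, where F_t = F_{t-1} + alpha*(A_{t-1} - F_{t-1})."""
    forecasts = [float(actuals[0])]          # seed: F_2 = first actual value
    for a in actuals[1:]:
        prev = forecasts[-1]
        forecasts.append(prev + alpha * (a - prev))
    return forecasts

actuals = [42, 40, 43, 40, 41, 39, 46, 44, 45, 38, 40]
print([round(f, 2) for f in exp_smoothing(actuals, 0.10)])
# F_2..F_12 for alpha = .10: [42.0, 41.8, 41.92, 41.73, 41.66, 41.39,
#                             41.85, 42.07, 42.36, 41.92, 41.73]
print([round(f, 2) for f in exp_smoothing(actuals, 0.40)])
# F_2..F_12 for alpha = .40; the last value, 40.92, is the forecast for period 12
```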
Forecasting Method Extension
 Trend
 Linear Trend
 Trend-Adjusted Exponential Smoothing
 Seasonality
 Cycle
 Association
Linear Trend

Y_t = a + bt
Linear Trend Coefficients
b = \frac{n \sum_{t=1}^{n} t\,y_t - \sum_{t=1}^{n} t \sum_{t=1}^{n} y_t}{n \sum_{t=1}^{n} t^2 - \left( \sum_{t=1}^{n} t \right)^2}

a = \frac{\sum_{t=1}^{n} y_t - b \sum_{t=1}^{n} t}{n} = \bar{y} - b\,\bar{t}
Linear Trend Example
Sales for a California-based firm over the last 10 weeks are shown in the table. Plot the data and visually check whether a linear trend line would be appropriate. Then determine the equation of the trend line, and predict sales for weeks 11 and 12.

Week   Unit Sales
1      700
2      724
3      720
4      728
5      740
6      742
7      758
8      750
9      770
10     775
11     ?
12     ?
Linear Trend Example Solution
a. A plot suggests that a linear trend line would be appropriate.

b. For n = 10, we have

\sum_{t=1}^{10} t = 55,   \sum_{t=1}^{10} t^2 = 385

b = \frac{10(41{,}358) - 55(7{,}407)}{10(385) - 55(55)} = \frac{6{,}195}{825} = 7.51

a = \frac{7{,}407 - 7.51(55)}{10} = 699.40

Thus, the trend line is y_t = 699.40 + 7.51t, where t = 0 for period 0.

c. By letting t = 11 and t = 12, we have

y_{11} = 699.40 + 7.51(11) = 782.01
y_{12} = 699.40 + 7.51(12) = 789.51
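A brief Python sketch of the same least-squares trend computation, using only the coefficient formulas above; variable names are illustrative.

```python
sales = [700, 724, 720, 728, 740, 742, 758, 750, 770, 775]   # weeks 1-10
n = len(sales)
t = list(range(1, n + 1))

sum_t, sum_y = sum(t), sum(sales)                  # 55 and 7,407
sum_ty = sum(ti * yi for ti, yi in zip(t, sales))  # 41,358
sum_t2 = sum(ti ** 2 for ti in t)                  # 385

b = (n * sum_ty - sum_t * sum_y) / (n * sum_t2 - sum_t ** 2)   # slope, ~7.51
a = (sum_y - b * sum_t) / n                                    # intercept, ~699.40

for week in (11, 12):
    print(week, round(a + b * week, 2))            # ~782.0 and ~789.5
```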
Seasonality
Cycle
 Cycles are similar to
seasonal variations but of
longer duration, e.g., two to
six years between peaks.
 It is difficult to project cycles
from past data, because
turning points are difficult to
identify.
 A short moving average or a
naive approach may be of
some value.
Associative Forecasts
 High correlation of a forecast with leading variables can be useful in computing the forecast.
 Simple linear regression is the simplest and most widely used method:

Y = a + bX
Simple Linear Regression
Coefficients
b = \frac{n \sum_{i=1}^{n} x_i y_i - \left( \sum_{i=1}^{n} x_i \right) \left( \sum_{i=1}^{n} y_i \right)}{n \sum_{i=1}^{n} x_i^2 - \left( \sum_{i=1}^{n} x_i \right)^2}

a = \frac{\sum_{i=1}^{n} y_i - b \sum_{i=1}^{n} x_i}{n} = \bar{y} - b\,\bar{x}
Simple Linear Regression
Example
Healthy Hamburgers has a chain of 12 stores in northern Illinois. Sales figures and profits for the stores are given in the following table. Obtain a regression line for the data and predict profit for a store assuming sales of $10 million.

Sales, X   Profits, Y
(both in millions of dollars)
7          0.15
2          0.10
6          0.13
4          0.15
14         0.25
15         0.27
16         0.24
12         0.20
14         0.27
20         0.44
15         0.34
7          0.17
Simple Linear Regression
Example: Data Plot
(Figure: scatter plot of profits ($ million) versus sales ($ million) for the 12 stores.)
Simple Linear Regression
Example: Solution
x       y      xy      x^2    y^2
7       0.15   1.05    49     0.0225
2       0.10   0.20    4      0.0100
6       0.13   0.78    36     0.0169
4       0.15   0.60    16     0.0225
14      0.25   3.50    196    0.0625
15      0.27   4.05    225    0.0729
16      0.24   3.84    256    0.0576
12      0.20   2.40    144    0.0400
14      0.27   3.78    196    0.0729
20      0.44   8.80    400    0.1936
15      0.34   5.10    225    0.1156
7       0.17   1.19    49     0.0289
Totals  132    2.71    35.29  1796   0.7159

b = \frac{n \sum xy - \sum x \sum y}{n \sum x^2 - \left( \sum x \right)^2} = \frac{12(35.29) - 132(2.71)}{12(1796) - 132(132)} = 0.01593

a = \frac{\sum y - b \sum x}{n} = \frac{2.71 - 0.01593(132)}{12} = 0.0506

Y = 0.0506 + 0.01593X

Y = 0.2099 (million $) for X = 10
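The fit above can be checked with a compact Python sketch; the variable names are illustrative and the output matches the hand calculation to rounding.

```python
sales   = [7, 2, 6, 4, 14, 15, 16, 12, 14, 20, 15, 7]          # X, $ million
profits = [0.15, 0.10, 0.13, 0.15, 0.25, 0.27, 0.24, 0.20,
           0.27, 0.44, 0.34, 0.17]                             # Y, $ million
n = len(sales)

sum_x, sum_y = sum(sales), sum(profits)
sum_xy = sum(x * y for x, y in zip(sales, profits))
sum_x2 = sum(x * x for x in sales)

b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)   # ~0.01593
a = (sum_y - b * sum_x) / n                                    # ~0.0506

print(round(a + b * 10, 4))   # predicted profit at $10 million sales, ~0.2099
```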
An Important Measure of
Simple Linear Regression
 Correlation measures the strength and direction of the relationship between two variables.
 +1, perfect positive correlation.
 -1, perfect negative correlation.
 0, zero correlation.

r = \frac{n \sum xy - \sum x \sum y}{\sqrt{n \sum x^2 - \left( \sum x \right)^2} \sqrt{n \sum y^2 - \left( \sum y \right)^2}}

 The square of the correlation coefficient, r^2, provides a measure of how well a regression line “fits” the data. Its value ranges from 0 to 1.00.
 [0.80, 1.00], good fit.
 [0.25, 0.80), moderate fit.
 [0.00, 0.25), poor fit.
Simple Linear Regression and
Correlation Example
Sales of 19-inch color television sets and 3-month
lagged unemployment are shown in the table below.
Determine if unemployment levels can be used to
predict demand for 19-inch color TVs and, if so,
derive a predictive equation.

Period   Units sold   Unemployment % (3-month lag)
1        20           7.2
2        41           4.0
3        17           7.3
4        35           5.5
5        25           6.8
6        31           6.0
7        38           5.4
8        50           3.6
9        15           8.4
10       19           7.0
11       14           9.0
Simple Linear Regression and
Correlation Example: Data Plot

(Figure: scatter plot of units sold, y, versus level of unemployment (%), x.)
Simple Linear Regression and
Correlation Example: Solution
x       y     xy       x^2     y^2
7.2     20    144.0    51.8    400
4.0     41    164.0    16.0    1681
7.3     17    124.1    53.3    289
5.5     35    192.5    30.3    1225
6.8     25    170.0    46.2    625
6.0     31    186.0    36.0    961
5.4     38    205.2    29.2    1444
3.6     50    180.0    13.0    2500
8.4     15    126.0    70.6    225
7.0     19    133.0    49.0    361
9.0     14    126.0    81.0    196
Totals  70.2  305      1750.8  476.3   9907

b = \frac{n \sum xy - \sum x \sum y}{n \sum x^2 - \left( \sum x \right)^2} = \frac{11(1750.8) - 70.2(305)}{11(476.4) - 70.2(70.2)} = -6.91

a = \frac{\sum y - b \sum x}{n} = \frac{305 - (-6.91)(70.2)}{11} = 71.85

Y = 71.85 - 6.91X

r = \frac{n \sum xy - \sum x \sum y}{\sqrt{n \sum x^2 - \left( \sum x \right)^2} \sqrt{n \sum y^2 - \left( \sum y \right)^2}} = \frac{11(1750.8) - 70.2(305)}{\sqrt{11(476.4) - (70.2)^2} \sqrt{11(9907) - (305)^2}} = -0.966
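A short Python sketch that reproduces the slope, intercept, and correlation coefficient for this example with the standard library; small rounding differences from the hand calculation are expected.

```python
import math

unemployment = [7.2, 4.0, 7.3, 5.5, 6.8, 6.0, 5.4, 3.6, 8.4, 7.0, 9.0]   # x
units_sold   = [20, 41, 17, 35, 25, 31, 38, 50, 15, 19, 14]              # y
n = len(unemployment)

sx, sy = sum(unemployment), sum(units_sold)
sxy = sum(x * y for x, y in zip(unemployment, units_sold))
sx2 = sum(x * x for x in unemployment)
sy2 = sum(y * y for y in units_sold)

b = (n * sxy - sx * sy) / (n * sx2 - sx ** 2)                   # ~ -6.91
a = (sy - b * sx) / n                                           # ~ 71.85
r = (n * sxy - sx * sy) / math.sqrt((n * sx2 - sx ** 2) * (n * sy2 - sy ** 2))

print(round(b, 2), round(a, 2), round(r, 3))                    # -6.91 71.85 -0.966
```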


Linear Regression
Assumptions
 Variations around the line should be random; no patterns such as cycles or trends should be apparent.
 Deviations around the line should be normally distributed.
 Predictions are best made within the range of observed values.
Linear Regression Usage
Guidelines
 Always plot the data to verify that a linear
relationship is appropriate.
 The data may be time-dependent. If patterns
appear, use analysis of time series or use
time as an independent variable as part of a
multiple regression analysis.
 A small correlation may imply that other
variables are important.
Linear Regression Summary
 Simple linear regression applies only to linear relationships with one independent variable.
 One needs a considerable amount of data to
establish the relationship --- in practice, 20 or
more observations.
 All observations are weighted equally.
Forecast Accuracy
 Error - difference between actual value and
predicted value
 Mean absolute deviation (MAD)
 Average absolute error
 Mean squared error (MSE)
 Average of squared error
Forecast Accuracy:
MAD & MSE

MAD = \frac{\sum \left| \text{Actual} - \text{Forecast} \right|}{n}

MSE = \frac{\sum \left( \text{Actual} - \text{Forecast} \right)^2}{n - 1}
Example 10 (page 97).
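A minimal Python sketch of both measures, using the alpha = .10 forecasts from the earlier exponential smoothing example as illustrative input.

```python
def mad(actuals, forecasts):
    """Mean absolute deviation of the forecast errors."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    return sum(abs(e) for e in errors) / len(errors)

def mse(actuals, forecasts):
    """Mean squared error, using the n - 1 divisor shown above."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    return sum(e * e for e in errors) / (len(errors) - 1)

actuals   = [40, 43, 40, 41, 39, 46, 44, 45, 38, 40]            # periods 2-11
forecasts = [42.00, 41.80, 41.92, 41.73, 41.66, 41.39,
             41.85, 42.07, 42.36, 41.92]                        # alpha = .10
print(round(mad(actuals, forecasts), 2), round(mse(actuals, forecasts), 2))
# roughly 2.45 and 8.21 with these rounded forecasts
```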
Forecast Control
It is necessary to monitor forecast errors to ensure
that the forecast is performing adequately over time.
This is generally accomplished by comparing
forecast errors to predefined values, or action limits.
Why Do We Need Forecast
Control?
 The omission of an important variable.
 The appearance of a new variable.
 A sudden or unexpected change in a variable (caused by severe weather or other natural phenomena, a temporary shortage or breakdown, a catastrophe, or similar events).
 The forecast being used incorrectly.
 Data being misinterpreted.
 Random variation.
Forecasting Control Methods
 Tracking Signal
 Control Chart
Forecast Control: Tracking
Signal

\text{Tracking signal}_t = \frac{\sum (d_t - y_t)}{\text{MAD}_t}

\text{MAD}_t = \text{MAD}_{t-1} + \alpha \left( \left| d_t - y_t \right| - \text{MAD}_{t-1} \right)
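A small Python sketch of a running tracking signal with the exponentially smoothed MAD above; the alpha value and the initial MAD seed are illustrative assumptions.

```python
def tracking_signals(actuals, forecasts, alpha=0.1, initial_mad=1.0):
    """Cumulative forecast error divided by an exponentially smoothed MAD."""
    mad = initial_mad                    # illustrative seed for the first period
    cum_error = 0.0
    signals = []
    for d, y in zip(actuals, forecasts):
        error = d - y
        cum_error += error
        mad = mad + alpha * (abs(error) - mad)    # smoothed MAD update
        signals.append(cum_error / mad)
    return signals

actuals   = [40, 43, 40, 41, 39]
forecasts = [42.0, 41.8, 41.92, 41.73, 41.66]
print([round(s, 2) for s in tracking_signals(actuals, forecasts)])
```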


Forecast Control: Control
Chart
 The control chart sets the limits as multiples of the square root of the MSE.
Forecast Control: Control
Chart

s = \sqrt{\text{MSE}}

 For a normal distribution, approximately 95% of the errors fall within ±2s and approximately 99.7% fall within ±3s. Errors falling outside these limits should be regarded as evidence that corrective action is needed.

Example 11 (Page 93).
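A minimal sketch of ±2s control limits built from the square root of the MSE, reusing the illustrative data from the earlier sketches.

```python
import math

def control_limits(actuals, forecasts, z=2.0):
    """Return (lower, upper) error limits of +/- z * sqrt(MSE)."""
    errors = [a - f for a, f in zip(actuals, forecasts)]
    mse = sum(e * e for e in errors) / (len(errors) - 1)
    return -z * math.sqrt(mse), z * math.sqrt(mse)

actuals   = [40, 43, 40, 41, 39, 46, 44, 45, 38, 40]
forecasts = [42.0, 41.8, 41.92, 41.73, 41.66, 41.39, 41.85, 42.07, 42.36, 41.92]
lower, upper = control_limits(actuals, forecasts)
outliers = [a - f for a, f in zip(actuals, forecasts) if not lower <= a - f <= upper]
print(round(lower, 2), round(upper, 2), outliers)   # errors outside limits need action
```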


Forecast Method Selection
 Most important
   Cost
   Accuracy
 Need to consider
   Historical performance
   Ability to respond to change