
Unit 2

Management of Conversion System


Chapter 3: Forecasting
Lesson 7: Causal Forecasting Models: Regression Analysis

Learning Objectives

After reading this lesson you will understand


Causal forecasting models
Linear regression analysis
Multiple regression analysis
Monitoring and controlling forecasts

Good morning, students. Today we are going to introduce the concept of what are
known as causal forecasting models. Friends, by now we have progressed far in our
journey of learning forecasting. Before this journey ends, we need to learn about
causal forecasting, linear and multiple regression analysis, and how to monitor and
control forecasted values.
But I’m getting carried away.
First things first.
Let us start with the causal forecasting model.

Causal forecasting models


Causal forecasting models usually consider several variables that are related to the
variable being predicted. They use these related variables, in addition to the historical
time series, to better explain the behaviour of the series being forecast. For example,
sales of a product depend on various factors.
Some of these are listed below:
Firm’s advertisement budget
The price charged
Competitor’s prices
Promotional strategies
Regression analysis is the tool most often used in developing these causal models.
A complete discussion of regression analysis is beyond the scope of this class; we will
cover a few underlying concepts to gain insight into how the technique is used. First of
all, according to the logic and methodology of regression analysis, the time series value
that we want to forecast is referred to as the dependent variable. The variables that we try
to relate to the dependent variable are referred to as independent variables, and the
function that is developed to relate the dependent variable to the independent variables is
called the estimated regression function. Thus if we can identify a good set of
independent or predictor variables, we may be able to develop an estimated regression
function for predicting or forecasting the time series.

Linear regression analysis


The most common quantitative causal forecasting model is linear regression analysis. By
linear, we mean an equation of degree 1. The equation given below fulfils this criterion:

Ŷ = a + bx

where
Ŷ = value of the dependent variable
a = y-axis intercept
b = slope of the regression line
x = independent variable

Multiple Regression Analysis


This is the next step in regression analysis and is considered an improvement over
the simple linear regression method discussed above.
The equation in this case can be represented as:

Ŷ = a + b1x1 + b2x2

where x1 and x2 are two independent variables with slope coefficients b1 and b2.
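A least-squares fit with two predictors can be sketched in pure Python by solving the normal equations. The data below are hypothetical, constructed exactly from Y = 1 + 2x1 + 3x2, so the fit should recover those coefficients:

```python
# Least-squares fit of Y = a + b1*x1 + b2*x2 by solving the 3x3
# normal equations with Gaussian elimination (pure Python sketch).

def fit_multiple_regression(x1, x2, y):
    n = len(y)
    s_x1x2 = sum(u * v for u, v in zip(x1, x2))
    # Normal-equation system A * [a, b1, b2]^T = rhs.
    A = [
        [n,       sum(x1),                    sum(x2)],
        [sum(x1), sum(u * u for u in x1),     s_x1x2],
        [sum(x2), s_x1x2,                     sum(v * v for v in x2)],
    ]
    rhs = [sum(y),
           sum(u * w for u, w in zip(x1, y)),
           sum(v * w for v, w in zip(x2, y))]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        rhs[i], rhs[p] = rhs[p], rhs[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= f * A[i][c]
            rhs[r] -= f * rhs[i]
    # Back substitution.
    coeffs = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        coeffs[i] = (rhs[i] - sum(A[i][c] * coeffs[c]
                                  for c in range(i + 1, 3))) / A[i][i]
    return coeffs  # [a, b1, b2]

# Hypothetical data generated exactly from Y = 1 + 2*x1 + 3*x2.
x1 = [1, 2, 3, 4, 5]
x2 = [2, 1, 4, 3, 5]
y = [1 + 2 * u + 3 * v for u, v in zip(x1, x2)]
a, b1, b2 = fit_multiple_regression(x1, x2, y)
```

Since the data were built from the equation itself, the recovered coefficients come back as a ≈ 1, b1 ≈ 2, b2 ≈ 3 (up to floating-point error).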
The construction of a regression line will become clearer through an example.
Example
Symphony is a construction company that builds offices in Delhi. Over time, the
company has found that its rupee volume of renovation work is dependent on the Delhi
area payroll. The following table lists Symphony’s revenues and the amount of money
earned by wage earners in Delhi during the years 1991 – 1996.
Symphony’s sales (Rs in thousands), y    Local payroll (Rs in crores), x
2.0                                      1
3.0                                      3
2.5                                      4
2.0                                      2
2.0                                      1
3.5                                      7

Symphony’s management wants to establish a mathematical relationship that will help it


predict sales. First, they need to determine whether there is a straight-line (linear)
relationship between area payroll and sales, so they plot the known data on a scatter
diagram.

Fig. 3.3 Scatter diagram

It appears from the six data points that a slight positive relationship exists between the
independent variable, payroll, and the dependent variable, sales. As payroll increases,
Symphony’s sales tend to be higher. We can find a mathematical equation by using the
least squares regression approach.

Sales, y    Payroll, x    x²         xy
2.0         1             1          2.0
3.0         3             9          9.0
2.5         4             16         10.0
2.0         2             4          4.0
2.0         1             1          2.0
3.5         7             49         24.5
∑y = 15.0   ∑x = 18       ∑x² = 80   ∑xy = 51.5


x̄ = ∑x / n = 18/6 = 3

ȳ = ∑y / n = 15/6 = 2.5

b = (∑xy − n x̄ ȳ) / (∑x² − n x̄²) = (51.5 − 6(3)(2.5)) / (80 − 6(3)²) = 6.5/26 = 0.25

a = ȳ − b x̄ = 2.5 − (0.25)(3) = 1.75

The estimated regression equation, therefore, is:


Ŷ = 1.75 + 0.25x
or
Sales = 1.75 + 0.25 (payroll)
If the prediction is that the Delhi area payroll will be Rs 600 lakhs (Rs 6 crores, so
x = 6) next year, we can estimate sales for Symphony with the regression equation:
Sales (in thousands) = 1.75 + 0.25(6)
                     = 3.25
i.e. Sales = Rs 3,250
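The least-squares computation above can be sketched in Python; the data and formulas follow the worked example exactly, with payroll in the same units as the table:

```python
# Least-squares fit for the Symphony example:
# sales (Rs thousands) against local payroll.

def fit_line(x, y):
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # b = (sum(xy) - n*x_bar*y_bar) / (sum(x^2) - n*x_bar^2)
    b = (sum(xi * yi for xi, yi in zip(x, y)) - n * x_bar * y_bar) \
        / (sum(xi * xi for xi in x) - n * x_bar * x_bar)
    a = y_bar - b * x_bar
    return a, b

payroll = [1, 3, 4, 2, 1, 7]
sales = [2.0, 3.0, 2.5, 2.0, 2.0, 3.5]
a, b = fit_line(payroll, sales)
print(a, b)            # 1.75 0.25

forecast = a + b * 6   # next year's payroll forecast of 6
print(forecast)        # 3.25, i.e. Rs 3,250
```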
The final part of this example illustrates a central weakness of causal forecasting
methods such as regression: even after computing a regression equation, we must
provide a forecast of the independent variable x (in this case, payroll) before we can
estimate the dependent variable y for the next time period.

To measure the accuracy of the regression estimates, we need to compute the standard
error of the estimate, Sy,x, also called the standard deviation of the regression. It is
expressed as

Sy,x = √[ ∑(y − yc)² / (n − 2) ]

where
y = the y-value of each data point
yc = the computed value of the dependent variable, from the regression equation
n = the number of data points
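Applying this formula to the Symphony data, with the coefficients from the worked example, gives a small sketch of the computation:

```python
import math

# Standard error of the estimate for the Symphony regression:
# S_yx = sqrt(sum((y - yc)^2) / (n - 2)), where yc = a + b*x.

payroll = [1, 3, 4, 2, 1, 7]
sales = [2.0, 3.0, 2.5, 2.0, 2.0, 3.5]
a, b = 1.75, 0.25  # coefficients from the worked example

residual_sq = sum((y - (a + b * x)) ** 2 for x, y in zip(payroll, sales))
s_yx = math.sqrt(residual_sq / (len(sales) - 2))
print(round(s_yx, 3))  # 0.306
```

A standard error of about Rs 0.306 thousand means the typical scatter of actual sales around the regression line is roughly Rs 306.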

The next issue on today’s agenda is coming up now.


Monitoring and controlling forecasts
Monitoring and controlling forecasts determines the extent of success of business
forecasting. It helps the manager assess the adequacy (or inadequacy, as the case
may be) of the forecast estimates and hence makes remedial or corrective measures
possible.

Tracking signal
A tracking signal assumes great significance in this regard. It is a measurement of how
well the forecast is predicting actual values.

The tracking signal is computed as the running sum of the forecast errors (RSFE) divided
by the mean absolute deviation (MAD).

Tracking signal = RSFE / MAD
                = ∑(Actual demand in period i − Forecast demand in period i) / MAD

where
MAD = ∑ |Forecast errors| / n

Positive tracking signals indicate that demand is greater than forecast. Negative signals
mean that demand is less than forecast. A good tracking signal, that is, one with a low
RSFE, has about as much positive bias as it has negative bias. In other words, small
biases are okay, but the positive and negative ones should balance one another so the
tracking signal centers closely on zero bias.

[Figure: control chart of the tracking signal (in MADs) plotted against time, with an
upper control limit, a lower control limit, and an acceptable range centred on zero]

(If the tracking signal lies within +6 and −6 MADs, the forecast can be considered
acceptable.)

To make this clear, let us take an example.


Example: Quarterly sales figures are given below. Calculate the tracking signal.

Quarter   Forecast   Actual   Error   RSFE   |FE|   Cum |FE|   MAD    Tracking
          sales      sales                                            signal
1         100        90       -10     -10    10     10         10.0   -1
2         100        95       -5      -15    5      15         7.5    -2
3         100        115      +15     0      15     30         10.0   0
4         110        100      -10     -10    10     40         10.0   -1
5         110        125      +15     +5     15     55         11.0   +0.5
6         110        140      +30     +35    30     85         14.2   +2.5

MAD = ∑ |Forecast Errors|/n


= 85/6 = 14.2

Tracking signal = RSFE/ MAD


= 35/14.2 = 2.46
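The running computation in the table can be sketched in Python, updating RSFE and MAD after each quarter:

```python
# Running tracking signal for the quarterly example:
# RSFE / MAD after each period, with MAD = cumulative |error| / n.

forecast = [100, 100, 100, 110, 110, 110]
actual = [90, 95, 115, 100, 125, 140]

rsfe = 0.0
cum_abs_err = 0.0
signals = []
for n, (f, a) in enumerate(zip(forecast, actual), start=1):
    error = a - f          # actual demand minus forecast demand
    rsfe += error
    cum_abs_err += abs(error)
    mad = cum_abs_err / n
    signals.append(rsfe / mad)

print([round(s, 2) for s in signals])
# [-1.0, -2.0, 0.0, -1.0, 0.45, 2.47]
```

Note that the table rounds the quarter-5 and quarter-6 signals to +0.5 and +2.5; the exact values are 5/11 ≈ 0.45 and 35/14.17 ≈ 2.47.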

Having discussed quite a few concepts, now let us apply these in actual practice.

POM in practice - A short range forecasting system*


The company considered in this case was a producer of residential and light commercial
air-conditioners and heating units. With the recession of 1974–75, the sudden decline in
housing starts caused current forecasts to be overly optimistic. The company had lost
faith in its computer approach and was relying mostly on judgment. The old system used a
12-month moving average adjusted by an appropriate seasonal factor that was obtained
from the previous 3 years’ data. Four-month forecasts were necessary to accommodate
manufacturer’s lead times.
Analysis of the existing system showed that its major problems were an inability to
respond quickly to sudden changes in demand and a total lack of any forward-looking
procedure. Consequently, a new system was proposed, composed of two independent
segments: an objective forecasting system based on historical data and projections of
certain economic variables, and a subjective forecast generated by the judgments of
regional field managers. These were combined into a final forecast as shown in the
figure.

[Figure: the forward-looking objective forecast (using regression) and the
backward-looking objective forecast (using adaptive smoothing) are combined into a
single objective forecast, which is then merged with the subjective field forecasts to
produce the final forecast]

Fig. Proposed forecasting system


In order to adapt to sudden changes in demand, an adaptive smoothing technique was
used. The adaptive smoothing method provided the basis for a backward-looking forecast
relying purely on historical data for the heating and air-conditioning company.
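The case does not specify which adaptive smoothing variant the company used. One common form (often attributed to Trigg and Leach) resets the smoothing constant from a tracking-signal ratio, so the forecast reacts faster after a sudden demand shift; the demand series below is hypothetical:

```python
# Adaptive exponential smoothing sketch (Trigg-Leach style):
# alpha is set to |smoothed error / smoothed absolute error|,
# so it rises automatically when demand shifts suddenly.

def adaptive_smoothing(demand, beta=0.2):
    forecast = demand[0]
    e_smooth = 0.0        # smoothed (signed) error
    abs_smooth = 0.0001   # smoothed |error|; tiny seed avoids divide-by-zero
    forecasts = [forecast]
    for d in demand[1:]:
        error = d - forecast
        e_smooth = beta * error + (1 - beta) * e_smooth
        abs_smooth = beta * abs(error) + (1 - beta) * abs_smooth
        alpha = min(1.0, abs(e_smooth / abs_smooth))
        forecast = forecast + alpha * error
        forecasts.append(forecast)
    return forecasts

# Hypothetical demand with a sudden level shift, like the 1974-75 swing.
demand = [100, 102, 99, 101, 140, 145, 150]
forecasts = adaptive_smoothing(demand)
```

After the jump from around 100 to around 140, the error-driven alpha climbs toward 1 and the forecast catches up within a couple of periods, which is exactly the responsiveness the old 12-month moving average lacked.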
The forward-looking objective forecast was based on regression analysis. From a list of
potentially influential economic indicators, the following list was selected using
statistical analysis:

1. Heating units
Private housing starts.
Private investment-residential structures.
Private, nonfarm, single-family housing starts.
2. Cooling units
Total gross private domestic investment (PDIC).
Private housing starts (HST).
Government purchases of goods and services (GVTC).

A typical regression equation that resulted was


Deseasonalised cooling units = −57,725.5 + 292.058 (PDIC) + 7,822.676 (HST) +
216.535 (GVTC)

Five regional managers for their respective districts generated subjective forecasts. These
were combined with the two objective forecasts in order to obtain the final forecast.
Comparisons of actual orders versus the regional forecasts were made to track the
performance of the system and to provide feedback to the regional managers.
The new method, developed in close cooperation with the company’s personnel,
resulted in more accurate forecasts than those previously generated, and plans for using
the method on a regular basis were instituted.

With that, we have come to the end of today’s discussions. I hope it has been an
enriching and satisfying experience. See you around in the next lecture. Take care.
Bye.
