
Master Forecast Profile

The master forecast profile defines the parameters that control the forecast in Demand
Planning. The number of profiles you need depends on the level of aggregation at which you
forecast, as well as on how many different forecast models are required to accurately predict
demand.

Master profiles are assigned to planning areas. Multiple forecast profiles can be defined to
interactively generate forecasts based on historical data showing constant, trend, or seasonal
patterns. To generate forecasts for different key figures, a separate master profile is needed for
each key figure. The master profile contains:

Period indicator: The period indicator defines the time buckets profile for the forecast. This
period indicator must be a time characteristic that contains actual data.
Lifecycle planning indicator: This is relevant for lifecycle planning and like modeling.
Forecast horizon: This includes the start and end dates of the time period for which the
forecast is generated. Enter a start date with either an end date or the number of periods. If you
do not enter a start date, you must enter the number of periods; the system then uses the
current date as the start of the planning horizon.
History horizon: This includes the start and end dates of the past period whose actual data
is used to create the forecast.
Statistical forecast profiles: You assign a univariate forecast profile, a multiple linear
regression (MLR) profile, and a composite profile to the master forecast profile.

Univariate forecasting models are models that investigate historical data according to constant,
trend, and seasonal patterns, and issue forecast errors accordingly. Univariate models are often
called time series models.

Causal analysis is based on causal factors, such as prices, budgets, and campaigns. The
system uses MLR to calculate the influence of causal factors on past sales. This enables you to
analyze the success of specific actions. The calculated connection between causal factors and
past sales is then used as a basis for modeling future actions.
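The MLR idea can be sketched in a few lines of pure Python. This is an illustrative simplification, not SAP's implementation; the factor names (price, promotion budget) and the data are invented for the example.

```python
# Illustrative sketch of causal analysis with multiple linear regression (MLR):
# estimate the influence of causal factors on past sales, then reuse the
# estimated effects to model a future action. Not SAP's implementation.

def mlr_fit(factors, sales):
    """Fit sales ~ intercept + factors by ordinary least squares.

    Solves the normal equations (X^T X) b = (X^T y) with Gaussian
    elimination and partial pivoting.
    """
    X = [[1.0] + list(row) for row in factors]   # prepend intercept column
    n = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[r][i] * sales[r] for r in range(len(X))) for i in range(n)]
    for i in range(n):                            # forward elimination
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * n
    for i in reversed(range(n)):                  # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, n))) / A[i][i]
    return coef  # [intercept, effect of factor 1, effect of factor 2, ...]

# Past sales generated as 200 - 10*price + 2*promo, so the regression
# should recover exactly these causal effects.
factors = [(10.0, 0.0), (9.0, 5.0), (10.0, 5.0), (8.0, 0.0), (9.0, 2.0)]
sales = [200 - 10 * p + 2 * m for p, m in factors]
intercept, eff_price, eff_promo = mlr_fit(factors, sales)
```

Once the effects are estimated, the success of a planned price change or campaign can be modeled by plugging the planned factor values into the fitted equation.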

Composite Forecasting is used to combine multiple forecasts into one final forecast.

The time series models develop forecasts by assessing the patterns and trends
of past sales. As a result, the key determinant in the selection of a time series
model is the pattern of previous sales data. The general assumption is that future
sales will mimic past sales. Past sales patterns are identified and reproduced in
the forecast. After the pattern is identified, the forecaster can select the time
series model that is best suited to that particular pattern. For example, if past
sales have had seasonal influences, the Winters' classical decomposition method,
which compensates for seasonality, should be used: if sales are consistently high
in October and April, the classical decomposition method is appropriate for the
purpose. If past sales show small fluctuations and no major pattern or trend, a
smoothing model may be appropriate, such as the moving average model or the
constant model with exponential smoothing.
The feature that is common to all the time series methods is that they are
endogenous. This means that a time series model only considers the pattern of
previous actual sales or a series of sales over a certain period of time. If the
patterns can be identified and projected into the future, they can be used to create a
forecast. Time series models are the most commonly used forecasting methods.

You can manually assign a model to the profile by entering the model number in
the univariate profile. You can also use different methods of automatic model
assignment. For some of these methods, mathematical tests are used to determine
the best model or to minimize the error.
Automatic model selection: You can allow the system to select the most suitable
forecast model. To select the most appropriate model, the system analyzes
historical data. If the system cannot detect any clear time series patterns in the
historical data, it automatically selects the constant model.

Procedure 1: If you want the system to select the forecast model, you can choose
between various statistical tests and test combinations to determine the model. If
you choose procedure 1, you have to set a forecast strategy between 50 and 55
in the univariate forecast profile. The historical data governs which strategy you
choose and the tests made by the system.
In the trend test, the system performs a regression analysis of the historical values
and checks if there is a significant trend pattern.
In the seasonal test, the system clears the historical values of any possible trends
and then performs an autocorrelation test.
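These two tests can be sketched in a simplified form. The statistics and thresholds below are illustrative only; SAP's internal tests and significance limits are not published in this text.

```python
# Simplified trend test (regression slope) and seasonal test (autocorrelation
# of the detrended history). Thresholds are invented for illustration.

def trend_slope(history):
    """Least-squares regression slope of history against the period index."""
    n = len(history)
    t_mean = (n - 1) / 2
    y_mean = sum(history) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(history))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def seasonal_autocorr(history, season_length):
    """Autocorrelation at lag season_length after removing the linear trend."""
    n = len(history)
    slope = trend_slope(history)
    y_mean = sum(history) / n
    resid = [y - (y_mean + slope * (t - (n - 1) / 2))
             for t, y in enumerate(history)]
    num = sum(resid[t] * resid[t - season_length]
              for t in range(season_length, n))
    den = sum(r * r for r in resid)
    return num / den

# Rising history with an every-other-period seasonal swing:
history = [100, 140, 105, 145, 110, 150, 115, 155]
has_trend = trend_slope(history) > 0.5            # illustrative threshold
has_season = seasonal_autocorr(history, 2) > 0.3  # illustrative threshold
```

With both tests positive, a strategy in the 50 to 55 range would steer the system toward a trend and seasonal model for this history.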
Procedure 2: The system calculates the models to be tested using various
combinations of alpha, beta, and gamma. As of SCM 4.1, you can set up the
increments and the variation range of the smoothing factors. The system then
selects the model that has the smallest value of the error measure of your choice.
In the background planning run, it can generate forecast profiles with an
assignment to the selection. Procedure 2 is more precise than procedure 1, but it
takes more time. If you want to use procedure 2, set forecast strategy 56 in the
time series forecast profile (univariate profile). With this procedure, you can, for
example, have the assignments and forecast settings recalculated every six months.
In the meantime, you can use these generated profiles in a high-performance way.
Note: In release 4.0, the following changes have been made to forecast
strategy 56:
- The Error Measure field has been added to the model parameters
of the univariate forecast profile if automatic model selection 2
is selected. This allows a (customer-specific) error measure to be
selected for choosing the best model.
- The Variation Length field has been added to the model parameters
of the univariate forecast profile if automatic model selection 2
is selected. This test considers the specified length and the variation.
- A sporadic history test has been added. This test assigns the Croston
method if the ratio of historical periods with a demand of 0 is
greater than 66%.
- A test for white noise has been added. This test assigns a constant
method.
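The sporadic-history check lends itself to a short sketch. The 66% ratio comes from the text above; the function name and data are invented.

```python
# Sketch of the sporadic-history test: assign the Croston method when the
# share of historical periods with zero demand exceeds 66%.

def is_sporadic(history, zero_share=0.66):
    zeros = sum(1 for v in history if v == 0)
    return zeros / len(history) > zero_share

demand = [0, 0, 0, 5, 0, 0, 0, 0, 3, 0]   # 8 of 10 periods are zero
use_croston = is_sporadic(demand)
```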
You can also integrate your own procedures in SCM-DP with user exit APOPR001
using function module EXIT_/SAPAPO/SAPLMCPR_001. For this, you set
up strategy 99 in the forecast profile.

The system uses the alpha factor to smooth the basic value, the beta value to
smooth the trend value, and the gamma value to smooth the seasonal value. The
smoothing factors give a higher weighting to the more recent historical values
than to the less recent ones, which means that the more recent values have a larger
influence on the forecast.
Constant models determine the basic value of future sales data, trend models
determine the basic and trend values, seasonal models determine the basic and
seasonal values, and trend and seasonal models determine the basic, trend, and
seasonal values.
The basic value is smoothed exponentially with the alpha factor using the formula
B(t) = B(t-1) + alpha * (V(t) - B(t-1)), where:
B(t): the basic value for the current period, t.
B(t-1): the basic value from the previous period, t-1.
V(t): the actual requirement (version 000) from period t.
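The smoothing recursion can be written out directly. This is a minimal sketch of the constant model's basic-value update; SAP's forecast run additionally produces ex-post values and error measures.

```python
# Exponential smoothing of the basic value:
# B(t) = B(t-1) + alpha * (V(t) - B(t-1))

def smooth_basic(history, alpha, initial_basic):
    b = initial_basic
    for v in history:
        b = b + alpha * (v - b)   # equivalent to alpha*v + (1 - alpha)*b
    return b

# alpha = 0 ignores all new data; alpha = 1 tracks only the latest value.
```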
The ex-post forecast uses the smoothing factors from historical data to determine
the basic, trend, and seasonal values. If the ex-post forecast is accurate in
predicting the historical data, the forecast error will be small.

The smoothing factor governs a forecast's reaction time to a change in the pattern.
If you choose 0 for the alpha value, the new average equals the old one. In this
case, the basic value calculated previously does not change. This means that the
forecast does not react to the current data. If you choose 1 for the alpha value, the
new average equals the last value in the time series.
The most common values for alpha are between 0.1 and 0.5.

Example: An Alpha value of 0.5 weights historical values, as follows:
First historical value: 50%, second historical value: 25%
Third historical value: 12.5%, and fourth historical value: 6.25%
The default alpha factor is 0.3, beta is 0.3, and gamma is 0.3.
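The weighting in the example can be checked numerically: under exponential smoothing, the implicit weight of the k-th most recent value is alpha * (1 - alpha)^k.

```python
# Implicit weights of past values under exponential smoothing with alpha = 0.5.
alpha = 0.5
weights = [alpha * (1 - alpha) ** k for k in range(4)]
# Most recent value 50%, then 25%, 12.5%, 6.25%, matching the example above.
```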
Several profiles are available to manipulate the weighting placed on history.
In the univariate profile, users can use a weighting profile. This profile contains
weighting factors, in the form of percentages, for some or all of the historical
values on which the forecast is based. Otherwise, the field remains blank.
The trend dampening profile allows you to account for a decrease in trend as the
market reaches maturity. The system cannot predict such a decrease on the basis of
historical data. This profile specifies, per period, the percentage by which you want
the trend to be dampened. If you are using a trend or a seasonal trend model and
want the forecasted trend pattern to be dampened, enter a trend dampening profile.
The selected time series profile provides a set of selected historical values that are
applied to the historical input key figure. The markers refer to historical values
that should be taken into account in outlier correction. They are stored in the
database as a time series. If the marked values lie outside the tolerance lane, the
system corrects them before carrying out the forecast.

To generate more exact forecasts, you must remove the impact of one-time
promotions or delivery problems from the actual data. The adjustments are
normally made in the corrected history key figure instead of the original key
figure.

During each forecast run, a corrected history can be created by modifying the
original historical values read from the InfoCube/liveCache. These changes may
be brought about, for example, by outlier correction, workday corrections, phase
in/out profiles and the like. This corrected history can then be used for the actual
forecast. The data from the corrected history can be saved if you have defined,
for the forecast key figure in the planning area, which key figure is the corrected
history, and if this key figure exists in the planning book. You can then access
this data for subsequent forecasts.
The normal process of a forecast is: read the historical data, create the corrected
history, and execute the forecast with the corrected history. You can set the
indicator 'Read corrected data history from plng version'. The system then reads
the historical values from the key figure that is defined as the corrected history of
the forecast key figure. These historical values can be seen in interactive planning
in the additional row, Original corrected history. You can continue to change
them and later store them when creating the new corrected history.
In the forecast profile, you can define whether you want the forecast to be based
on original actual data or corrected actual data.
Automatic adjustment of the corrected history has the following uses:
- If no forecasts have been made for a long time, you can use automatic
adjustment of the corrected history key figure.
- For periods where there is no corrected history in the database, the system
uses the original history. For example, it is July 1 and you have not created
any forecasts over the past two months. The corrected history from before
that contains manual corrections that you do not want to lose. For this
reason, your corrected history does not contain values for May or June. If
the system were to base the forecast on the stored corrected values, the
forecast results would be inaccurate because of the zero values for May and
June. To prevent such inaccuracies, you set the automatic adjustment of the
corrected history key figure. The system then uses the original historical
values for May and June.
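The adjustment logic can be sketched as follows. The function name is hypothetical, and `None` stands in for a period with no stored correction.

```python
# Automatic adjustment of corrected history: fall back to the original
# history wherever no corrected value was ever stored.

def adjust_corrected_history(original, corrected):
    return [c if c is not None else o
            for o, c in zip(original, corrected)]

original  = [100, 110, 120, 130]    # Mar, Apr, May, Jun actuals
corrected = [95, 108, None, None]   # May and June were never corrected
usable = adjust_corrected_history(original, corrected)
# usable == [95, 108, 120, 130]: manual corrections kept, gaps filled
```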

You can select outlier correction in the univariate forecast profile to automatically
correct outliers in the historical data on which the forecast is based. If you have set
the outlier correction indicator, the system first calculates an ex-post forecast
with the selected forecasting technique. In the next step, the system calculates a
tolerance threshold T, where T = sigma * MAD. Here, the final MAD, that is, the
MAD from the last historical period, is used for all historical values V. All
historical values V that violate the condition (ex-post - T) < V < (ex-post + T)
are set to the value of the ex-post forecast. The system then carries out the final
ex-post forecast and forecast with the corrected historical values. The values
determined in this process for the ex-post forecast can, of course, deviate from the
original values. This explains a deviation between the values corrected by the
outlier correction and those displayed by the ex-post forecast.
The sigma factor defines the width of the tolerance range for automatic outlier
correction. It defines the permissible number of standard deviations. A small
sigma factor means a low tolerance, and a large number of outliers that are
detected and corrected. The default sigma factor is 1.25. If you set the sigma
factor yourself, SAP recommends that you set it to between 0.6 and 2.
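The correction step can be sketched as follows. This is a simplification: a plain mean absolute deviation over the whole history stands in for SAP's final MAD from the last historical period, and the ex-post series is supplied by the caller rather than computed by a forecasting technique.

```python
# Outlier correction against a tolerance lane of ex-post +/- sigma * MAD.

def correct_outliers(history, expost, sigma=1.25):
    # Simplification: mean absolute deviation over all periods, not SAP's
    # smoothed MAD from the last historical period.
    mad = sum(abs(v - e) for v, e in zip(history, expost)) / len(history)
    tol = sigma * mad
    return [e if not (e - tol < v < e + tol) else v
            for v, e in zip(history, expost)]

history = [100, 102, 98, 200, 101]   # one promotional spike at 200
expost = [100] * 5                   # flat ex-post forecast for illustration
corrected = correct_outliers(history, expost)
# corrected == [100, 102, 98, 100, 101]: only the spike is replaced
```

A smaller sigma narrows the lane and replaces more values; the default of 1.25 leaves modest fluctuations untouched while catching large spikes.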
