Paper 0066V1-2
Contents
1. Analysing the risks in construction projects
2. SERC recommendations
2.1 Contract strategy
2.2 Risk assessment
2.3 Risk allowance
2.4 Risk management
3. Risk allocation
4. Bearing the risk
5. Methods of dealing with risk
5.1 The qualitative risk analysis stage
5.2 The quantitative risk analysis stage
5.3 Sensitivity and probability analyses
5.4 Decision trees
6. Simulation
6.1 Monte Carlo simulation
6.2 Probability Density Distribution (PDD)
This paper was written by Roger Waterhouse MSc FCIOB MIMgt MAPM MSIB
Some argue that there is a difference between risk and uncertainty; others maintain
that the distinction is unimportant. Those who hold the former view believe that risk
involves an assessment based upon historical data or experience: a decision is made
on the probability of a particular event occurring; in other words, a forecast is made
from past data and therefore with some degree of certainty. Uncertainty, on the other
hand, arises when no historical data or experience exists.
2 SERC recommendations
According to a SERC report, Risk Management in Engineering
Construction (1986), some of the main recommendations for dealing with risks are
as follows:
- All too often risk is either ignored or dealt with in an arbitrary way: simply adding a 5 percent contingency on to the estimated cost of a project is typical.
- The greatest uncertainty is during the early stages of a project, which is also when decisions of greatest impact are made. Risk must be assessed and allowed for at this stage.
- The client's departments and advisers should operate as a single team to avoid the institutional risk of incomplete commitment and inconsistent decisions.
- Flexibility in project design and the risk of later changes should be considered in detail before completing proposals for sanctioning.
- Risk techniques are widely used in other industries. The techniques are now well within the reach of small companies, requiring only a microcomputer to be put into action.
- The analysis should be carried out by those trained to do so, jointly with project planners and cost estimators.
- Delay in completion can be the greatest cause of extra cost, inconvenience and loss of financial return. The first estimate of cost benefits should be based on a realistic programme for a project, so that the potential effects of delays can be predicted realistically.
- For high-risk contracts, project sponsors should specify the allocation of risk when inviting bids and require tenderers to state their provision for risk (by way of available resources) in their bids. Project sponsors should also consider selecting the contractor on the basis of minimum acceptable risk rather than lowest price. Risk analysis allows such a criterion to be used.
- Clients and all parties involved in construction projects and contracts benefit greatly from reduction in uncertainty prior to their financial commitment. Money spent early buys more than money spent late. Willingness to invest in anticipating risk is a test of a client's wish for a successful project.
- Much can be learned about the implications and management of project risk without extensive numerical analysis. Risk analysis is essentially a brainstorming process of compiling realistic forecasts and answers to "what happens if?" questions.
In conclusion, risk identification involves indicating all potential risks the project
might face and assessing their impact and probability of occurrence to decide which
risks need to be managed.
No. 1 - Event: theft of materials
    Possible causes: lack of compound; absence of nightwatchman; store left unlocked; valuable items; poor procedures; small size
    Possible effects: disruption during investigations; delay to project; rise in insurance premiums; reduction in morale on site; labour and plant idle; reduction in workload

No. 2 - Event: concrete pump failure
    Possible causes: lack of maintenance; lack of cleaning; concrete too stiff; aggregate too high; lift too high
    Possible effects: concrete sets in pipe; delay to programme; increased cost to contractor; need to replace pump; removal of concrete already placed; additional stop end required in pour
3 Risk allocation
It is clear, therefore, that risks and their effects should be considered throughout the
project by all the parties involved.
In general, the risk should be allocated to these parties on a solid basis of
responsibility and control in a manner likely to optimise project performance. The
burden of responsibility for the allocation of risks should rest with the project
manager as the client's representative, although this will depend to a certain extent
upon his delegated authority.
Classification of sources of risk may be made under the following headings: design,
construction, environmental, financial, legal, operational, political, and physical.
Ideally, the allocation of risk should be done by the client through the contract
documents. However, this does mean that a full analysis of all identifiable risks
should be executed at the very start of a contract. Some might argue this is not
altogether feasible with modern fast-track projects; nevertheless, the principle
remains true that as much of the analysis as possible should be carried out as early as
possible.
The allocation of risk will depend on the type and conditions of contract.
Traditionally, it is the main contractor to whom the largest part of the risk is
allocated. However, perhaps the greater concern is whether the client and/or
consultants have ascribed responsibilities to the contractor which are not properly
within his scope or control. If so, it will invariably result in tension between the
parties.
During this stage, each risk is assessed in terms of its probability and magnitude or size.
Quite often the number of risks that have a high probability of occurrence and a high
impact is not great in construction projects. However, sometimes there is a long list of
the high-probability, low-impact risks to be considered and this can make the risk
analysis difficult. On the other hand, a small number of risks sometimes accounts for
the majority of the total risk. For example, consideration of seven large risks might
cover as much as 85 percent of the total. This makes the risk analysis easier and
enables the contractor or analyst to concentrate his attention on these relatively few
critical sources of risk.
The allocation of risks should be based on a thorough appraisal of the relationship
between the respective party and each risk. Incentives and risk go together: a party
that carries a risk has the incentive to minimise its impact. The basis for allocation
should therefore reflect each party's responsibility for, and control over, each risk.
Whatever principle governs the allocation of risks between contractual parties, there
is always a danger that the allocation has not been done properly, so a risk whose
allocation is unclear can occur and cause disputes. The fewer the unallocated risks,
the lower the probability of dispute between the parties.
The traditional method for allocating risks in the construction and real-estate
industry is through the contract documents and conditions of contract.
It is often possible to reduce risks, and this should be attempted before the allocation
is made. On most projects, certain risks can be reduced at relatively little cost. The
right time to consider possibilities of risk avoidance or reduction is the early stage of
a project. For example, the risk associated with uncertain ground conditions is always
high. It can be reduced by drilling a large number of exploratory boreholes.
It is also possible to transfer a risk to another party by contract, whether by
insurance or by its inclusion within the implementation contract (see Figure 4).
FIGURE 3 Risk categories

Figure 3 groups the sources of risk under four headings:
- Investment
- Development:
  - Financial (funding): funding collapse (see Investment); project abandonment; delay to project; cost increase; interest rate changes
  - Commercial: viability of development; market changes; national economy
  - Design: buildability; errors; changes; design choice; stability
  - Construction: programme; planning; design changes; site investigation; material shortages; safety; cost increase; plant breakdown; new techniques; resource shortages; statutory authorities; penalties; town planning; weather
  - Operational: maintenance; environmental; safety; energy; location
- Political:
  - National & EU: taxation; legislation; environmental; grants; planning; governmental stability
  - International: joint ventures; governmental; consortium; foreign aid; labour movement; controls; governmental stability
- Disaster: explosion; earthquake; flood; rain; lightning; snow/ice; subsidence; fire; hurricane
In general, a risk should be allocated to a party if the party can cover the risk by
insurance and it is reasonable for the risk to be dealt with in this way, or if the risk
is of loss due to the party's own misconduct or lack of reasonable care. The party
bearing a risk may then continue to bear and manage it for profit, while accepting the
liabilities, or, if vulnerable, try to recover costs from other parties, including the
client.
FIGURE 4 Suggested response by probability and consequence of risk

Probability \ Consequence   Minor           Moderate            Major                Critical
                                            (£1,000-£10,000)    (£10,000-£100,000)   (over £100,000)
Most improbable             Accept          Insure              Insure               Insure
Improbable (rare)           Accept          Insure              Insure               Insure
Possible                    Accept          Insure              Insure               Change method/design
Probable                    Partial insure  Insure              Insure               Change method/design
Most probable (common)      Partial insure  Insure              Insure               Change method/design
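A matrix of this kind is easy to encode as a lookup table. The sketch below is purely illustrative (the function and variable names are mine); the probability and consequence labels and the action strings follow the matrix above.

```python
# Hypothetical sketch of the Figure 4 risk-response matrix as a lookup table.
# Rows are probability classes, columns are consequence classes.

CONSEQUENCES = ["minor", "moderate", "major", "critical"]

RESPONSE = {
    "most improbable":        ["accept", "insure", "insure", "insure"],
    "improbable (rare)":      ["accept", "insure", "insure", "insure"],
    "possible":               ["accept", "insure", "insure", "change method/design"],
    "probable":               ["partial insure", "insure", "insure", "change method/design"],
    "most probable (common)": ["partial insure", "insure", "insure", "change method/design"],
}

def response(probability: str, consequence: str) -> str:
    """Return the suggested response for a probability/consequence pair."""
    return RESPONSE[probability][CONSEQUENCES.index(consequence)]

print(response("possible", "critical"))  # change method/design
```

Encoding the matrix this way makes the suggested response reproducible and auditable rather than a judgement made afresh each time.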
Risk analysis also helps by directing the user to select his preferred contract
strategy, drawing attention to the differences in the allocation of risk between the
respective strategies.
The process involves compiling a list of the main risk sources with a description of
their likely consequences:
- Determine possible solutions via brainstorming sessions with the project team.
This process not only helps to examine the potential problem areas with a project, but
it also brings considerable benefits in terms of understanding the project. It helps to
focus the respective minds of the project team members by provoking thought about
the management responses to the risks. A good understanding between the team
members is also a useful side effect.
Assessments of cost and time improve as the project proceeds, but it should always be
appreciated that the most significant decisions are made during the early stages of a
project. Therefore a realistic estimate of the final cost and project duration is required
as early in the project as possible. It is at this stage that all the potential uncertainties
and risks likely to affect the project should be identified and hopefully fully assessed.
This will also encourage the project manager and his project team to concentrate on
strategies for controlling the risks as well as determining the allocation of risk to the
respective parties. Furthermore, it will help identify the additional design work and
resources most likely to be needed.
One method for considering project risks is to analyse any risk independently of
others, with no attempt to estimate the probability of occurrence of that risk. The
estimated effects of each risk can then be accumulated to provide maximum and
minimum project outcome values. In other words neither a subjective nor an
analytical value is given regarding the probability of occurrence of the risk event.
Instead, each risk event is compounded to determine the possible effect upon the
project and then, by applying a range of maximum (critical) to minimum (minor; see
Figure 4) project outcomes (consequences), the full extent of the particular risk can be
seen.
Sensitivity analysis
This is a technique used to consider the effect upon the project of changes to those
events which are deemed a potential risk to the project. A sensitivity analysis requires
calculation of the effects on the project for a range of values of the event changes.
The effect on the project (project outcome) is usually expressed in terms of NPV,
IRR, time or final cost.
For example, suppose one of the risks on a project is the cost of steel, which could
increase by 2, 5, 10 or 15 percent. The project outcome is then evaluated for each of
these potential cost changes, and the results can be plotted on a graph showing the
percentage variation (risk v. cost change).
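As an illustrative sketch of this calculation, suppose (these base figures are invented for the example, not taken from the paper) a total project cost of 10m of which steel accounts for 20 percent; the outcome can then be evaluated for each candidate change:

```python
# Illustrative sensitivity calculation; all base figures are assumed.
base_cost = 10_000_000      # assumed total project cost
steel_share = 0.20          # assumed fraction of cost attributable to steel

for change in (0.02, 0.05, 0.10, 0.15):   # 2, 5, 10, 15 percent rises in steel price
    new_cost = base_cost * (1 + steel_share * change)
    variation = 100 * (new_cost - base_cost) / base_cost
    print(f"steel +{change:.0%}: project cost {new_cost:,.0f} ({variation:+.1f}%)")
```

Plotting the variation against the size of each change gives one leg of the spider diagram discussed next.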
The results of a sensitivity analysis can be shown graphically on a spider
diagram (see Figure 6). This example is based upon analysing the possible costs to
the contractor of a contract to construct a motorway. The diagram shows the results of
calculating the sensitivity of the cost to changes in each risk which could affect
productivity on site. For instance, it indicates the effect upon the overall project cost
of a decrease in drain-laying output (an increase in the duration variable, which raises
costs).
A sensitivity analysis is very useful because often the effect of a small change in one
variable (a cost or a duration, for example) produces a marked difference in the
project outcome. When several risks are being assessed in this way, a spider
diagram provides an effective way of demonstrating risks which are most critical and
sensitive. These are the ones the project manager must act upon.
Such an analysis can be performed for all the risks and uncertainties which may affect
a project in order to identify those which have a large impact on the cost, time or
whatever the objectives are. This procedure may be used to identify the variables to
be considered for carrying out a probability analysis see below.
One problem with a sensitivity analysis, however, is that each risk is considered
independently with no attempt made to quantify their probabilities of occurrence.
This procedure is also limited because in reality a variable would not change without
other project factors changing and this is not shown in the sensitivity analysis.
Eventually, when the user has gained sufficient practice, the number of risks in need
of consideration can be reduced because those which have a large impact on the
project tend to become more easily identifiable.
Probability analysis
This is a technique which extends beyond the limitations of a sensitivity analysis by
specifying a probability distribution for each risk and then assessing the combined
effect of the risk events. However, careful interpretation of the results is essential. One
important stage in this type of risk analysis is assessing the range of probabilities
which could result.
Monte Carlo simulation (random sampling) can be used where direct analytical
calculation of the combined distributions would be difficult or impossible. The
procedure may be used in a probability analysis as follows:
1. The variations to the risks being considered are assessed and a suitable
probability distribution of each risk is selected; then
2. For each risk a value within its specified range is selected. The value should be
randomly chosen and be within the estimated probability distribution.
3. To establish the outcome for the project, a calculation is made based on the
combined values for each risk.
4. The process in (3) is repeated many times in order to produce the probability
distribution of the project outcome.
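The four steps above can be sketched in a few lines of code. The risk names and their triangular cost distributions below are invented purely for illustration:

```python
import random

# Sketch of steps 1-4: each risk is given an assumed probability distribution
# (here triangular: low, most likely, high extra cost), a value is sampled for
# each risk, the values are combined into a project outcome, and the process
# is repeated many times to build up the outcome distribution.
risks = {
    "ground conditions": (0, 50_000, 200_000),   # assumed extra-cost ranges
    "steel price rise":  (0, 20_000, 100_000),
    "weather delay":     (0, 30_000, 150_000),
}

def one_iteration() -> float:
    # Steps 2-3: sample each risk within its distribution and combine.
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in risks.values())

# Step 4: repeat to produce the distribution of the project outcome.
outcomes = sorted(one_iteration() for _ in range(10_000))
print("median extra cost:", outcomes[len(outcomes) // 2])
```

In practice the choice of distribution for each risk matters as much as the sampling itself, which is why careful interpretation of the results is stressed above.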
Note that, by convention, a square node depicts a choice to be made, whilst circular
nodes correspond to chance events.
To shed light on the practical problem posed, one method is to evaluate directly the
expected value of the profit involved at each node. This criterion says that the
expected value, or worth, of a profit of x that materialises with probability p is (px).
Thus, the expected value at each circular node is the sum of all the (px) quantities.
Note that the probabilities p on the branches leaving any one chance node must sum
to 1, since precisely one branch must occur. (Recall that the probability of a certain
event is, by definition, 1.) To find expected values at each square node, it is necessary
to subtract the costs associated with the particular branch. By working backwards
through the entire decision tree, it is straightforward to find the most appropriate
decision at each node simply by choosing the branch that maximises expected profit
at every step. This enables the best overall strategy to be implemented, or at least
gives a more solid basis for decisions than intuition alone.
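This backward pass is easy to express recursively. The sketch below applies it to the expansion sub-decision quoted in the worked example that follows (payoffs of 14 and 10 with probability 0.5 each and an expansion cost of 4, against 8 and 6 if not expanding, all in £m); the data layout is mine.

```python
# Sketch of expected-value rollback on a decision tree.
# A chance node holds (probability, subtree) pairs; a decision node holds
# (branch_cost, subtree) options; a leaf is a terminal profit figure.

def expected_value(node):
    kind = node[0]
    if kind == "leaf":                      # terminal profit
        return node[1]
    if kind == "chance":                    # sum of p * value over branches
        return sum(p * expected_value(sub) for p, sub in node[1])
    if kind == "decision":                  # best branch net of its cost
        return max(expected_value(sub) - cost for cost, sub in node[1])

tree = ("decision", [
    (4.0, ("chance", [(0.5, ("leaf", 14.0)), (0.5, ("leaf", 10.0))])),  # expand
    (0.0, ("chance", [(0.5, ("leaf", 8.0)),  (0.5, ("leaf", 6.0))])),   # do not expand
])
print(expected_value(tree))  # 8.0: expanding is worth 12 - 4, against 7
```

Larger trees are handled by the same recursion; only the nested data structure grows.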
EXAMPLE (continued)
Figure 8 shows the expected values inserted into the decision tree. The uppermost circular
node has expected value £8,250k, derived from £12m x 0.25 + £7m x 0.75. The second
decision node in chronological order (should the warehouse be expanded later or not?) has
expected value £8,000k, being the larger of £8m x 0.5 + £6m x 0.5 = £7m, if not
expanding, and £14m x 0.5 + £10m x 0.5 - £4m = £8m, if choosing to expand.
Continuing, the first chronological decision can be seen to be between expected profits of
£1,250k for building the large warehouse immediately and £2,250k for the more
cautious "wait and see" option.
Hence, application of the decision tree method strongly supports building the smaller
warehouse for now in this example.
There are some obvious limitations to this technique. Above all, used naively, it
requires one to put faith in the accuracy of all the probability figures and cost
estimates. If any of these happen to be seriously wrong, so too might be the
recommended decision(s). How, in practice, is this surmounted?
The answer is to conduct a sensitivity analysis, similar to that described in risk
analysis above, especially if there are grounds for reasonable uncertainty in the
figures given in the decision tree. Such grounds may be because not all values are
based on well-researched, historic data. Clearly, if costs and probabilities are no more
than wild guesses, the method, used simplistically, has little more to offer than blind
intuition. Nevertheless, the method comes into its own when large and complex
decisions are broken into a series of smaller ones, any of which can be subjected to
minor changes in associated numeric values to assess its overall impact. For instance,
in the above example, one could consider best and worst case scenarios and see what
difference may be made to the initial decision about size of warehouse.
Fortuitously, it may turn out that the identical initial decision is recommended
regardless of quite major changes to probabilities and/or expected profits further
down the tree. Even if not, one can ask questions like: if all but the (say) £14m figure
were unchanged from Figure 7, how large or small would that figure need to be in
order to maintain the overall best choice of building a smaller warehouse to begin
with? Equally, one might pose the question in terms of how likely it must be, instead
of the presumed 0.25 probability, for demand to increase sufficiently to justify
building the larger warehouse initially (again, all other things besides the 0.75
complementary probability being kept equal).
Another important consideration in the use of this technique is the appropriateness or
otherwise of the expected value criterion. As a rule of thumb, the more complex the
tree, the better this criterion becomes. A typical structure of a complex tree is
illustrated in Figure 9. The reason why this rule holds is because, with small trees
involving just a handful of decision nodes and probabilistic outcomes, there is not
really scope for expected value to have direct meaning. Again, referring to the above
example, one could say that building the large warehouse now gives rise to either a
£5m net profit or breaking even (these being, respectively, £12m - £7m and
£7m - £7m). Having simplified the state of affairs to, in effect, just high and low
demand, neither outcome would yield the implied £1,250k. It may be helpful to recall,
analogously, that one difference between a sample mean and a median is that the
former may not be an attainable value (e.g. who has 2.4 children?) whereas, in
general, the latter will be.
As a further use of decision trees, one can employ them to make judgements
involving other criteria besides expected value. Specifically, it may be in a party's
best interests to use a criterion that minimises the worst loss (maximin, which
maximises the minimum payoff) or, perhaps more speculatively, maximises the
highest profit (maximax), and there are circumstances when it is more prudent to
apply such a criterion in place of expected value. However, as indicated, for more
complex decision trees, it is usually sensible to use expected value, as in the longer
run, overestimates and underestimates of expected values tend to balance out,
assuming importantly there is no systematic bias in the allocation of costs and
probabilities.
In summary, decision trees can provide a powerful technique for convincing oneself,
and others, about the most suitable course of action when faced with uncertainty.
This graphical representation sometimes makes the solution obvious and, by the
addition of estimated costs, values of outcomes and probabilities, provides a basis for
analysing complex problems. Hence this technique can help clarify and communicate
the options available to the project manager.
6 Simulation
6.1 Monte Carlo simulation
Building and analysing a simulation model will be more effective if a substantial
bank of data exists or can be generated. Monte Carlo simulation makes use of random
numbers to generate random data based on known facts or observations.
Tables of random numbers, as typically found in any statistics textbook, are produced
by computer using various algorithms which ensure that all numbers from 00 to 99
(usually) have equal probability of occurring at any point in the tables. When using
random number tables it is important not to introduce bias: the numbers should be
used strictly in order, as shown in the examples.
Throughout this section, just a single iteration of the simulation process is described.
In practice, a computer would perform hundreds or thousands of such iterations and
produce a summary of all the results. This summary is quite likely to reflect reality,
provided of course that the initial assumptions in the simulation model are valid.
Any simulation model should have a factual basis from which probabilities can be
determined. Random numbers used for the generation of data are allocated in
accordance with the probabilities.
EXAMPLE 1
Consider a stock control situation where the demand for a particular product varies from
day to day. Observations show that the demand per day recorded over a period of 100
days is as follows:
Demand per day (items):   0     1     2     3     4     5     6
Number of days:           2     8    22    34    18     9     7
Probability:            .02   .08   .22   .34   .18   .09   .07
Random numbers are allocated in accordance with the probabilities: for each
probability point, one random number is allocated. Thus, since the probability of a
demand of zero items per day is 0.02, two random numbers must be allocated, namely
00-01. The next eight numbers, 02-09, are allocated to a demand of one item per day,
and so on.
The allocation of random numbers may be simplified by considering the cumulative
probabilities as follows:
Demand per day:            0      1      2      3      4      5      6
Probability:             .02    .08    .22    .34    .18    .09    .07
Cumulative probability:  .02    .10    .32    .66    .84    .93   1.00
Random numbers:        00-01  02-09  10-31  32-65  66-83  84-92  93-99
Note the numerical relationship between the cumulative probabilities and the
allocated random numbers.
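This allocation rule is mechanical and can be sketched as a short routine: `bisect` locates the cumulative band into which a two-digit random number falls (leading-zero numbers such as 06 are written as plain integers here).

```python
from bisect import bisect_right

# Demand PDD from the example: cumulative probabilities mark the upper end of
# each random-number band (00-01, 02-09, 10-31, 32-65, 66-83, 84-92, 93-99).
demands    = [0, 1, 2, 3, 4, 5, 6]
cumulative = [0.02, 0.10, 0.32, 0.66, 0.84, 0.93, 1.00]

def simulate_demand(random_number: int) -> int:
    """Map a two-digit random number (00-99) to a demand via the cumulative PDD."""
    return demands[bisect_right(cumulative, random_number / 100)]

for rn in (84, 28, 64, 49, 6, 75, 9, 73):
    print(rn, "->", simulate_demand(rn))   # 5, 2, 3, 3, 1, 4, 1, 4
```

The same routine serves any PDD in this section; only the value and cumulative-probability lists change.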
Using the probability density distribution
Data simulating the demand per day are generated by reading random numbers from
the tables and looking up the corresponding demand in the PDD. For example:

Random number:  84   28   64   49   06   75   09   73
Demand:          5    2    3    3    1    4    1    4
In order to simulate the stock control problem, a second PDD is required for the
supply or delivery of goods. The recorded delivery times following placing an order,
over 70 deliveries, are as follows:

Time to delivery (days):       2      3      4      5   Total
Frequency:                    13     35     17      5      70
Probability (e.g. 13/70):    .19    .50    .24    .07
Cumulative probability:      .19    .69    .93   1.00
Random numbers:            00-18  19-68  69-92  93-99
Using the demand and delivery PDDs shown above, run a Monte Carlo simulation
over a period of 15 days for a stock control situation where the buffer stock (the level
at which an order is placed) is 10, initial stock is 15, and the re-order quantity is 20.
Day   Demand RN   Demand   Delivery RN   Stock level
 0                                           15
 1       84          5          28           10   (order placed; delivery in 3 days)
 2       64          3                        7
 3       49          3                        4
 4       06          1                       23   (delivery of 20 received)
 5       75          4                       19
 6       09          1                       18
 7       73          4                       14
 8       49          3                       11
 9       64          4          93            7   (order placed; delivery in 5 days)
10       39          3                        4
11       89          5                        0   (out of stock)
12                                            0   (out of stock)
13                                            0   (out of stock)
14       77          4                       16   (delivery of 20 received)
15       86          5                       11

Total stock held over the period: 159 item-days.
We could, of course, add in the various costs associated with the ordering and storage
of stock, and by varying buffer stock level, re-order quantity etc a number of
simulations could be carried out to optimise these values.
Queuing models can also be investigated using simulation and decisions regarding the
number of service points to be installed can be arrived at economically.
EXAMPLE 2
Aggregates are delivered by rail to a large civil engineering contractor. The trains are
believed to arrive according to a well-known standard probability distribution called
Poisson, with a mean of 1.25 per day. Two unloading services are available for a
maximum of seven hours per day each and the unloading frequencies applicable to both
are shown below:
Unloading time per train (hours):   5    6    7    8    9   Total
Frequency:                         15   23   36   41   25     140
a. Use Monte Carlo simulation to predict the arrivals pattern and utilisation of the
unloading facilities over a 10-day period.
b. Is a further unloading point justified?
Using a standard formula (that is beyond the scope of this course) for the arrival
probabilities, the following PDDs can be drawn up:
Arrivals PDD

Arrivals per day:                   0      1      2      3      4      5
Cumulative Poisson probability:  0.29   0.64   0.87   0.96   0.99   1.00
Random number allocation:       00-28  29-63  64-86  87-95  96-98     99

Unloading PDD

Unloading time (hours):             5      6      7      8      9   Total
Frequency:                         15     23     36     41     25     140
Cumulative probability:          0.11   0.27   0.53   0.82   1.00
Random number allocation:       00-10  11-26  27-52  53-81  82-99
EXAMPLE 2 (continued)
Simulation, using a particular sequence of random numbers, gives:

Day   RN   Arrivals   Unloading RN   Unloading time (hours)
 1    56       1           77                  8
 2    39       1           73                  8
 3    69       2         96, 89              9, 9
 4    11       0
 5    80       2         49, 36              7, 7
 6    11       0
 7    19       0
 8    03       0
 9    51       1           77                  8
10    01       0
                           Total:             56

Of the 56 unloading hours, facility No 1 handles 32 hours and No 2 handles 24 hours;
each point is available for 7 hours per day, so work on a train can spill into the
following day.

Utilisation factor = (32 + 24) / (2 x 7 x 10) = 0.4
EXAMPLE 3
Question:
A digital simulation model is required for forecasting the likely financial performance of a
company offering a specialist technical service to the building industry. The available data
are given in Tables 1 and 2 below.
a. Simulate a set of 12 monthly sales income values. Use the random number
sequence 3, 47, 43, 73, 86, 36, 96, 47, 36, 61, 46, 98 for this purpose.
b. Simulate a set of 12 monthly cost values. The random number sequence 16, 22, 77,
94, 39, 49, 54, 43, 54, 82, 17, 37 is to be used.
c. Define the remaining steps in the sequence of calculations which must be
performed in order to complete the simulation.
d. Comment on the apparent situation of the company and make suggestions for
improvement.
DATA

Table 1
Gross monthly income (£000s):   6   8  10  12  14  16  18  20  22  24
Observed frequency:             1   4   6   3   2   1   2   1   7   3

Table 2
Total monthly cost (£000s):    10  12  14  16  18  20  22
Observed frequency:             1   7   6   5   4   4   3
Answer:
PDD - Gross monthly income

Income (£000s)   Frequency   Probability   Cumulative probability   Random numbers
      6              1           .03               .03                   00-02
      8              4           .13               .17                   03-16
     10              6           .20               .37                   17-36
     12              3           .10               .47                   37-46
     14              2           .07               .53                   47-52
     16              1           .03               .57                   53-56
     18              2           .07               .63                   57-62
     20              1           .03               .67                   63-66
     22              7           .23               .90                   67-89
     24              3           .10              1.00                   90-99
  Total             30          1.00
EXAMPLE 3 (continued)
PDD - Monthly cost

Cost (£000s)   Frequency   Probability   Cumulative probability   Random numbers
    10             1           .03               .03                   00-02
    12             7           .23               .27                   03-26
    14             6           .20               .47                   27-46
    16             5           .17               .63                   47-62
    18             4           .13               .77                   63-76
    20             4           .13               .90                   77-89
    22             3           .10              1.00                   90-99
 Total            30          1.00

Simulation (£000s):

Month   Sales RN   Sales   Cost RN   Cost   Profit/Loss
  1        03        8       16       12        -4
  2        47       14       22       12        +2
  3        43       12       77       20        -8
  4        73       22       94       22         0
  5        86       22       39       14        +8
  6        36       10       49       16        -6
  7        96       24       54       16        +8
  8        47       14       43       14         0
  9        36       10       54       16        -6
 10        61       18       82       20        -2
 11        46       12       17       12         0
 12        98       24       37       14       +10
                                      Total:    +2
This completes parts (a) and (b). For (c), you need to describe how this process would be
repeated many times over, by computer, using different random number sequences each
time. The summarised results would then reflect reality quite well, as long as the initial
model assumptions are plausible and sensible.
Finally, for (d), some hypothetical discussion can be given, for example: "Suppose
the average of many simulations turned out to show, as in (c), a small profit of £2,000 per
annum for the company . . .".
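The repetition described in (c) can be sketched directly: the income and cost PDDs from the answer are sampled over many simulated years and the results summarised. The replication count of 10,000 is arbitrary, and the rounded cumulative probabilities are taken as given.

```python
import random
from bisect import bisect_right

# Income and cost PDDs from Example 3 (values in £000s).
INCOME = ([.03, .17, .37, .47, .53, .57, .63, .67, .90, 1.00],
          [6, 8, 10, 12, 14, 16, 18, 20, 22, 24])
COST   = ([.03, .27, .47, .63, .77, .90, 1.00],
          [10, 12, 14, 16, 18, 20, 22])

def sample(pdd):
    cumulative, values = pdd
    return values[bisect_right(cumulative, random.random())]

def annual_profit():
    """Profit over one simulated 12-month year, in £000s."""
    return sum(sample(INCOME) - sample(COST) for _ in range(12))

runs = [annual_profit() for _ in range(10_000)]
print("mean annual profit (£000s):", sum(runs) / len(runs))
```

The spread of the `runs` list, not just its mean, is the useful output: it shows how often the company would make a loss, which is the natural starting point for the discussion asked for in (d).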