A decision tree is a decision support tool that uses a tree-like model of decisions
and their possible consequences, including chance event outcomes, resource costs, and
utility. It is one way to display an algorithm. Decision trees are commonly used in
operations research, specifically in decision analysis, to help identify the strategy most
likely to reach a goal. Another use of decision trees is as a descriptive means for
calculating conditional probabilities. When the decisions or consequences are modelled
by computational verbs, the decision tree is called a computational verb decision
tree [1].
General
In decision analysis, a "decision tree" (and the closely related influence diagram) is
used as a visual and analytical decision support tool, in which the expected values (or
expected utility) of competing alternatives are calculated.
Decision trees have traditionally been created manually, as the following example shows.
A decision tree consists of three types of nodes:
• Decision nodes, typically represented by squares
• Chance nodes, typically represented by circles
• End nodes, typically represented by triangles
Drawn from left to right, a decision tree has only burst nodes (splitting paths) but no sink
nodes (converging paths). Used manually, it can therefore grow very large and become
hard to draw fully by hand.
Analysis can take into account the decision maker's (e.g., the company's) preference or
utility function, for example:
The basic interpretation in this situation is that the company prefers B's risk and payoffs
under realistic risk-preference coefficients (greater than $400K; in that range of risk
aversion, the company would need to model a third strategy, "Neither A nor B").
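The figure this interpretation refers to is not reproduced here, but the role of a risk-preference coefficient can be illustrated with an exponential utility function U(x) = 1 - exp(-x/R), where R is the decision maker's risk tolerance. The two strategies' probabilities and payoffs below are invented for illustration:

```python
import math

def expected_utility(outcomes, risk_tolerance):
    """Expected exponential utility U(x) = 1 - exp(-x / R) of a gamble,
    where `outcomes` is a list of (probability, payoff) pairs."""
    return sum(p * (1 - math.exp(-x / risk_tolerance)) for p, x in outcomes)

# Invented strategies: A is safer, B has a higher spread and a higher mean.
strategy_a = [(0.5, 400_000), (0.5, 0)]
strategy_b = [(0.5, 1_000_000), (0.5, -200_000)]

# A highly risk-averse decision maker (small R) penalizes B's possible loss
# heavily; a nearly risk-neutral one (large R) prefers B's higher mean.
prefers_b_when_averse = (expected_utility(strategy_b, 100_000)
                         > expected_utility(strategy_a, 100_000))
prefers_b_when_tolerant = (expected_utility(strategy_b, 2_000_000)
                           > expected_utility(strategy_a, 2_000_000))
```

With these invented numbers, the preference flips from A to B as the risk tolerance R grows, which is the kind of threshold behaviour the passage above describes.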
Decision trees, influence diagrams, utility functions, and other decision analysis tools and
methods are taught to undergraduate students in schools of business, health economics,
and public health, and are examples of operations research or management science
methods.
Advantages
Amongst decision support tools, decision trees (and influence diagrams) have several
advantages:
Decision trees:
• Are simple to understand and interpret. People are able to understand decision
tree models after a brief explanation.
• Have value even with little hard data. Important insights can be generated
based on experts describing a situation (its alternatives, probabilities, and costs)
and their preferences for outcomes.
• Use a white box model. If a given result is provided by a model, the explanation
for the result is easily replicated by simple math.
• Can be combined with other decision techniques. The following example uses
Net Present Value calculations, PERT 3-point estimates (decision #1), and a
linear distribution of expected outcomes (decision #2):
Example
Decision trees can be used to optimize an investment portfolio. The following example
shows a portfolio of 7 investment options (projects). The organization has $10,000,000
available for the total investment. Bold lines mark the best selection (projects 1, 3, 5, 6,
and 7), which will cost $9,750,000 and create a payoff of $16,175,000. All other
combinations would either exceed the budget or yield a lower payoff.[2]
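The article does not list the per-project costs and payoffs, so the figures below are invented, chosen so that the winning selection's totals match the $9,750,000 cost and $16,175,000 payoff quoted above. The search itself is a plain brute-force enumeration, which is perfectly adequate for seven projects:

```python
from itertools import combinations

def best_portfolio(projects, budget):
    """Try every subset of projects and keep the affordable one with the
    highest total payoff (brute force: 2**7 - 1 = 127 subsets here)."""
    best, best_payoff = (), 0
    for r in range(1, len(projects) + 1):
        for subset in combinations(projects, r):
            cost = sum(c for _, c, _ in subset)
            payoff = sum(p for _, _, p in subset)
            if cost <= budget and payoff > best_payoff:
                best, best_payoff = subset, payoff
    return best, best_payoff

# (number, cost, payoff) -- invented per-project figures, see note above.
projects = [(1, 2_000_000, 3_200_000), (2, 3_000_000, 3_100_000),
            (3, 1_500_000, 2_600_000), (4, 4_000_000, 4_100_000),
            (5, 2_500_000, 4_000_000), (6, 2_250_000, 3_375_000),
            (7, 1_500_000, 3_000_000)]
selection, payoff = best_portfolio(projects, budget=10_000_000)
```

With these invented figures the search selects projects 1, 3, 5, 6, and 7, mirroring the result described in the text: every other combination either exceeds the $10,000,000 budget or yields a lower payoff.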
A decision tree is a diagram that a decision maker can create to help select the best of
several alternative courses of action. The primary advantage of a decision tree is that it
assigns exact values to the outcomes of different actions, thus minimizing the ambiguity
of complicated decisions. Because they map out an applied, real-world logical process,
decision trees are particularly important to building "smart" computer applications like
expert systems. They are also used to help illustrate and assign monetary values to
alternative courses of action that management may take.
A decision tree emanates from a starting point at one end (usually at the top or on the left
side) through a series of branches, or nodes, until two or more final results are reached at
the opposite end. At least one of the branches leads to a decision fork or a chance fork.
The diagram may continue to branch as different options and chances are diagrammed.
Each branch is assigned an outcome and, if chance is involved, a probability of
occurrence.
EXAMPLE 1
A decision maker may determine that the chance of drilling an oil well that generates
$100,000 (outcome) is 25 percent (probability of occurrence). To solve the decision tree,
the decision maker begins at the right-hand side of the diagram and works toward the
initial decision branch on the left. The value of different outcomes is derived by
multiplying the probability by the expected outcome; in this example, the value would be
$25,000 (0.25 x $100,000). The values of all the outcomes emanating from a chance fork
are combined to arrive at a total value for the chance fork. By continuing to work
backwards through the chance and decision forks, a value can eventually be assigned to
each of the alternatives emanating from the initial decision fork.
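The backward pass described above (begin at the right, combine chance-fork branches by probability, and carry values back toward the initial decision fork) can be sketched as a small recursive rollback. The dictionary encoding of nodes here is an assumption made for this sketch, not a standard format:

```python
def rollback(node):
    """Work backwards from the outcomes on the right toward the initial
    decision fork on the left, assigning a value to every fork."""
    kind = node["type"]
    if kind == "outcome":
        return node["value"]
    if kind == "chance":
        # Chance fork: combine branch values, weighted by probability.
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        # Decision fork: the decision maker takes the best branch.
        return max(rollback(child) for child in node["branches"])
    raise ValueError(f"unknown node type: {kind}")

# The chance fork from the text: a 25% chance of a $100,000 well.
well = {"type": "chance",
        "branches": [(0.25, {"type": "outcome", "value": 100_000}),
                     (0.75, {"type": "outcome", "value": 0})]}
```

`rollback(well)` gives $25,000 (0.25 x $100,000), the value derived in the text; nesting chance and decision nodes lets the same function value an entire tree.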
If the decision is to drill, there are several potential outcomes, including (1) a 10 percent
chance of getting $300,000 in profits from the oil; (2) a 20 percent chance of extracting
$200,000 in profits; (3) a 10 percent chance of wresting $100,000 in profits from the
well; and (4) a 60 percent chance that the well will be dry and post a loss of $100,000 in
drilling costs.
[Figure 1: Simple Decision Tree for an Oil Drilling Investment]
Figure 1 shows the decision tree for this data. Multiplying the probability of each
outcome by its dollar value, and then combining the results, assigns an expected value to
the decision to drill of $20,000 in profits. Thus, the profit maximizing decision would be
to drill the well.
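The arithmetic behind that $20,000 figure can be checked in a few lines, using exactly the probabilities and profits given in the text:

```python
# Outcomes of the drill decision from the text: (probability, profit),
# with the dry-hole drilling costs entered as a negative profit.
drill_outcomes = [(0.10, 300_000),
                  (0.20, 200_000),
                  (0.10, 100_000),
                  (0.60, -100_000)]

# Multiply each probability by its dollar value and combine the results:
# 30,000 + 40,000 + 10,000 - 60,000 = $20,000 expected profit.
expected_value = sum(p * x for p, x in drill_outcomes)
```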
For the purposes of demonstration, suppose that the chance of hitting no oil was
increased from 60 percent to 70 percent, and the chance of gleaning $300,000 in profits
was reduced from ten percent to zero. In that case, the dollar value of the decision to drill
would fall to -$20,000. A profit-maximizing decision maker would then elect not to drill
the well. The effect of this relatively small change in the probability calculation
underscores decision trees' dependence on accurate information, which often may not be
available.
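The sensitivity change described above is easy to replicate: shifting the ten percent chance of the $300,000 outcome onto the dry-hole branch flips the sign of the expected value, using the same figures as the text:

```python
def expected_value(outcomes):
    """Probability-weighted sum of (probability, profit) pairs."""
    return sum(p * x for p, x in outcomes)

base = [(0.10, 300_000), (0.20, 200_000), (0.10, 100_000), (0.60, -100_000)]

# Move the $300,000 branch's 10% probability onto the dry-hole branch:
shifted = [(0.00, 300_000), (0.20, 200_000), (0.10, 100_000), (0.70, -100_000)]
```

`expected_value(base)` is about $20,000 while `expected_value(shifted)` is about -$20,000, matching the swing described in the text and showing how sensitive the answer is to the probability estimates.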
EXAMPLE 2
Unlike the hypothetical oil drilling decision, which essentially involved making one
choice based on the likelihood of several outcomes, many corporate decisions involve
making a more elaborate series of decisions. Consider a bank's decision of whether to
loan money to a consumer. A decision tree might be used in a few different ways to aid
this multistep process. In the simplest case, the tree might codify the bank's assessment
criteria for identifying qualified applicants. Figure 2 illustrates what such a criteria
process might look like in a decision tree. This simple tree requires the applicant to meet
certain standards (job stability, assets, cash flow) at each stage; it assumes that the
minimum standards are effective predictors of success.
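Figure 2 itself is not reproduced here, but a sequential pass/fail screening tree of the shape just described is easy to sketch. The field names and thresholds below are invented for illustration, not taken from the article:

```python
def qualifies(applicant):
    """Walk an applicant down a pass/fail screening tree: failing any
    stage (job stability, assets, cash flow) rejects the application."""
    if applicant["years_at_job"] < 2:           # job-stability stage
        return False
    if applicant["assets"] < 10_000:            # assets stage
        return False
    if applicant["monthly_cash_flow"] < 500:    # cash-flow stage
        return False
    return True

applicant = {"years_at_job": 3, "assets": 25_000, "monthly_cash_flow": 800}
```

Each `if` is one fork of the tree; an applicant who clears every stage reaches the "qualified" leaf, which is exactly the assumption the text flags: the minimum standards must themselves be good predictors of loan success.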
FURTHER READING: