
A decision tree is a decision support tool that uses a tree-like graph or model of decisions
and their possible consequences, including chance event outcomes, resource costs, and
utility. It is one way to display an algorithm. Decision trees are commonly used in
operations research, specifically in decision analysis, to help identify the strategy most
likely to reach a goal. Decision trees are also used as a descriptive means for calculating
conditional probabilities. When the decisions or consequences are modelled by
computational verbs, the decision tree is called a computational verb decision tree[1].

General
In decision analysis, a "decision tree" (and the closely related influence diagram) is
used as a visual and analytical decision support tool, in which the expected values (or
expected utility) of competing alternatives are calculated.

Decision trees have traditionally been created manually, as the following example shows.
A decision tree consists of three types of nodes:

1. Decision nodes - commonly represented by squares
2. Chance nodes - represented by circles
3. End nodes - represented by triangles

Drawn from left to right, a decision tree has only burst nodes (splitting paths) but no sink
nodes (converging paths). As a result, a manually drawn tree can grow very large and
become hard to draw fully by hand.
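A minimal sketch of the three node types described above as data structures; the class names and fields here are illustrative choices, not a standard API.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class EndNode:
    """Triangle: a terminal outcome with a payoff."""
    payoff: float

@dataclass
class ChanceNode:
    """Circle: the outcome is chosen by chance; branches carry probabilities."""
    branches: List[Tuple[float, object]] = field(default_factory=list)

@dataclass
class DecisionNode:
    """Square: the decision maker picks one of the labelled options."""
    options: Dict[str, object] = field(default_factory=dict)

# Only burst (splitting) nodes, never sink (converging) nodes: every child
# hangs under exactly one parent, so the structure is a tree.
tree = DecisionNode(options={
    "act": ChanceNode(branches=[(0.5, EndNode(100.0)), (0.5, EndNode(-20.0))]),
    "do nothing": EndNode(0.0),
})
```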

Analysis can take into account the decision maker's (e.g., the company's) preference or
utility function. For example, the basic interpretation in one such analysis is that the
company prefers B's risk and payoffs under realistic risk-preference coefficients (greater
than $400K; in that range of risk aversion, the company would need to model a third
strategy, "Neither A nor B").

Influence diagram


A decision tree can be represented more compactly as an influence diagram, focusing
attention on the issues and relationships between events.

Uses in teaching

Decision trees, influence diagrams, utility functions, and other decision analysis tools and
methods are taught to undergraduate students in schools of business, health economics,
and public health, and are examples of operations research or management science
methods.

Advantages
Amongst decision support tools, decision trees (and influence diagrams) have several
advantages:

Decision trees:

• Are simple to understand and interpret. People are able to understand decision
tree models after a brief explanation.
• Have value even with little hard data. Important insights can be generated
based on experts describing a situation (its alternatives, probabilities, and costs)
and their preferences for outcomes.
• Use a white box model. If a given result is provided by a model, the explanation
for the result is easily replicated by simple math.
• Can be combined with other decision techniques. The following example uses
Net Present Value calculations, PERT 3-point estimations (decision #1), and a
linear distribution of expected outcomes (decision #2).
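As an aside on the PERT three-point estimation named in the last bullet: the standard PERT (beta) estimate weights the most-likely value four times. The cash-flow figures below are illustrative, not taken from the example.

```python
# Standard PERT three-point estimate:
#   E = (optimistic + 4 * most_likely + pessimistic) / 6

def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical cash-flow estimates for one branch of a tree:
print(pert_estimate(100_000, 150_000, 260_000))  # 160000.0
```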
Example
Decision trees can be used to optimize an investment portfolio. The following example
shows a portfolio of 7 investment options (projects). The organization has $10,000,000
available for the total investment. Bold lines mark the best selection: projects 1, 3, 5, 6,
and 7, which will cost $9,750,000 and create a payoff of $16,175,000. All other
combinations would either exceed the budget or yield a lower payoff.[2]
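The selection just described can be sketched as a brute-force search over project subsets. The per-project costs and payoffs below are hypothetical (the source gives only the totals); they were chosen so the winning subset matches the $9,750,000 cost and $16,175,000 payoff stated above.

```python
from itertools import combinations

budget = 10_000_000
# project id -> (cost, payoff); hypothetical splits, not from the source
projects = {
    1: (1_500_000, 2_500_000),
    2: (4_000_000, 4_200_000),
    3: (2_000_000, 3_600_000),
    4: (5_000_000, 5_100_000),
    5: (1_750_000, 3_000_000),
    6: (2_500_000, 4_075_000),
    7: (2_000_000, 3_000_000),
}

def best_selection(projects, budget):
    """Exhaustively try every subset; keep the affordable one with the
    highest total payoff. Fine for 7 projects (2**7 = 128 subsets)."""
    best, best_payoff = (), 0
    ids = list(projects)
    for r in range(len(ids) + 1):
        for combo in combinations(ids, r):
            cost = sum(projects[i][0] for i in combo)
            payoff = sum(projects[i][1] for i in combo)
            if cost <= budget and payoff > best_payoff:
                best, best_payoff = combo, payoff
    return best, best_payoff

print(best_selection(projects, budget))  # ((1, 3, 5, 6, 7), 16175000)
```

With these figures, every other affordable combination yields a lower payoff, matching the claim in the text.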
A decision tree is a diagram that a decision maker can create to help select the best of
several alternative courses of action. The primary advantage of a decision tree is that it
assigns exact values to the outcomes of different actions, thus minimizing the ambiguity
of complicated decisions. Because they map out an applied, real-world logical process,
decision trees are particularly important to building "smart" computer applications like
expert systems. They are also used to help illustrate and assign monetary values to
alternative courses of action that management may take.

A decision tree represents a choice or an outcome with a fork, or branch. Several
branches may extend from a single point, representing several different alternative
choices or outcomes. There are two types of forks: (1) a decision fork is a branch where
the decision maker can choose the outcome; and (2) a chance or event fork is a branch
where the outcome is controlled by chance or external forces. By convention, a decision
fork is designated in the diagram by a square, while a chance fork is usually represented
by a circle. It is the latter category of data, when associated with a probability estimate,
that makes decision trees useful tools for quantitative analysis of business problems.

A decision tree emanates from a starting point at one end (usually at the top or on the left
side) through a series of branches, or nodes, until two or more final results are reached at
the opposite end. At least one of the branches leads to a decision fork or a chance fork.
The diagram may continue to branch as different options and chances are diagrammed.
Each branch is assigned an outcome and, if chance is involved, a probability of
occurrence.

EXAMPLE 1
A decision maker may determine that the chance of drilling an oil well that generates
$100,000 (outcome) is 25 percent (probability of occurrence). To solve the decision tree,
the decision maker begins at the right-hand side of the diagram and works toward the
initial decision branch on the left. The value of different outcomes is derived by
multiplying the probability by the expected outcome; in this example, the value would be
$25,000 (0.25 x $100,000). The values of all the outcomes emanating from a chance fork
are combined to arrive at a total value for the chance fork. By continuing to work
backwards through the chance and decision forks, a value can eventually be assigned to
each of the alternatives emanating from the initial decision fork.
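This backward pass can be sketched as a small recursive function: chance forks take the probability-weighted sum of their branches, and decision forks take the best branch. The tagged-tuple node encoding is an assumption for illustration, not a standard format.

```python
def rollback(node):
    """Work backwards ("roll back") through a decision tree, returning
    the value of the given node."""
    kind = node[0]
    if kind == "end":        # ("end", payoff)
        return node[1]
    if kind == "chance":     # ("chance", [(probability, child), ...])
        return sum(p * rollback(child) for p, child in node[1])
    if kind == "decision":   # ("decision", [child, ...])
        return max(rollback(child) for child in node[1])
    raise ValueError(f"unknown node kind: {kind}")

# The chance fork from the text: a 25% chance of $100,000. Assuming the
# remaining 75% pays $0, the fork's value matches the $25,000 in the text.
fork = ("chance", [(0.25, ("end", 100_000)), (0.75, ("end", 0))])
print(rollback(fork))  # 25000.0

# A decision fork then picks the better of taking that chance or not:
print(rollback(("decision", [fork, ("end", 0)])))  # 25000.0
```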

In the rudimentary example below, a company is trying to determine whether or not to
drill an oil well. If it decides not to drill the well, no money will be made or lost.
Therefore, the value of the decision not to drill can immediately be assigned a sum of
zero dollars.

If the decision is to drill, there are several potential outcomes, including (1) a 10 percent
chance of getting $300,000 in profits from the oil; (2) a 20 percent chance of extracting
$200,000 in profits; (3) a 10 percent chance of wresting $100,000 in profits from the
well; and (4) a 60 percent chance that the well will be dry and post a loss of $100,000 in
drilling costs. Figure 1 shows the decision tree for this data. Multiplying the probability
of each outcome by its dollar value, and then combining the results, assigns an expected
value to the decision to drill of $20,000 in profits. Thus, the profit-maximizing decision
would be to drill the well.

Figure 1: Simple Decision Tree for an Oil Drilling Investment

For the purposes of demonstration, suppose that the chance of hitting no oil was
increased from 60 percent to 70 percent, and the chance of gleaning $300,000 in profits
was reduced from 10 percent to zero. In that case, the dollar value of the decision to drill
would fall to -$20,000. A profit-maximizing decision maker would then elect to not drill
the well. The effect of this relatively small change in the probability calculation
underscores decision trees' dependence on accurate information, which often may not be
available.
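The arithmetic in both drilling scenarios can be checked with a probability-weighted sum; the figures below are taken directly from the example above.

```python
def expected_value(outcomes):
    """Probability-weighted sum over (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Original scenario: 10% -> $300K, 20% -> $200K, 10% -> $100K,
# 60% -> -$100K (dry well).
drill = [(0.10, 300_000), (0.20, 200_000), (0.10, 100_000), (0.60, -100_000)]
print(round(expected_value(drill)))    # 20000  -> drill

# Revised scenario: dry-well chance raised to 70%, $300K chance cut to zero.
revised = [(0.00, 300_000), (0.20, 200_000), (0.10, 100_000), (0.70, -100_000)]
print(round(expected_value(revised)))  # -20000 -> do not drill
```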

EXAMPLE 2
Unlike the hypothetical oil drilling decision, which essentially involved making one
choice based on the likelihood of several outcomes, many corporate decisions involve
making a more elaborate series of decisions. Consider a bank's decision of whether to
loan money to a consumer. A decision tree might be used in a few different ways to aid
this multistep process. In the simplest case, the tree might codify the bank's assessment
criteria for identifying qualified applicants. Figure 2 illustrates what such a criteria
process might look like in a decision tree. This simple tree requires the applicant to meet
certain standards (job stability, assets, cash flow) at each stage; it assumes that the
minimum standards are effective predictors of success.
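The staged screening that Figure 2 describes can be sketched as a chain of minimum-standard checks. The criteria names (job stability, assets, cash flow) come from the text; the threshold values are invented for illustration.

```python
def screen_applicant(years_employed, assets, monthly_cash_flow):
    """Each stage rejects any applicant below its minimum standard;
    only applicants clearing every stage are approved."""
    if years_employed < 2:          # job-stability stage (assumed cut-off)
        return "reject: job stability"
    if assets < 10_000:             # asset stage (assumed cut-off)
        return "reject: assets"
    if monthly_cash_flow < 1_500:   # cash-flow stage (assumed cut-off)
        return "reject: cash flow"
    return "approve"

print(screen_applicant(5, 25_000, 2_000))  # approve
print(screen_applicant(1, 25_000, 2_000))  # reject: job stability
```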

A more sophisticated decision tree, possibly implemented through a custom software
application, might use data from past borrowers (both reliable and unreliable) to predict
the overall likelihood that the applicant is worth the risk. Such a system would allow
greater flexibility to identify applicants who may not appear to be strong candidates on
one or two criteria, but are statistically likely to be reliable borrowers and, therefore,
profitable investments for the bank. Theoretically, this system could identify borrowers
who don't meet the traditional, minimum-standard criteria, but who are statistically less
likely to default than applicants who do meet all of the minimum requirements.

Figure 2: Simple Decision Tree for a Loan Application Process

An elaboration of the idea behind the oil drilling example, this decision
tree would assign probabilities of success or failure based on applicant data. For example,
when assessing job stability, it might assign a series of probabilities based on the actual
number of years the individual has been employed, rather than simply asserting a
minimum cut-off point. The tree might also interpret the number of years employed
differently based on other data about the applicant, e.g., holding a job for two years might
mean something different if the applicant is young and has only been in the work force
for a short time as opposed to an applicant who has been in the work force for decades
and has recently changed jobs. The particular criteria and probabilities assigned to these
pieces of information would, of course, be driven by a detailed analysis of data from
previous borrowers.
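The probability-based variant described above can be sketched as follows: each applicant gets an estimated repayment probability, and the bank lends when the expected profit is positive. The tenure-to-probability mapping and dollar amounts are illustrative assumptions, not figures from the source.

```python
def repayment_probability(years_employed):
    """Hypothetical mapping from job tenure to repayment probability,
    standing in for one estimated from past-borrower data."""
    if years_employed >= 10:
        return 0.97
    if years_employed >= 5:
        return 0.95
    if years_employed >= 2:
        return 0.90
    return 0.80

def expected_profit(years_employed, interest_income, loss_if_default):
    """Expected value of the loan: repay with probability p, default
    otherwise (the same weighted sum used in the oil example)."""
    p = repayment_probability(years_employed)
    return p * interest_income - (1 - p) * loss_if_default

# Lend when the expected profit is positive:
print(round(expected_profit(3, 2_000, 15_000)))  # 300   -> lend
print(round(expected_profit(1, 2_000, 15_000)))  # -1400 -> reject
```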

FURTHER READING:

Decision Tree, Reference for Business:
http://www.referenceforbusiness.com/encyclopedia/Cos-Des/Decision-Tree.html
