MM5009
Summary Assignment
outcomes may be very large or even infinite. For example, we might approximate a market share
distribution with just three outcomes: high, medium and low. A number of methods for making
this sort of approximation have been suggested, and we will discuss the Extended Pearson-Tukey
(EP-T) approximation here. This was proposed by Keefer and Bodily,4 who found it to be a very
good approximation to a wide range of continuous distributions. The method is based on earlier
work by Pearson and Tukey5 and requires three estimates to be made by the decision maker:
(i) The value in the distribution which has a 95% chance of being exceeded. This value is
allocated a probability of 0.185.
(ii) The value in the distribution which has a 50% chance of being exceeded. This value is
allocated a probability of 0.63.
(iii) The value in the distribution which has only a 5% chance of being exceeded. This value is
also allocated a probability of 0.185.
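As a quick sketch, the EP-T three-point approximation can be coded directly; the market-share percentile values below are invented purely for illustration:

```python
def ep_t_approximation(p5, p50, p95):
    """Extended Pearson-Tukey three-point approximation.

    p5:  value with a 95% chance of being exceeded (5th percentile)
    p50: value with a 50% chance of being exceeded (median)
    p95: value with only a 5% chance of being exceeded (95th percentile)
    Returns (value, probability) pairs weighted 0.185, 0.63, 0.185.
    """
    return [(p5, 0.185), (p50, 0.63), (p95, 0.185)]

def expected_value(points):
    """Expected value of a discrete (value, probability) distribution."""
    return sum(v * p for v, p in points)

# Hypothetical market-share percentiles (%): 10 (low), 25 (median), 45 (high)
points = ep_t_approximation(10.0, 25.0, 45.0)
print(expected_value(points))
```

The three weights sum to one, so the discretized distribution is itself a valid probability distribution.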
Practical applications of decision trees
Ulvila6 used the technique to help the US Postal Service decide whether to continue with
the nine-digit zip code for business users. The analysis was designed to compare the monetary
returns which might result from the use of various types of automatic sorting equipment, either
with or without the code.
The EP-T approximation was used to represent probability distributions of the extent to which
the code would be used, and the savings which would result from the various options in the tree.
The author reported that the approach helped the decision makers to think creatively about the
problem and to generate options.
Cohan et al.7 used decision trees to analyze decisions relating to the use of fire in forest
management. However, if a decision is made to postpone a fire, this itself will result in costs
caused by the delay in meeting the forester's objectives. The authors reported that the use of
decision analysis helped forest managers to understand the implications and relative importance
of the key uncertainties involved in the decisions. Moreover, it provided clear and concise
documentation of the rationale underlying important decisions. They argued that this can be
invaluable in stimulating discussion and communication within a forest management
organization. As we have seen, decision trees are the major analytical structures underlying
the application of decision analysis to problems involving uncertainty.
Assessment of decision structure
Consider the following real-life decision problem, which we would like you to attempt to structure.
It is really a matter of the decision analyst's judgment as to whether the elicited tree is a fair
representation of the decision maker's decision problem. Once a structure is agreed then the
computation of expected utility is fairly straightforward. Structuring is therefore a major problem
in decision analysis, for if the structuring is wrong then it is a necessary consequence that
assessments of utilities and probabilities may be inappropriate and the expected utility
computations may be invalid.
Stages 1 and 2 of the decision process are iterative: the structure of the decision problem emerges
from discussions between the decision maker and the analyst. Once a structure for the decision
representation has been agreed and probabilities and utilities are elicited (stage 3) the expected
utility for the various acts under consideration can be computed (stage 4) and the act which has
the maximal expected utility is chosen (stage 5).
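Stages (4) and (5) reduce to an expected-utility calculation followed by a maximization. A minimal sketch, in which the acts, state probabilities and utilities are hypothetical:

```python
def choose_act(utilities, probs):
    """Stages (4)-(5): expected utility per act, then pick the maximum.

    utilities: {act: [utility in each state]}
    probs:     state probabilities (must sum to 1)
    Returns the chosen act and the expected utility of every act.
    """
    expected = {act: sum(p * u for p, u in zip(probs, us))
                for act, us in utilities.items()}
    best = max(expected, key=expected.get)
    return best, expected

# Hypothetical two-act, two-state problem
best, expected = choose_act({'launch': [0.9, 0.2], 'delay': [0.6, 0.5]},
                            [0.7, 0.3])
print(best, expected)
```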
What determines the decision analyst's provisional representation of the decision problem?
Generally, it will be based upon past experience with similar classes of decision problems and, to
a significant extent, intuition.
Many decision makers report that they feel the process of problem representation is perhaps
more important than the subsequent computations. However, some studies have illustrated that
the decision maker's estimates, judgments and choices are affected by the way knowledge is
elicited.
This research has direct relevance for the decision analyst's attempts at structuring. In one study,
Fischhoff et al.17 investigated estimation of failure probabilities in decision problem
representations called fault trees. These fault trees are essentially similar to decision trees, with
the exception that only events, rather than acts and events, are represented.
However, focusing subjects' attention on what was missing only partially improved their
awareness. Fischhoff labeled this insensitivity to the incompleteness of the fault tree 'out of
sight, out of mind'. The finding was confirmed with technical experts and garage mechanics.
Another finding from the study was that the perceived importance of a particular sub-event or
branch of the fault tree was increased by presenting it in pieces (i.e. as two separate branches).
The implications of this result are far reaching. Decision trees constructed early in the
analyst/decision maker interaction may be incomplete representations of the decision problem
facing the decision maker.
Eliciting decision tree representations
What methods have been developed to help elicit decision tree representations from decision
makers? One major method, much favored by some decision analysts, is that of influence
diagrams18 which are designed to summarize the dependencies that are seen to exist among
events and acts within a decision.
The advantage of starting with influence diagrams is that their graphic representation is more
appealing to the intuition of decision makers who may be unfamiliar with decision technologies.
Decision trees, because of their strict temporal ordering of acts and events, need completely
respecifying when additional acts and events are inserted into preliminary representations.
We shall illustrate the applicability of influence diagrams through a worked example. First,
however, we will present the basic concepts and representations underlying the approach.
As with the decision tree, event nodes are represented by circles and decision nodes by squares.
Arrowed lines between nodes indicate the influence of one node on another.
One step-by-step procedure for turning an influence diagram into a decision tree is as follows:
(1) Identify a node with no arrows pointing into it (since the diagram contains no loops, at least
one such node must exist).
(2) If there is a choice between a decision node and an event node, choose the decision node.
(3) Place the node at the beginning of the tree and remove the node from the influence diagram.
(4) For the now-reduced diagram, choose another node with no arrows pointing into it. If there is
a choice a decision node should be chosen.
(5) Place this node next in the tree and remove it from the influence diagram.
(6) Repeat the above procedure until all the nodes have been removed
from the influence diagram.
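This procedure is essentially a topological sort of the influence diagram that breaks ties in favor of decision nodes. A minimal sketch, using a hypothetical three-node diagram:

```python
def influence_to_tree_order(nodes, arrows):
    """Order influence-diagram nodes for a decision tree.

    nodes:  dict mapping node name -> 'decision' or 'event'
    arrows: set of (from_node, to_node) influence arcs
    Repeatedly removes a node with no incoming arrows,
    preferring decision nodes when there is a choice.
    """
    remaining = dict(nodes)
    arcs = set(arrows)
    order = []
    while remaining:
        # nodes with no arrows pointing into them
        roots = [n for n in remaining if not any(t == n for _, t in arcs)]
        # prefer a decision node if one is available
        decisions = [n for n in roots if remaining[n] == 'decision']
        chosen = (decisions or roots)[0]
        order.append(chosen)
        del remaining[chosen]
        arcs = {(f, t) for f, t in arcs if f != chosen}
    return order

# Hypothetical diagram: decision D influences event A; A influences event B
print(influence_to_tree_order(
    {'D': 'decision', 'A': 'event', 'B': 'event'},
    {('D', 'A'), ('A', 'B')}))  # ['D', 'A', 'B']
```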
Very complex decision trees can be represented as one-page influence diagrams. However, the
use of influence diagrams to construct decision trees where subsequent events and acts depend
on the initial decision (i.e. where the resulting decision tree is asymmetric) is more problematic.
In these instances, the influence diagram approach to decision tree structuring can be used as a
guide only.
Bayes' theorem
In Bayes' theorem an initial probability estimate is known as a prior probability. When Bayes'
theorem is used to modify a prior probability in the light of new information, the result is known
as a posterior probability.
The steps in the process which we have just applied are summarized below:
(1) Construct a tree with branches representing all the possible events which can occur and write
the prior probabilities for these events on the branches.
(2) Extend the tree by attaching to each branch a new branch which represents the new
information which you have obtained. On each branch write the conditional probability of
obtaining this information given the circumstance represented by the preceding branch.
(3) Obtain the joint probabilities by multiplying each prior probability by the conditional
probability which follows it on the tree.
(4) Sum the joint probabilities.
(5) Divide the appropriate joint probability by the sum of the joint
probabilities to obtain the required posterior probability.
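The five steps above can be sketched as a short function; the prior and conditional probabilities in the example are hypothetical:

```python
def posterior(priors, likelihoods):
    """Bayes' theorem via the tree steps above.

    priors:      {event: prior probability}                  -- steps (1)-(2)
    likelihoods: {event: P(new information | event)}
    Returns posterior probabilities {event: P(event | information)}.
    """
    # steps (3)-(4): joint probabilities and their sum
    joints = {e: priors[e] * likelihoods[e] for e in priors}
    total = sum(joints.values())
    # step (5): divide each joint probability by the sum
    return {e: j / total for e, j in joints.items()}

# Hypothetical example: prior 0.7/0.3; the indication is seen with
# probability 0.8 under 'high' and 0.4 under 'low'
post = posterior({'high': 0.7, 'low': 0.3}, {'high': 0.8, 'low': 0.4})
print(post)
```

The posteriors necessarily sum to one, since each joint probability is divided by their common total.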
The effect of new information on the revision of probability judgments
It is interesting to explore the relative influence which prior probabilities and new information
have on the resulting posterior probabilities.
Applying Bayes' theorem to a decision problem
Assessing the value of new information
New information can remove or reduce the uncertainty involved in a decision and thereby
increase the expected payoff. However, in many circumstances it may be expensive to obtain
information since it might involve, for example, the use of scientific tests, the engagement of the
services of a consultant or the need to carry out a market research survey.
If this is the case, then the question arises as to whether it is worth obtaining the information in
the first place or, if there are several potential sources of information, which one is to be
preferred (sometimes the process of determining whether it is worth obtaining new information
is referred to as preposterior analysis). To show how this question can be answered, we will first
consider the case where the information is perfectly reliable (i.e. it is certain to give a correct
indication) and then look at the much more common situation where the reliability of the
information is imperfect.
The expected value of perfect information
In many decision situations it is not possible to obtain perfectly reliable information, but
nevertheless the concept of the expected value of perfect information (EVPI) can still be useful.
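EVPI is computed as the expected payoff when the true state is learned before acting, minus the best expected payoff obtainable without the information. A sketch with an invented two-state payoff table:

```python
def evpi(probs, payoffs):
    """Expected value of perfect information.

    probs:   list of state probabilities
    payoffs: {action: [payoff in each state]}
    EVPI = E[best payoff per state] - best expected payoff without info.
    """
    # best act chosen using only the prior probabilities
    best_without = max(
        sum(p * v for p, v in zip(probs, vals))
        for vals in payoffs.values())
    # with perfect information, the best act is picked in each state
    with_perfect = sum(
        p * max(vals[i] for vals in payoffs.values())
        for i, p in enumerate(probs))
    return with_perfect - best_without

# Hypothetical two-state problem: states 'boom' (0.6) and 'slump' (0.4)
print(evpi([0.6, 0.4], {'expand': [100, -20], 'hold': [40, 30]}))
```

Since perfect information can never make the decision maker worse off, the function's result is always non-negative.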
The expected value of imperfect information
A summary of the main stages in the above analysis is given below:
(1) Determine the course of action which would be chosen using only the prior probabilities and
record the expected payoff of this course of action;
(2) Identify the possible indications which the new information can give;
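The full preposterior calculation can be sketched as follows: for each possible indication, compute its probability and the posterior state probabilities via Bayes' theorem, find the best act under each posterior, and compare the resulting expected payoff with the no-information case. All figures below are hypothetical:

```python
def evii(priors, likelihoods, payoffs):
    """Expected value of imperfect information (a sketch).

    priors:      {state: prior probability}
    likelihoods: {indication: {state: P(indication | state)}}
    payoffs:     {action: {state: payoff}}
    """
    def best_ev(probs):
        # expected payoff of the best act under the given probabilities
        return max(
            sum(probs[s] * payoffs[a][s] for s in probs)
            for a in payoffs)

    without_info = best_ev(priors)
    with_info = 0.0
    for ind, like in likelihoods.items():
        # probability of this indication, then posterior state probabilities
        p_ind = sum(priors[s] * like[s] for s in priors)
        post = {s: priors[s] * like[s] / p_ind for s in priors}
        with_info += p_ind * best_ev(post)
    return with_info - without_info

# Hypothetical survey for a two-state problem: the survey indication is
# 'favourable' with probability 0.8 in a boom and 0.3 in a slump
print(evii(
    {'boom': 0.6, 'slump': 0.4},
    {'favourable': {'boom': 0.8, 'slump': 0.3},
     'unfavourable': {'boom': 0.2, 'slump': 0.7}},
    {'expand': {'boom': 100, 'slump': -20},
     'hold': {'boom': 40, 'slump': 30}}))
```

Because the information is imperfect, the result can never exceed the EVPI for the same problem.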
The analysis acted as an initial screening process, leading to the early rejection of a number of
projects and enabling the corporation to focus only on those which had a good potential for
generating economic gains. Because the managerial and other inputs to the model were 'soft',
Schell stressed the importance of sensitivity analysis. This involved the calculation of the
expected value of perfect information and also the variation of the inputs into the model.
In this chapter we have discussed the role that new information can play in revising the
judgments of a decision maker. We argued that Bayes' theorem shows the decision maker how
his or her judgments should be modified in the light of new information, and we showed that this
revision will depend both upon the vagueness of the prior judgment and the reliability of the
new information. Of course, receiving information is often a sequential process. Your prior
probability will reflect the information you have received up to the point in time when you
make your initial probability assessment. As each new instalment of information arrives, you
may continue to revise your probability. The posterior probability you had at the end of last week
may be treated as the prior probability this week, and be revised in the light of this week's
information.