
Analytical Decision Modeling for Business

Dr. Mohammed Seid, Addis Ababa University, Department of Management


Slide 1

Learning Objectives

Evolution
Structuring the decision problem and decision trees
Types of decision-making environments:
  Decision making under uncertainty (probabilities are not known)
  Decision making under risk (probabilities are known)
Expected Value of Perfect Information
Decision Analysis with Sample Information
Developing a Decision Strategy
Expected Value of Sample Information
Slide 2

Modeling Process

The modeling process requires the following activities:
Formulation: defining the decision variables, the objective function, and the constraints.
Solution: determining the optimal values of the decision variables and the objective function, for example with computer software.
Interpretation: interpreting the results.

Slide 3

Types Of Decision Making Environments

Type 1: Decision Making under Certainty. The decision maker knows for sure (that is, with certainty) the outcome or consequence of every decision alternative. The working method is as follows: select the alternative with the largest profit or smallest cost by using an appropriate mathematical model.

Slide 4

Types Of Decision Making Environments

Type 2: Decision Making under Uncertainty. The decision maker has no information at all about the likelihood of the various outcomes or states of nature. Decisions are based on experience and judgment. There is no single best decision criterion.

Slide 5

Types Of Decision Making Environments


Type 3: Decision Making under Risk. The decision maker has some knowledge of the probability of occurrence of each outcome or state of nature.

Slide 6

Elements of Decision Analysis

Although decision making under uncertainty occurs in a wide variety of contexts, all problems have three common elements:
1. the set of decisions (or strategies/actions) available to the decision maker,
2. the set of possible outcomes (states of nature) and the probabilities of these outcomes, and
3. a value model that prescribes monetary values for the various decision-outcome combinations.
Given these elements, the optimal decision is identified by applying an optimality criterion.

The monetary values of the decision-outcome combinations are the payoffs.
Slide 7

List Possible Actions or Events

Two Methods of Listing

Payoff Table

Decision Tree

Slide 8

Payoff Table

Consider a food vendor determining whether to sell soft drinks or hot dogs.
                      Course of Action (Aj)
Event (Ei)            Sell Soft Drinks (A1)    Sell Hot Dogs (A2)
Cool Weather (E1)     x11 = $50                x12 = $100
Warm Weather (E2)     x21 = $200               x22 = $125
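As a small illustrative sketch (not part of the original slides), the same payoff table can be held in code as a nested mapping indexed by event and then by course of action; the variable names below are arbitrary:

```python
# Illustrative sketch: the food-vendor payoff table as a nested dictionary,
# indexed first by event (state of nature) and then by course of action.
payoff_table = {
    "Cool Weather (E1)": {"Sell Soft Drinks (A1)": 50,  "Sell Hot Dogs (A2)": 100},
    "Warm Weather (E2)": {"Sell Soft Drinks (A1)": 200, "Sell Hot Dogs (A2)": 125},
}
print(payoff_table["Warm Weather (E2)"]["Sell Soft Drinks (A1)"])   # 200
```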

Slide 9

Decision Trees

A decision tree is a chronological representation of the decision problem. Composed of nodes (circles, squares, and triangles) and branches (lines).

Slide 10

Decision Trees Continued

Time proceeds from left to right, so any branches leading into a node (from the left) have already occurred. Probabilities are listed on probability branches; these probabilities are conditional on the events that have already been observed (those to the left). Monetary values are shown to the right of the end nodes. EMVs are calculated through a folding-back process (see the sketch below):
1. At each probability node, calculate an EMV as the sum of the products of the monetary values and their probabilities.
2. At each decision node, take the maximum of the EMVs to identify the optimal decision.
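A minimal folding-back sketch (not from the slides), assuming a single decision node whose action branches each lead to a probability node; it uses the food-vendor payoffs with P(cool) = P(warm) = 0.5 from a later slide, and the function name fold_back is arbitrary:

```python
# Illustrative sketch: folding back a one-stage decision tree.
# Each action branch leads to a probability node given as a list of (probability, payoff).

def fold_back(decision_node):
    """Return the EMV of each action and the optimal action at the decision node."""
    emvs = {}
    for action, outcomes in decision_node.items():
        # Probability node: EMV = sum of products of probabilities and monetary values
        emvs[action] = sum(p * value for p, value in outcomes)
    # Decision node: take the maximum EMV to identify the optimal decision
    best_action = max(emvs, key=emvs.get)
    return emvs, best_action

tree = {
    "Sell Soft Drinks": [(0.5, 50), (0.5, 200)],   # cool weather, warm weather
    "Sell Hot Dogs":    [(0.5, 100), (0.5, 125)],
}
emvs, best = fold_back(tree)
print(emvs)   # {'Sell Soft Drinks': 125.0, 'Sell Hot Dogs': 112.5}
print(best)   # Sell Soft Drinks
```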
Slide 11

Decision Tree: Example

Food Vendor Profit Tree Diagram


End-node payoffs: x11 = $50, x12 = $100, x21 = $200, x22 = $125
Slide 12

Payoff Table: Do Some Actions Dominate?

                               Course of Action (Aj): Production Process
Event (Ei): Level of Demand        A        B        C        D
Low                               70       80      100      100
Moderate                         120      120      125      120
High                             200      180      160      150
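A quick illustrative check (not from the slides) for dominated actions in this table: one process dominates another if it pays at least as much in every demand level and strictly more in at least one. The function name is arbitrary.

```python
# Illustrative sketch: finding dominated production processes in the payoff table.
payoffs = {                      # payoffs for Low, Moderate, High demand
    "A": [70, 120, 200],
    "B": [80, 120, 180],
    "C": [100, 125, 160],
    "D": [100, 120, 150],
}

def dominated_actions(table):
    dominated = set()
    for a, pa in table.items():
        for b, pb in table.items():
            # b dominates a if it is >= in every state and > in at least one
            if a != b and all(x >= y for x, y in zip(pb, pa)) and any(x > y for x, y in zip(pb, pa)):
                dominated.add(a)
    return dominated

print(dominated_actions(payoffs))   # {'D'}  (process C dominates process D)
```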

Slide 13

Decision Making Under Uncertainty

If the decision maker does not know with certainty which state of nature will occur, then he/she is said to be making a decision under uncertainty. The five commonly used criteria for decision making under uncertainty are:
1. the optimistic approach (maximax)
2. the conservative approach (maximin)
3. the minimax regret approach (Savage criterion)
4. the equally likely approach (Laplace criterion)
5. the criterion of realism (Hurwicz criterion)

Slide 14

Optimistic Approach

The decision with the largest possible payoff is chosen. If the payoff table were in terms of costs, the decision with the lowest cost would be chosen.

Slide 15

Conservative Approach

The minimum possible payoff is maximized (maximin). If the table is in terms of costs, the maximum possible cost is minimized (minimax).

Slide 16

Minimax Regret Approach

The minimax regret approach requires the construction of a regret table, also called an opportunity loss table. For each state of nature, the regret of a decision is the difference between its payoff and the best payoff for that state. Using this regret table, choose the decision whose maximum regret is the minimum.

Slide 17

Example: Marketing Strategy


Consider the following problem with two decision alternatives (d1 and d2) and two states of nature, S1 (Market Receptive) and S2 (Market Unfavorable), with the following payoff table representing profits (in $1000s):

                  States of Nature
Decisions         s1        s2
d1                20         6
d2                25         3

Slide 18

Example: Optimistic Approach (Maximax Criterion)


Choose the decision that has the largest single value in the payoff table.

Decision     Maximum Payoff
d1                 20
d2                 25      <- maximum; choose d2

Slide 19

Example: Conservative Approach, Criterion of Pessimism (Maximin Criterion)


List the minimum payoff for each decision. Choose the decision with the maximum of these minimum payoffs.

Decision     Minimum Payoff
d1                  6      <- maximum; choose d1
d2                  3

Slide 20

Example: Minimax Regret Approach (Savage Criterion)


Compute the regret table by subtracting each payoff in a column from the largest payoff in that column (i.e., for each state of nature). The resulting regret table is:

Decision      s1      s2      Maximum Regret
d1             5       0             5
d2             0       3             3      <- minimum

Then select the decision with the minimum of the maximum regrets: choose d2.
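A small sketch (not from the slides) that builds this regret table and applies the minimax regret criterion; variable names are illustrative only.

```python
# Illustrative sketch: regret (opportunity loss) table and minimax regret criterion
# for the marketing example (payoffs in $1000s).
payoffs = {"d1": [20, 6], "d2": [25, 3]}                 # columns: s1, s2

best_per_state = [max(col) for col in zip(*payoffs.values())]        # [25, 6]
regret = {d: [best - v for best, v in zip(best_per_state, row)]
          for d, row in payoffs.items()}                 # d1: [5, 0], d2: [0, 3]

max_regret = {d: max(r) for d, r in regret.items()}      # d1: 5, d2: 3
print(min(max_regret, key=max_regret.get))               # d2
```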

Slide 21

Example: Equally Likely (Laplace) Criterion

The equally likely (Laplace) criterion finds the decision alternative with the highest average payoff.

Average for d1 = (20 + 6)/2 = 13
Average for d2 = (25 + 3)/2 = 14
Thus, d2 is selected.

Slide 22

Example: Criterion of Realism

Often called the weighted average criterion, the criterion of realism (Hurwicz) is a compromise between an optimistic and a pessimistic decision. With coefficient of realism α (0 ≤ α ≤ 1):

h = α (maximum payoff) + (1 - α) (minimum payoff)

Slide 23

Example: Criterion of Realism


Select that action as the optimum action which corresponds to the maximum of the above determined value of h.

In our example, let α = 0.8:
Payoff for d1 = 0.8(20) + 0.2(6) = 17.2
Payoff for d2 = 0.8(25) + 0.2(3) = 20.6
Thus, select d2.
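For comparison, a compact sketch (not from the slides) that applies all five uncertainty criteria to the same payoff table, with α = 0.8 as in the example above:

```python
# Illustrative sketch: the five decision-under-uncertainty criteria applied to
# the marketing example (profits in $1000s), with alpha = 0.8.
payoffs = {"d1": [20, 6], "d2": [25, 3]}
alpha = 0.8

maximax = max(payoffs, key=lambda d: max(payoffs[d]))                    # d2
maximin = max(payoffs, key=lambda d: min(payoffs[d]))                    # d1
laplace = max(payoffs, key=lambda d: sum(payoffs[d]) / len(payoffs[d]))  # d2
hurwicz = max(payoffs, key=lambda d: alpha * max(payoffs[d]) + (1 - alpha) * min(payoffs[d]))  # d2

best_per_state = [max(col) for col in zip(*payoffs.values())]
minimax_regret = min(payoffs,
                     key=lambda d: max(b - v for b, v in zip(best_per_state, payoffs[d])))      # d2

print(maximax, maximin, minimax_regret, laplace, hurwicz)   # d2 d1 d2 d2 d2
```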

Slide 24

DECISION MAKING UNDER RISK


This is the probabilistic decision situation: the chances of occurrence (probabilities) of each state of nature are known or can be estimated.

Slide 25

Decision Criteria

Expected monetary value (EMV): the expected profit from taking action Aj.
Expected opportunity loss (EOL): the expected loss from taking action Aj.
Expected value of perfect information (EVPI): the expected opportunity loss of the best decision.

Slide 26

Decision Making with Probabilities

Expected Value Approach: if probabilistic information regarding the states of nature is available, compute the expected monetary value (EMV) of each alternative as the sum of the products of the payoff under each state of nature and its probability. The alternative with the best expected return is chosen.

Slide 27

Expected Value of a Decision Alternative

The expected value (EV) of decision alternative di is defined as:


EV(d_i) = \sum_{j=1}^{N} P(s_j) V_{ij}

where:

N = the number of states of nature
P(sj) = the probability of state of nature sj
Vij = the payoff corresponding to decision alternative di and state of nature sj
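A minimal sketch (not from the slides) of this formula in Python, using the marketing example's payoffs and the probabilities given on the next slide (0.75 and 0.25):

```python
# Illustrative sketch: EV(d_i) = sum over j of P(s_j) * V_ij.

def expected_value(probabilities, payoffs):
    """probabilities: P(s_j) for each state; payoffs: V_ij for one alternative d_i."""
    return sum(p * v for p, v in zip(probabilities, payoffs))

P = [0.75, 0.25]                    # P(s1), P(s2)
print(expected_value(P, [20, 6]))   # EV(d1) = 16.5
print(expected_value(P, [25, 3]))   # EV(d2) = 19.5
```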

Slide 28

Example: Marketing Strategy

Expected Value Approach: refer to the previous problem. Assume the probability of the market being receptive is known to be 0.75. Use the expected monetary value criterion to determine the optimal decision.

                     State of Nature
Decisions      S1 (Mkt Receptive)   S2 (Mkt Unfavorable)     EMV
d1                     20                    6               16.5
d2                     25                    3               19.5   <- highest EMV: better alternative
Probabilities         0.75                  0.25

Slide 29

Expected Opportunity Loss Criterion (EOL)

The expected opportunity loss (EOL) is the expected pay-off lost due to failure to adopt the best action. For each state of nature, the opportunity loss is the difference between the highest profit for that state and the actual profit of the chosen action.

Opportunity Loss Table

                      Payoffs                                    Opportunity Losses
                S1 (Mkt        S2 (Mkt            S1 (Mkt        S2 (Mkt
Decisions       Receptive)     Unfavorable)       Receptive)     Unfavorable)      EOL
Profit of
Optimal Action      25              6
d1                  20              6                  5              0            3.75
d2                  25              3                  0              3            0.75
Probabilities      0.75            0.25               0.75           0.25

Slide 30

Expected Opportunity Loss Criterion (EOL)


For the food vendor example with P(cool) = P(warm) = 0.5:

Opportunity Loss Table

                      Payoffs                          Opportunity Losses
                  Cool      Warm                   Cool      Warm
Decision          Weather   Weather     EMV        Weather   Weather     EOL
Optimal Action      100       200
Sell Soft Drinks     50       200       125          50         0         25
Sell Hot Dogs       100       125       112.5         0        75         37.5
Probabilities        0.5       0.5                    0.5       0.5
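A short sketch (not from the slides) of this EOL calculation, assuming P(cool) = P(warm) = 0.5:

```python
# Illustrative sketch: expected opportunity loss (EOL) for the food-vendor example.
P = [0.5, 0.5]                                   # P(cool), P(warm)
payoffs = {"Sell Soft Drinks": [50, 200], "Sell Hot Dogs": [100, 125]}

best_per_state = [max(col) for col in zip(*payoffs.values())]       # [100, 200]
eol = {a: sum(p * (best - v) for p, best, v in zip(P, best_per_state, row))
       for a, row in payoffs.items()}

print(eol)                      # {'Sell Soft Drinks': 25.0, 'Sell Hot Dogs': 37.5}
print(min(eol, key=eol.get))    # Sell Soft Drinks (minimum EOL = best action)
```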

Slide 31

Expected Value of Perfect Information (EVPI) and Expected Pay-off of Perfect Information
The expected value of perfect information (EVPI) represents the maximum amount of money a decision maker can afford to pay for perfect information. It provides an upper bound on the expected value of any sample or survey information.

Slide 32

Expected Value of Perfect Information (EVPI) & Expected Pay-off of Perfect Information (EPPI)
The expected pay-off of perfect information (EPPI) represents the highest expected profit attainable in the presence of a perfect predictor: if the decision maker had perfect information, he or she would pick, for each state of nature, the decision alternative with the maximum payoff.

EVPI = EPPI - EMV*   or   EVPI = EOL*   (where EMV* and EOL* refer to the optimal decision)
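A brief sketch (not from the slides) of EVPI = EPPI - EMV* for the marketing example, with P(receptive) = 0.75:

```python
# Illustrative sketch: EVPI = EPPI - EMV* for the marketing example (in $1000s).
P = [0.75, 0.25]
payoffs = {"d1": [20, 6], "d2": [25, 3]}

emv = {d: sum(p * v for p, v in zip(P, row)) for d, row in payoffs.items()}
emv_star = max(emv.values())                                   # 19.5 (decision d2)

best_per_state = [max(col) for col in zip(*payoffs.values())]  # [25, 6]
eppi = sum(p * b for p, b in zip(P, best_per_state))           # 0.75*25 + 0.25*6 = 20.25

print(eppi - emv_star)                                         # EVPI = 0.75, i.e. $750
```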

Slide 33

Example: Marketing Strategy

Expected Value of Perfect Information: calculate the expected value of the best action for each state of nature (EPPI) and subtract the EMV of the optimal decision.
                     State of Nature
Decisions      S1 (Mkt Receptive)   S2 (Mkt Unfavorable)     EMV
d1                     20                    6               16.5
d2                     25                    3               19.5
Probabilities         0.75                  0.25

Best action per state        d2          d1
Pay-off of best action       25           6
Probabilities               0.75         0.25

EPPI = 0.75(25) + 0.25(6) = 20.25
EMV* = 19.5
EVPI = EPPI - EMV* = 20.25 - 19.5 = 0.75

In dollars: EVPI = 0.75(25,000) + 0.25(6,000) - 19,500 = $750

Equivalently, EVPI equals the EOL of the best decision (EVPI = EOL*):

Opportunity Loss Table

                      Payoffs                                    Opportunity Losses
                S1 (Mkt        S2 (Mkt            S1 (Mkt        S2 (Mkt
Decisions       Receptive)     Unfavorable)       Receptive)     Unfavorable)      EOL
Profit of
Optimal Action      25              6
d1                  20              6                  5              0            3.75
d2                  25              3                  0              3            0.75
Probabilities      0.75            0.25               0.75           0.25

Slide 34

Decision Analysis With Sample Information

Knowledge of sample or survey information can be used to revise the probability estimates for the states of nature. Employing Bayes' theorem, prior probabilities are revised to obtain posterior probabilities (see the sketch below).
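A minimal sketch (not from the slides) of this revision step: given prior probabilities P(s_j) and the conditional probabilities P(indicator | s_j) of the observed sample indicator, the posteriors P(s_j | indicator) follow from Bayes' theorem. The function name is arbitrary, and the numbers in the usage line anticipate the example on slide 38.

```python
# Illustrative sketch: revising prior probabilities with Bayes' theorem.

def posterior(priors, likelihoods):
    """priors: P(s_j); likelihoods: P(indicator | s_j) for the observed indicator."""
    joint = [p * l for p, l in zip(priors, likelihoods)]   # P(s_j and indicator)
    total = sum(joint)                                     # P(indicator)
    return [j / total for j in joint]                      # P(s_j | indicator)

print(posterior([0.75, 0.25], [0.90, 0.15]))   # approx. [0.947, 0.053]
```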

Slide 35

Expected Value of Sample Information

The expected value of sample information (EVSI) is the additional expected profit possible through knowledge of the sample or survey information.

Slide 36

Efficiency of Sample Information

The efficiency of sample information is commonly defined as the ratio EVSI/EVPI. Because the EVPI provides an upper bound for the EVSI, efficiency is always a number between 0 and 1.

Slide 37

Refer to the Marketing Strategy Example

It is known from past experience that of all the cases when the market was receptive, a research company predicted it in 90 percent of the cases. (In the other 10 percent, they predicted an unfavorable market). Also, of all the cases when the market proved to be unfavorable, the research company predicted it correctly in 85 percent of the cases. (In the other 15 percent of the cases, they predicted it incorrectly.) Answer the following questions based on the above information.

Slide 38

Example: Marketing Strategy


1. Draw a complete probability tree.
2. Find the posterior probabilities of all states of nature.
3. Using the posterior probabilities, which plan would you recommend?
4. How much should one be willing to pay (maximum) for the research survey? That is, compute the expected value of sample information (EVSI).
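As an illustrative sketch (not part of the slides), questions 2-4 can be checked computationally with the reliability figures given on the previous slide (90% correct when receptive, 85% correct when unfavorable); all names below are illustrative.

```python
# Illustrative sketch: posterior probabilities and EVSI for the marketing example.
priors = {"S1": 0.75, "S2": 0.25}                        # receptive, unfavorable
likelihood = {"predict_receptive":   {"S1": 0.90, "S2": 0.15},
              "predict_unfavorable": {"S1": 0.10, "S2": 0.85}}
payoffs = {"d1": {"S1": 20, "S2": 6}, "d2": {"S1": 25, "S2": 3}}   # in $1000s

emv_with_info = 0.0
for prediction, lik in likelihood.items():
    joint = {s: priors[s] * lik[s] for s in priors}      # P(state and prediction)
    p_pred = sum(joint.values())                         # P(prediction)
    post = {s: joint[s] / p_pred for s in priors}        # P(state | prediction)
    best_emv = max(sum(post[s] * payoffs[d][s] for s in priors) for d in payoffs)
    emv_with_info += p_pred * best_emv
    print(prediction, {s: round(p, 3) for s, p in post.items()})

emv_without_info = max(sum(priors[s] * payoffs[d][s] for s in priors) for d in payoffs)  # 19.5
evsi = emv_with_info - emv_without_info
print(round(evsi, 4))   # about 0.2625, i.e. roughly $262.50 is the most to pay for the survey
```

Under the common definition efficiency = EVSI/EVPI, this survey's efficiency would be about 0.2625/0.75 = 0.35.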

Slide 39
