8/11/2006
More Examples of Markov Chains
The President of the United States tells person A his or her intention to run or not to run in the next election. Then A relays the news to B, who in turn relays the message to C, and so forth, always to some new person. We assume that there is a probability $a$ that a person will change the answer from yes to no when transmitting it to the next person, and a probability $b$ that he or she will change it from no to yes. We choose as states the message, either yes or no. The transition matrix is then

\[
P = \bordermatrix{ & \text{yes} & \text{no} \cr \text{yes} & 1-a & a \cr \text{no} & b & 1-b } .
\]
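Not part of the slides, but a quick way to check the arithmetic is to encode the chain numerically. The values a = 0.3 and b = 0.1 below are illustrative assumptions only, since the slide leaves a and b symbolic.

```python
import numpy as np

# illustrative values only; the slide leaves a and b symbolic
a, b = 0.3, 0.1

# rows and columns ordered (yes, no)
P = np.array([[1 - a, a],
              [b, 1 - b]])

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

# after many steps the message distribution forgets its starting value
Pn = np.linalg.matrix_power(P, 50)
print(Pn[0])  # approaches (b/(a+b), a/(a+b)) = (0.25, 0.75)
```

The limiting row (b/(a+b), a/(a+b)) is the stationary distribution of this two-state chain, which is why both rows of a high power of P agree.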
More Examples ...
\[
P = \bordermatrix{ & \text{H} & \text{Y} & \text{D} \cr \text{H} & .8 & .2 & 0 \cr \text{Y} & .3 & .4 & .3 \cr \text{D} & .2 & .1 & .7 } .
\]
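The same numerical check works for this three-state chain (state labels H, Y, D as on the slide); a sketch, not part of the original notes:

```python
import numpy as np

# transition matrix from the slide, states ordered (H, Y, D)
P = np.array([[0.8, 0.2, 0.0],
              [0.3, 0.4, 0.3],
              [0.2, 0.1, 0.7]])

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a distribution

# after many steps the starting state no longer matters
P50 = np.linalg.matrix_power(P, 50)
print(P50)  # all three rows agree (the long-run distribution)
```

The rows of P^50 agree because the chain is regular (every entry of P^2 is positive), so powers of P converge to a matrix with identical rows.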
Absorbing Markov Chains

A state $s_i$ is absorbing if it is impossible to leave it (i.e., $p_{ii} = 1$). A Markov chain is absorbing if it has at least one absorbing state and if, from every state, it is possible to reach an absorbing state (not necessarily in one step).
Example: Drunkard’s Walk
[Diagram: states 0, 1, 2, 3, 4 arranged on a line; from each of the interior states 1, 2, 3 the walker moves one step left or right with probability 1/2, and the endpoint states 0 and 4 are absorbing.]
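The walk can also be simulated directly. The sketch below (illustrative; the seed and trial count are arbitrary choices) estimates the expected number of steps to absorption starting from state 2, which later slides compute exactly as 4.

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed for reproducibility

def walk_length(start: int) -> int:
    """Simulate one Drunkard's Walk from `start` until hitting 0 or 4."""
    state, steps = start, 0
    while state not in (0, 4):
        state += rng.choice((-1, 1))  # step left or right with probability 1/2
        steps += 1
    return steps

trials = 20_000
estimate = sum(walk_length(2) for _ in range(trials)) / trials
print(estimate)  # close to the exact value 4
```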
The Transition Matrix
\[
P = \bordermatrix{ & 0 & 1 & 2 & 3 & 4 \cr 0 & 1 & 0 & 0 & 0 & 0 \cr 1 & 1/2 & 0 & 1/2 & 0 & 0 \cr 2 & 0 & 1/2 & 0 & 1/2 & 0 \cr 3 & 0 & 0 & 1/2 & 0 & 1/2 \cr 4 & 0 & 0 & 0 & 0 & 1 } .
\]
Canonical Form
When the states are renumbered so that the transient states come first, $P$ takes the canonical form

\[
P = \bordermatrix{ & \text{TR.} & \text{ABS.} \cr \text{TR.} & Q & R \cr \text{ABS.} & \mathbf{0} & I } .
\]
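In code, putting the canonical form to work just means slicing the reordered transition matrix. A sketch for the Drunkard's Walk, with the transient states 1, 2, 3 listed first and the absorbing states 0, 4 last (the same reordering used later in these notes):

```python
import numpy as np

# Drunkard's Walk, states listed in the order 1, 2, 3, 0, 4
P = np.array([[0.0, 0.5, 0.0, 0.5, 0.0],
              [0.5, 0.0, 0.5, 0.0, 0.0],
              [0.0, 0.5, 0.0, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 0.0, 1.0]])

t = 3  # number of transient states
Q = P[:t, :t]  # transient -> transient block
R = P[:t, t:]  # transient -> absorbing block
print(Q)
print(R)
```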
• The first t states are transient and the last r states are absorbing. Here $Q$ is a $t \times t$ matrix, $R$ is a nonzero $t \times r$ matrix, $\mathbf{0}$ is the $r \times t$ zero matrix, and $I$ is the $r \times r$ identity matrix.
• Recall that the entry $p^{(n)}_{ij}$ of the matrix $P^n$ is the probability of being in the state $s_j$ after $n$ steps, when the chain is started in state $s_i$.

• $P^n$ is of the form

\[
P^n = \bordermatrix{ & \text{TR.} & \text{ABS.} \cr \text{TR.} & Q^n & * \cr \text{ABS.} & \mathbf{0} & I } ,
\]

where $*$ stands for a $t \times r$ block whose exact form is not needed here.
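The decay of the transient-to-transient block can be checked numerically. The sketch below assumes the $Q$ block of the Drunkard's Walk (it is derived explicitly on a later slide) and shows that the entries of $Q^n$ shrink toward 0.

```python
import numpy as np

# Q block of the Drunkard's Walk (assumed here; derived on a later slide)
Q = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])

# the transient-to-transient probabilities die out as n grows
for n in (1, 10, 100):
    print(n, np.linalg.matrix_power(Q, n).max())
```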
Probability of Absorption

Theorem. In an absorbing Markov chain, the probability that the process is eventually absorbed is 1. Equivalently, $Q^n \to \mathbf{0}$ as $n \to \infty$, so every entry of the transient block of $P^n$ tends to 0.
The Fundamental Matrix

For an absorbing Markov chain, the matrix $N = (I - Q)^{-1}$ is called the fundamental matrix. It satisfies $N = I + Q + Q^2 + \cdots$, and the entry $n_{ij}$ of $N$ gives the expected number of times the chain visits the transient state $s_j$, given that it starts in the transient state $s_i$.
Drunkard’s Walk example
With the states listed in the order 1, 2, 3, 0, 4 (transient states first),

\[
P = \bordermatrix{ & 1 & 2 & 3 & 0 & 4 \cr 1 & 0 & 1/2 & 0 & 1/2 & 0 \cr 2 & 1/2 & 0 & 1/2 & 0 & 0 \cr 3 & 0 & 1/2 & 0 & 0 & 1/2 \cr 0 & 0 & 0 & 0 & 1 & 0 \cr 4 & 0 & 0 & 0 & 0 & 1 } .
\]
Then

\[
Q = \begin{pmatrix} 0 & 1/2 & 0 \\ 1/2 & 0 & 1/2 \\ 0 & 1/2 & 0 \end{pmatrix},
\qquad
I - Q = \begin{pmatrix} 1 & -1/2 & 0 \\ -1/2 & 1 & -1/2 \\ 0 & -1/2 & 1 \end{pmatrix}.
\]
Inverting gives

\[
N = (I - Q)^{-1} = \bordermatrix{ & 1 & 2 & 3 \cr 1 & 3/2 & 1 & 1/2 \cr 2 & 1 & 2 & 1 \cr 3 & 1/2 & 1 & 3/2 } .
\]
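The inversion above is easy to verify with a linear-algebra routine; a minimal sketch:

```python
import numpy as np

# transient block Q of the Drunkard's Walk
Q = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])

# fundamental matrix N = (I - Q)^{-1}
N = np.linalg.inv(np.eye(3) - Q)
print(N)  # rows match the slide: (3/2, 1, 1/2), (1, 2, 1), (1/2, 1, 3/2)
```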
Time to Absorption

Let $t_i$ be the expected number of steps before the chain is absorbed, given that it starts in the transient state $s_i$, and let $t$ be the column vector whose $i$th entry is $t_i$. Then

\[
t = Nc ,
\]

where $c$ is a column vector all of whose entries are 1.
Absorption Probabilities

Let $b_{ij}$ be the probability that the chain is absorbed in the absorbing state $s_j$, given that it starts in the transient state $s_i$, and let $B$ be the $t \times r$ matrix with entries $b_{ij}$. Then

\[
B = NR ,
\]

where $N$ is the fundamental matrix and $R$ is the transient-to-absorbing block of the canonical form.
In the Drunkard's Walk example,

\[
N = \bordermatrix{ & 1 & 2 & 3 \cr 1 & 3/2 & 1 & 1/2 \cr 2 & 1 & 2 & 1 \cr 3 & 1/2 & 1 & 3/2 } ,
\]

so

\[
t = Nc = \begin{pmatrix} 3/2 & 1 & 1/2 \\ 1 & 2 & 1 \\ 1/2 & 1 & 3/2 \end{pmatrix}
\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}
= \begin{pmatrix} 3 \\ 4 \\ 3 \end{pmatrix}.
\]
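The same product in code, using the fundamental matrix from the slides:

```python
import numpy as np

# fundamental matrix N for the Drunkard's Walk (from the slides)
N = np.array([[1.5, 1.0, 0.5],
              [1.0, 2.0, 1.0],
              [0.5, 1.0, 1.5]])
c = np.ones(3)

t = N @ c  # expected number of steps to absorption from states 1, 2, 3
print(t)   # [3. 4. 3.]
```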
\[
B = NR = \begin{pmatrix} 3/2 & 1 & 1/2 \\ 1 & 2 & 1 \\ 1/2 & 1 & 3/2 \end{pmatrix}
\begin{pmatrix} 1/2 & 0 \\ 0 & 0 \\ 0 & 1/2 \end{pmatrix}
= \bordermatrix{ & 0 & 4 \cr 1 & 3/4 & 1/4 \cr 2 & 1/2 & 1/2 \cr 3 & 1/4 & 3/4 } .
\]
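And the absorption probabilities, computed the same way:

```python
import numpy as np

# fundamental matrix N and transient-to-absorbing block R (from the slides)
N = np.array([[1.5, 1.0, 0.5],
              [1.0, 2.0, 1.0],
              [0.5, 1.0, 1.5]])
R = np.array([[0.5, 0.0],
              [0.0, 0.0],
              [0.0, 0.5]])  # columns correspond to the absorbing states 0 and 4

B = N @ R  # absorption probabilities from states 1, 2, 3
print(B)   # rows (3/4, 1/4), (1/2, 1/2), (1/4, 3/4)
```

Each row of B sums to 1, consistent with the theorem that absorption is certain.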