
Solution to Problems 2, Markov Chains

Problem 1 Find the state transition matrix P for the Markov chain

[Figure: a three-state chain on {0, 1, 2} with transition probabilities 1/2, 1/4, and 0 as labelled.]

Solution 1 The state transition matrix is

P = [ 1/2  1/2   0  ]
    [ 1/2  1/4  1/4 ]
    [  0   1/2  1/2 ]

Problem 2 Each second, a laptop computer's wireless LAN card reports the state of the radio channel to an access point. The channel may be (0) poor, (1) fair, (2) good, or (3) excellent. In the poor state, the next state is equally likely to be poor or fair. In states 1, 2, and 3, there is a probability 0.9 that the next system state will be unchanged from the previous state and a probability 0.04 that the next system state will be poor. In states 1 and 2, there is a probability 0.06 that the next state is one step up in quality. When the channel is excellent, the next state is either good with probability 0.04 or fair with probability 0.02. Sketch the Markov chain and find the state transition matrix P.

Solution 2 The chain is shown below

[Figure: the four-state chain on {0, 1, 2, 3}.]

and the transition matrix is

P = [ 0.5   0.5   0     0    ]
    [ 0.04  0.9   0.06  0    ]
    [ 0.04  0     0.9   0.06 ]
    [ 0.04  0.02  0.04  0.9  ]

Telecommunication Networks 2005
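As a quick numerical check, each row of a transition matrix must sum to 1. A minimal sketch in plain Python (the state ordering 0 = poor, ..., 3 = excellent follows the problem):

```python
# Transition matrix from Solution 2; states: 0 = poor, 1 = fair, 2 = good, 3 = excellent.
P = [
    [0.50, 0.50, 0.00, 0.00],  # poor: next state equally likely poor or fair
    [0.04, 0.90, 0.06, 0.00],  # fair: 0.04 to poor, 0.9 stay, 0.06 one step up
    [0.04, 0.00, 0.90, 0.06],  # good: 0.04 to poor, 0.9 stay, 0.06 one step up
    [0.04, 0.02, 0.04, 0.90],  # excellent: 0.04 poor, 0.02 fair, 0.04 good, 0.9 stay
]

row_sums = [sum(row) for row in P]
print(row_sums)  # every entry should be 1.0 (up to floating-point rounding)
```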


Problem 3 The random variables ξ1, ξ2, ... are independent and with the common probability mass function

  k           0     1     2     3
  Pr(ξ = k)   0.1   0.3   0.2   0.4

Set X0 = 0, and let Xn = max{ξ1, ξ2, ..., ξn} be the largest value observed to date. Sketch the Markov chain, and determine the transition probability matrix.

Solution 3 The chain is shown below.

[Figure: states 0-3 with self-loop probabilities 0.1, 0.4, 0.6, 1 and upward transitions labelled with the pmf values.]

Since Xn+1 = max{Xn, ξn+1}, the chain stays at state i with probability Pr(ξ ≤ i) and jumps to a state j > i with probability Pr(ξ = j), so

P = [ 0.1  0.3  0.2  0.4 ]
    [ 0    0.4  0.2  0.4 ]
    [ 0    0    0.6  0.4 ]
    [ 0    0    0    1   ]
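The same rule — stay at i with probability Pr(ξ ≤ i), jump to j > i with probability Pr(ξ = j) — generates the matrix directly from the pmf. A small sketch in Python:

```python
# pmf of the i.i.d. draws, Pr(xi = k) for k = 0, 1, 2, 3.
pmf = [0.1, 0.3, 0.2, 0.4]
n = len(pmf)

# X_n = max(X_{n-1}, xi_n): stay at i with Pr(xi <= i), jump to j > i with Pr(xi = j).
P = [[0.0] * n for _ in range(n)]
for i in range(n):
    P[i][i] = sum(pmf[: i + 1])      # Pr(xi <= i)
    for j in range(i + 1, n):
        P[i][j] = pmf[j]             # Pr(xi = j)

for row in P:
    print([round(x, 2) for x in row])
```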


Problem 4 A 3-state Markov chain {Xn, n = 0, 1, ...} has the transition probability matrix

P = [ 0.1  0.2  0.7 ]
    [ 0.2  0.2  0.6 ]
    [ 0.6  0.1  0.3 ]

1. Compute the two-step transition matrix P^(2).
2. What is Pr{X3 = 1 | X1 = 0}?
3. What is Pr{X3 = 1 | X0 = 0}?

Solution 4

1. The two-step transition matrix is

   P^(2) = P^2 = [ 0.47  0.13  0.40 ]
                 [ 0.42  0.14  0.44 ]
                 [ 0.26  0.17  0.57 ]

2. We have Pr{X3 = 1 | X1 = 0} = P^(2)_01, which can be obtained directly from the two-step transition matrix: Pr{X3 = 1 | X1 = 0} = P^(2)_01 = 0.13.

3. We have Pr{X3 = 1 | X0 = 0} = P^(3)_01. Using the Chapman-Kolmogorov equation, we obtain

   P^(3)_01 = Σ_{k=0}^{2} P^(2)_0k P_k1 = P^(2)_00 P_01 + P^(2)_01 P_11 + P^(2)_02 P_21
            = 0.47 × 0.2 + 0.13 × 0.2 + 0.4 × 0.1 = 0.16
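The arithmetic is easy to reproduce; a sketch in plain Python, with a small `matmul` helper standing in for a linear-algebra library:

```python
P = [[0.1, 0.2, 0.7],
     [0.2, 0.2, 0.6],
     [0.6, 0.1, 0.3]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)        # two-step transition matrix
P3 = matmul(P2, P)       # three-step, via Chapman-Kolmogorov

print([round(x, 2) for x in P2[0]])   # [0.47, 0.13, 0.4]
print(round(P2[0][1], 2))             # Pr{X3=1 | X1=0} = 0.13
print(round(P3[0][1], 2))             # Pr{X3=1 | X0=0} = 0.16
```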


Problem 5 Each morning an individual leaves his house and goes for a run. He is equally likely to leave either from his front or back door. Upon leaving the house, he chooses a pair of running shoes (or goes running barefoot if there are no shoes at the door from which he departed). On his return he is equally likely to enter, and leave his running shoes, either by the front or back door. If he owns a total of k pairs of running shoes, what proportion of the time does he run barefooted?

Solution 5 Let Xn be the number of pairs of shoes at the front door. Then Xn ∈ {0, 1, 2, ..., k} is a Markov chain.

[Figure: a birth-death chain on the states 0, 1, 2, ..., k; all the transition probabilities shown equal 0.5.]

Writing the balance equations,

(1/2) π_0 = (1/2) π_1        ⇒  π_0 = π_1
(1/2) π_1 = (1/2) π_2        ⇒  π_1 = π_2
...
(1/2) π_{k-1} = (1/2) π_k    ⇒  π_{k-1} = π_k

Therefore, we have π_0 = π_1 = ... = π_k, and using the normalization equation

π_0 + π_1 + ... + π_{k-1} + π_k = 1

we obtain

π_0 = π_1 = ... = π_k = 1/(k + 1)

He runs barefooted when the door he leaves from has no shoes, which happens with probability (1/2) π_0 + (1/2) π_k. Therefore the proportion of the time he runs barefooted is

(1/2) π_0 + (1/2) π_k = 1/(k + 1)
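The answer 1/(k + 1) can be checked by direct simulation of the daily routine. A sketch under the stated model (the function name and parameters are illustrative):

```python
import random

def barefoot_fraction(k, days=500_000, seed=1):
    """Simulate the runner with k pairs of shoes; return the fraction of barefoot runs."""
    random.seed(seed)
    front = k // 2                      # pairs currently at the front door (start arbitrary)
    barefoot = 0
    for _ in range(days):
        leave_front = random.random() < 0.5
        shoes_here = front if leave_front else k - front
        if shoes_here == 0:
            barefoot += 1               # no shoes at the chosen door
            continue                    # shoe counts are unchanged
        return_front = random.random() < 0.5
        # A pair moves only if he departs by one door and returns by the other.
        if leave_front and not return_front:
            front -= 1
        elif return_front and not leave_front:
            front += 1
    return barefoot / days

k = 4
print(barefoot_fraction(k), 1 / (k + 1))   # simulated vs. theoretical 1/(k+1)
```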


Problem 6 An organization has N employees, where N is a large number. Each employee has one of three possible job classifications and changes classifications (independently) according to a Markov chain with transition probabilities

P = [ 0.7  0.2  0.1 ]
    [ 0.2  0.6  0.2 ]
    [ 0.1  0.4  0.5 ]

What percentage of employees are in each classification?

Solution 6 The limiting probabilities can be obtained by solving any two equations from the set

π_0 = 0.7 π_0 + 0.2 π_1 + 0.1 π_2
π_1 = 0.2 π_0 + 0.6 π_1 + 0.4 π_2
π_2 = 0.1 π_0 + 0.2 π_1 + 0.5 π_2

together with the normalization equation

π_0 + π_1 + π_2 = 1

We obtain

π_0 = 6/17,  π_1 = 7/17,  π_2 = 4/17
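The same limiting probabilities can be reached numerically by power iteration, repeatedly applying π ← πP until the vector stops changing; a minimal sketch:

```python
P = [[0.7, 0.2, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

# Power iteration: for an irreducible aperiodic chain, pi P -> limiting distribution.
pi = [1 / 3] * 3                 # arbitrary starting distribution
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(x, 4) for x in pi])                      # converges to (6/17, 7/17, 4/17)
print([round(x, 4) for x in (6/17, 7/17, 4/17)])
```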


Problem 7 Consider a Markov chain with states {0, 1, 2, 3, 4}. Suppose P_{0,4} = 1, and suppose that when the chain is in state i, i > 0, the next state is equally likely to be any of the states 0, 1, ..., i-1. Find the limiting probabilities of this Markov chain.

Solution 7 The state transition diagram is

[Figure: state 0 moves to state 4 with probability 1; each state i > 0 moves to each of the states 0, ..., i-1 with probability 1/i.]

and the transition matrix

P = [ 0    0    0    0    1 ]
    [ 1    0    0    0    0 ]
    [ 1/2  1/2  0    0    0 ]
    [ 1/3  1/3  1/3  0    0 ]
    [ 1/4  1/4  1/4  1/4  0 ]

The limiting probabilities are obtained by solving the equations

π_0 = π_1 + (1/2) π_2 + (1/3) π_3 + (1/4) π_4
π_1 = (1/2) π_2 + (1/3) π_3 + (1/4) π_4
π_2 = (1/3) π_3 + (1/4) π_4
π_3 = (1/4) π_4
π_4 = π_0

together with the normalization equation

π_0 + π_1 + π_2 + π_3 + π_4 = 1


We obtain

π_4 = π_0
π_3 = (1/4) π_4 = (1/4) π_0
π_2 = (1/3) π_3 + (1/4) π_4 = (1/12) π_0 + (1/4) π_0 = (1/3) π_0
π_1 = (1/2) π_2 + (1/3) π_3 + (1/4) π_4 = (1/6) π_0 + (1/12) π_0 + (1/4) π_0 = (1/2) π_0

therefore

π_0 (1 + 1/2 + 1/3 + 1/4 + 1) = 1  ⇒  π_0 = 12/37

and we get

π_0 = 12/37,  π_1 = 6/37,  π_2 = 4/37,  π_3 = 3/37,  π_4 = 12/37
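The back-substitution above can be reproduced exactly with rational arithmetic; a sketch using Python's `fractions` module:

```python
from fractions import Fraction

# P[0][4] = 1; from state i > 0 the next state is uniform on {0, ..., i-1}.
n = 5
P = [[Fraction(0)] * n for _ in range(n)]
P[0][4] = Fraction(1)
for i in range(1, n):
    for j in range(i):
        P[i][j] = Fraction(1, i)

# Express each pi_i as c[i] * pi_0, exactly as in the solution.
c = [Fraction(0)] * n
c[0] = Fraction(1)
c[4] = c[0]                                # pi_4 = pi_0
c[3] = c[4] * P[4][3]                      # pi_3 = (1/4) pi_4
c[2] = c[3] * P[3][2] + c[4] * P[4][2]     # pi_2 = (1/3) pi_3 + (1/4) pi_4
c[1] = c[2] * P[2][1] + c[3] * P[3][1] + c[4] * P[4][1]

pi0 = 1 / sum(c)                           # normalization: pi_0 * (1 + 1/2 + 1/3 + 1/4 + 1) = 1
pi = [ci * pi0 for ci in c]
print(pi)                                  # 12/37, 6/37, 4/37, 3/37, 12/37
```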


Problem 8 Two teams, A and B, are to play a best-of-seven series of games. Suppose that the outcomes of successive games are independent, and each is won by A with probability p and won by B with probability 1 - p. Let the state of the system be represented by the pair (a, b), where a is the number of games won by A, and b is the number of games won by B. Specify the transition probability matrix, and draw the state-transition diagram. Note that a + b ≤ 7 and that the series ends whenever a = 4 or b = 4.

Solution 8 The state transition diagram is shown below: from each non-terminal state (a, b) with a < 4 and b < 4, the chain moves to (a + 1, b) with probability p (A wins the next game) and to (a, b + 1) with probability 1 - p (B wins); the states with a = 4 or b = 4 are absorbing.

[Figure: the grid of states (a, b), 0 ≤ a ≤ 4, 0 ≤ b ≤ 4, with the edges labelled p and 1 - p.]
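Though the problem only asks for the matrix and the diagram, the same transition structure also gives the series outcome by a short recursion; an illustrative sketch (the function name is mine):

```python
def prob_A_wins(p, a=0, b=0):
    """Probability that A wins the best-of-seven series, starting from state (a, b)."""
    if a == 4:
        return 1.0          # absorbing: A has won the series
    if b == 4:
        return 0.0          # absorbing: B has won the series
    # From (a, b): move to (a+1, b) w.p. p, or to (a, b+1) w.p. 1 - p.
    return p * prob_A_wins(p, a + 1, b) + (1 - p) * prob_A_wins(p, a, b + 1)

print(prob_A_wins(0.5))              # 0.5, by symmetry
print(round(prob_A_wins(0.6), 4))    # 0.7102
```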
