EE226
Fall06
This problem set essentially reviews Markov chains and Poisson processes. Not all exercises are to be turned in: only those marked with the sign F are due on Thursday, November 16th, at the beginning of class. Although the remaining exercises are not graded, you are encouraged to go through them.
We will discuss some of the exercises during discussion sections.
Please feel free to point out errors and notions that need to be clarified.
Exercise 7.1. F
Consider the Markov chain X_n with the transition diagram shown in figure 7.1 (the edges are labeled with probabilities p and 1 − p). Show that the limiting distribution is

    π_j = 1/(M + 1),   j = 0, 1, . . . , M.
Solution
We know that an irreducible and aperiodic Markov chain has a limiting distribution which is the unique solution of π = πP. Thus we only need to check that the distribution given in this exercise is a solution.
Letting π_j = 1/(M + 1), j = 0, 1, . . . , M, we have

    (πP)_j = Σ_i π_i P_ij = (1/(M + 1)) Σ_i P_ij = 1/(M + 1) = π_j,

where Σ_i P_ij = 1 because the transition matrix of the chain in figure 7.1 is doubly stochastic (its columns also sum to 1). This gives the result.
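This check can also be run numerically. The sketch below assumes a particular doubly stochastic chain, a biased walk on a cycle of M + 1 states; the exact chain of figure 7.1 may differ, but any doubly stochastic transition matrix leads to the same conclusion:

```python
import numpy as np

# Assumed example chain: from state j, move to j+1 (mod M+1) w.p. p, stay w.p. 1-p.
# Each column of P sums to 1 (doubly stochastic), so pi_j = 1/(M+1) solves pi = pi P.
M, p = 5, 0.3
P = np.zeros((M + 1, M + 1))
for j in range(M + 1):
    P[j, j] += 1 - p                # stay put
    P[j, (j + 1) % (M + 1)] += p    # step to the next state on the cycle

pi = np.full(M + 1, 1.0 / (M + 1))  # candidate uniform distribution
print(np.allclose(pi @ P, pi))      # True: uniform is invariant
```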
Exercise 7.3. Let π_j denote the long-run proportion of time a given Markov chain is in state j.
(a) Explain why π_j is also the proportion of transitions that are into state j, as well as being the proportion of transitions that are from state j.
(b) π_i P_ij represents the proportion of transitions that satisfy what property?
(c) Σ_i π_i P_ij represents the proportion of transitions that satisfy what property?
(d) Using the preceding, explain why

    π_j = Σ_i π_i P_ij.
Solution: Hint
(a) Over any trajectory, the number of transitions into j, the number of transitions from j, and the number of time periods the process spends in j all differ by at most 1, so their long-run proportions are equal.
(b) π_i P_ij is the long-run proportion of transitions that go from i to j.
(c) Σ_i π_i P_ij is the long-run proportion of transitions into j.
(d) Since π_j equals the long-run proportion of transitions into j, we must have π_j = Σ_i π_i P_ij.
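The counting argument in (a) can be observed directly in simulation. The 3-state chain below is an assumed example, not taken from the exercise; for one simulated trajectory, the three counts for a fixed state j differ by at most 1:

```python
import random

# Hypothetical irreducible 3-state chain: state -> (successors, probabilities).
random.seed(0)
P = {0: ([1, 2], [0.5, 0.5]),
     1: ([0, 2], [0.7, 0.3]),
     2: ([0, 1], [0.4, 0.6])}
n_steps = 200_000
x, j = 0, 1                      # start in state 0; study state j = 1
time_in = into = out_of = 0
for _ in range(n_steps):
    time_in += (x == j)
    out_of += (x == j)           # each period spent in j is followed by one transition out of j
    states, weights = P[x]
    x = random.choices(states, weights)[0]
    into += (x == j)
print(time_in, out_of, into)     # the three counts differ by at most 1
```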
Exercise 7.4. F
This was a prelim question!
A fly wanders around on a graph G with vertices V = {1, . . . , 5} (see figure 7.2; a spider sits at node 3 and node 5 is a window).
(a) Suppose that the fly wanders as follows: if it is at node i at time t, then it chooses one of the neighbors j of i uniformly at random, and then wanders to node j at time t + 1. For times t = 1, 2, . . . , let X_t = {fly's position at time t}. Is X_t a Markov chain? Give your arguments.
(b) Does the probability vector Pr(X_t = i) converge to a limit as the time t increases, independent of the starting position?
(c) Now suppose that a spider is sitting at node 3 and will catch the fly whenever it reaches that node. On the other hand, if the fly reaches the window (node 5), it will escape. What is the probability that the fly escapes, supposing it starts at node 1?
Solution
(a) The position of the fly in the next time step depends only on its current position, so the process is a Markov chain. It is irreducible, finite (thus positive recurrent), and periodic with period 2.
(b) Starting at position 1, the fly can return to that position only after an even number of time steps. Thus P(X_t = 1 | X_0 = 1) = 0 for all odd t. On the other hand, if the initial position is 2, the fly can enter state 1 only after an odd number of time steps, and since the process is positive recurrent we have P(X_t = 1 | X_0 = 2) > 0 for all odd t. Thus, depending on the starting point, we get two different limits: the probability vector does not converge independently of the starting position.
(c) We want to compute P(T_5 < T_3 | X_0 = 1), where T_i = min{t ≥ 0 : X_t = i}.
Conditioning on the first step and using the fact that the process is a Markov chain, we have:

    P(T_5 < T_3 | X_0 = 1)
      = (1/2) P(T_5 < T_3 | X_0 = 2) + (1/2) P(T_5 < T_3 | X_0 = 4)
      = (1/2)[(1/3)(1) + (1/3) P(T_5 < T_3 | X_0 = 1) + (1/3)(0)]
        + (1/2)[(1/2) P(T_5 < T_3 | X_0 = 1) + (1/2)(0)]
      = 1/6 + (5/12) P(T_5 < T_3 | X_0 = 1).

Solving for the unknown gives P(T_5 < T_3 | X_0 = 1) = 2/7.
[Solution fragment for Exercise 7.5; the exercise statement does not appear in this copy. With α denoting the ratio in the first-step recursion:

    P_2 − P_1 = α P_1
    P_3 − P_2 = α(P_2 − P_1) = α^2 P_1
    P_4 − P_3 = α(P_3 − P_2) = α^3 P_1
    . . .
    P_m − P_{m−1} = α^{m−1} P_1,

and P_i = (1 − α^i)/(1 − α^m).]
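The answer 2/7 for Exercise 7.4(c) can be sanity-checked by Monte Carlo. The adjacency list below is inferred from the first-step analysis (1 ~ {2, 4}, 2 ~ {1, 3, 5}, 4 ~ {1, 3}); the actual figure 7.2 may be drawn differently:

```python
import random

# Simulate the fly starting at node 1 until it hits the spider (3) or the window (5).
random.seed(1)
nbrs = {1: [2, 4], 2: [1, 3, 5], 4: [1, 3]}  # inferred adjacency, an assumption

def escapes():
    x = 1
    while x not in (3, 5):
        x = random.choice(nbrs[x])   # uniform step to a neighbor
    return x == 5                    # True if it reached the window first

n = 100_000
est = sum(escapes() for _ in range(n)) / n
print(est, 2 / 7)  # est is close to 2/7 ≈ 0.2857
```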
Exercise 7.6. F
An individual possesses r umbrellas which he employs in going from home to office and vice versa. If he is at home (resp. office) at the beginning (resp. end) of a day and it is raining, then he will take an umbrella with him to the office (resp. home), if there is one to be taken. If it is not raining, then he will not take an umbrella. Assume that, independent of the past, it rains at the beginning (end) of a day with probability p.
(a) Define a Markov chain that will help you determine the proportion of time that our man gets wet.
(b) Show that there exists a limiting distribution, and that it is given by

    π_j = (1 − p)/(r + 1 − p)   if j = 0
    π_j = 1/(r + 1 − p)         if j = 1, . . . , r

(c) What fraction of time does the man get wet?
(d) Given r = 3, what value of p maximizes the fraction of time the man gets wet?
Solution
(a) Let X_n be the number of umbrellas at home at the beginning of day n. Given that number, X_{n+1} depends only on whether or not it rains that day, and this is independent of everything else. Thus X_n is a Markov chain. The transition probabilities are given by:

    P(X_{n+1} = i | X_n = j) =
        1 − p            if j = 0 and i = j
        p                if j = 0 and i = 1
        p(1 − p)         if j = 1, . . . , r − 1 and i ∈ {j − 1, j + 1}
        1 − 2p(1 − p)    if j = 1, . . . , r − 1 and i = j
        p(1 − p)         if j = r and i = r − 1
        1 − p(1 − p)     if j = r and i = r
(b) The Markov chain is finite, irreducible, and aperiodic. It is therefore positive recurrent and admits a unique invariant distribution, which is the solution of the balance equations

    π_j = (1 − 2p(1 − p)) π_j + p(1 − p) π_{j+1} + p(1 − p) π_{j−1},   j = 1, . . . , r − 1.

For r we have

    π_r = p(1 − p) π_{r−1} + (1 − p(1 − p)) π_r  ⟹  π_r = π_{r−1}.

Using the value of π_{r−1} in the balance equation for j = r − 1 we get

    π_{r−1} = (1/2)[π_{r−2} + π_r] = (1/2)[π_{r−2} + π_{r−1}]  ⟹  π_{r−1} = π_{r−2}.

Iterating, we get:

    π_r = π_{r−1} = · · · = π_1.
Thus

    π_1 = π_2 = · · · = π_r = 1/(1 − p + r),

and, from the balance equation at state 0 (π_0 = (1 − p)π_0 + p(1 − p)π_1, i.e. π_0 = (1 − p)π_1) together with normalization,

    π_0 = (1 − p)/(1 − p + r).

(c) Our man gets wet whenever there is no umbrella at home (w.p. π_0) and it rains in the morning (w.p. p), and whenever it rains in the afternoon (w.p. p) and all the umbrellas are at home, i.e. it did not rain in the morning (w.p. (1 − p)π_r). Thus

    Pr(wet) = p π_0 + p(1 − p) π_r = 2p(1 − p)/(1 − p + r) = Pr(p).
(d) For r = 3, Pr(p) = 2p(1 − p)/(4 − p). We maximize this probability by setting the derivative equal to zero:

    d/dp Pr(p) = 2(p^2 − 8p + 4)/(4 − p)^2 = 0  ⟹  p = 4 − 2√3 ≈ 0.536.
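Parts (b) and (d) can be verified numerically. The sketch below builds the transition matrix from part (a), checks that the claimed π is stationary, and scans Pr(p) on a grid (the parameter values are arbitrary):

```python
import numpy as np

def umbrella_P(r, p):
    """Transition matrix of the umbrella chain from part (a)."""
    P = np.zeros((r + 1, r + 1))
    P[0, 0], P[0, 1] = 1 - p, p
    for j in range(1, r):
        P[j, j - 1] = P[j, j + 1] = p * (1 - p)
        P[j, j] = 1 - 2 * p * (1 - p)
    P[r, r - 1] = p * (1 - p)
    P[r, r] = 1 - p * (1 - p)
    return P

r, p = 3, 0.4
pi = np.append([(1 - p) / (r + 1 - p)], np.full(r, 1 / (r + 1 - p)))
print(np.allclose(pi @ umbrella_P(r, p), pi))   # True: pi is stationary

# Grid search for the maximizer of Pr(p) = 2p(1-p)/(4-p) when r = 3.
ps = np.linspace(0.01, 0.99, 9801)
wet = 2 * ps * (1 - ps) / (4 - ps)
print(ps[np.argmax(wet)], 4 - 2 * np.sqrt(3))   # both near 0.536
```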
Exercise 7.7. F
Let X be a random variable with cdf F_X(x) and pdf f_X(x). We define the rate function r_X(t) of X as:

    r_X(t) = f_X(t)/(1 − F_X(t)).

(a) Show that r_X(t) uniquely determines the distribution of X.
Hint: Compute P{X ∈ (t, t + δ) | X > t}.
(b) What is r_X(t) when X ∼ Expo(λ)?
Solution:
(a) We will show that the distribution of X can be recovered from the rate function. Note that

    P(X ∈ (t, t + dt) | X > t) = P(X ∈ (t, t + dt), X > t)/P(X > t)
                               = f_X(t) dt/(1 − F_X(t))
                               = r_X(t) dt,

so that

    dF_X(t)/(1 − F_X(t)) = r_X(t) dt.

Integrating both sides from 0 to s yields

    F_X(s) = 1 − exp( − ∫_0^s r_X(t) dt ),

which expresses F_X in terms of r_X alone.
(b) For an exponential random variable with rate λ, r_X(t) = λe^{−λt}/e^{−λt} = λ: the rate function is constant and equal to the rate λ.
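The inversion formula can be checked numerically for the exponential case of part (b), where r_X(t) = λ is constant (the rate, evaluation point, and grid size below are arbitrary):

```python
import numpy as np

lam, s = 2.0, 1.5
t = np.linspace(0, s, 100_001)
r = np.full_like(t, lam)                       # constant rate function of part (b)
integral = (r[:-1] * np.diff(t)).sum()         # Riemann sum of int_0^s r_X(t) dt
F_rec = 1 - np.exp(-integral)                  # cdf recovered from the rate function
F_true = 1 - np.exp(-lam * s)                  # exponential cdf at s
print(F_rec, F_true)                           # both equal 1 - e^{-3} ≈ 0.9502
```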
Exercise 7.8. F
Let X_1 and X_2 be independent exponential random variables, each having rate λ. Let X_(1) = min{X_1, X_2} and X_(2) = max{X_1, X_2}. Compute:
(a) E[X_(1)]
(b) Var[X_(1)]
(c) E[X_(2)]
(d) Var[X_(2)]
Solution: Hint
These quantities can be computed using results given in class and in the notes:
(a) E[X_(1)] = 1/(2λ)
(b) Var[X_(1)] = 1/(4λ^2)
(c) E[X_(2)] = 3/(2λ)
(d) Var[X_(2)] = 5/(4λ^2)
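These four values can be confirmed by Monte Carlo. The sketch below takes λ = 1, so the targets are 1/2, 1/4, 3/2, and 5/4 (the sample size and seed are arbitrary):

```python
import random
import statistics

# Sample min and max of two iid Expo(1) variables many times.
random.seed(2)
n = 200_000
mins, maxs = [], []
for _ in range(n):
    x1, x2 = random.expovariate(1.0), random.expovariate(1.0)
    mins.append(min(x1, x2))
    maxs.append(max(x1, x2))
print(statistics.mean(mins), statistics.variance(mins))  # ≈ 0.5, 0.25
print(statistics.mean(maxs), statistics.variance(maxs))  # ≈ 1.5, 1.25
```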
Exercise 7.9. F
Suppose that you are the n-th student arriving in front of the GSI's office and somebody is already inside asking a question (thus there are n + 1 students). Whenever the GSI finishes his discussion with one student, the next in line enters the office and asks another question. We assume that the time it takes to complete a question is exponential with mean μ and independent of everything else. Since it's a hot day, students will wait in line only for an exponential random time with mean θ, independently of the other students (if they are not served after that time, they leave!).
We assume that the office hour is long enough for this problem (maybe infinite!).
(a) Find the probability P_n that you eventually get inside the office.
(b) Find the conditional expected amount of time W_n you spend waiting in line, given that you eventually get inside the office.
Solution:
The key point in this exercise is to condition on the first event after the student's arrival and to use recursion. Note that we are dealing with n + 1 independent exponentials. To simplify the notation we let μ_0 = 1/μ and θ_0 = 1/θ denote the rates of the exponentials.
(a) If the first event corresponds to the expiration of the exponential time of the n-th student, then the desired probability is equal to zero. Else (and this happens with probability ((n − 1)θ_0 + μ_0)/(nθ_0 + μ_0)), because of the memoryless property of the exponential distribution, the system re-starts after the first departure and our student is the (n − 1)-th in the new system. Thus we have:

    P_n = [((n − 1)θ_0 + μ_0)/(nθ_0 + μ_0)] P_{n−1}.

Using the above result with n − 1 replacing n gives

    P_n = [((n − 1)θ_0 + μ_0)/(nθ_0 + μ_0)] [((n − 2)θ_0 + μ_0)/((n − 1)θ_0 + μ_0)] P_{n−2}
        = [((n − 2)θ_0 + μ_0)/(nθ_0 + μ_0)] P_{n−2}
        = · · ·
        = [(θ_0 + μ_0)/(nθ_0 + μ_0)] P_1
        = μ_0/(nθ_0 + μ_0),

because P_1 = μ_0/(θ_0 + μ_0) (the probability that the student inside the office leaves first).
(b) Note that the time until the n-th student gets inside the office is equal to the minimum of the n + 1 exponential random variables plus some additional time. But the minimum is exponential with mean 1/(nθ_0 + μ_0). Since all the random variables are memoryless, we have (considering the time of the first departure)

    W_n = 1/(nθ_0 + μ_0) + W_{n−1} = Σ_{i=1}^n 1/(iθ_0 + μ_0).
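The telescoping step in (a) can be double-checked numerically; mu0 and th0 below are arbitrary assumed rates:

```python
# Verify that the recursion P_n = ((n-1)*th0 + mu0)/(n*th0 + mu0) * P_{n-1},
# started from P_1 = mu0/(th0 + mu0), telescopes to P_n = mu0/(n*th0 + mu0).
mu0, th0 = 1.7, 0.6          # assumed service and impatience rates
P = mu0 / (th0 + mu0)        # P_1
for n in range(2, 11):
    P = ((n - 1) * th0 + mu0) / (n * th0 + mu0) * P
    closed = mu0 / (n * th0 + mu0)
    assert abs(P - closed) < 1e-12
print("recursion matches mu0/(n*th0 + mu0) for n = 2..10")
```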
(Solution of Exercise 7.10; the statement and the opening lines of the solution fall on a page missing from this copy. Write g(n, t) = P(N(t) = n).)
Conditioning on a small time increment δ and using the isolated-jumps property,

    g(n, t + δ) − g(n, t) = −λδ g(n, t) + λδ g(n − 1, t) + o(δ).

Letting δ → 0 we have

    ∂g(n, t)/∂t = −λ g(n, t) + λ g(n − 1, t).

This differential equation can be solved recursively. First, for n = 0 we have

    ∂g(0, t)/∂t = −λ g(0, t),

because g(−1, t) = 0 (there cannot be a negative number of events). Hence solving this equation leads to

    g(0, t) = A e^{−λt}.

Since N(0) = 0 we have g(0, 0) = 1, thus A = 1. Assuming that g(n − 1, t) is the pmf of a Poisson random variable with mean λt, one can easily show that

    g(n, t) = ((λt)^n / n!) e^{−λt}.
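The differential equations can also be integrated numerically (forward Euler) and the result compared against the Poisson pmf; the rate, horizon, and step count below are arbitrary:

```python
import math

# Euler integration of dg(n,t)/dt = -lam*g(n,t) + lam*g(n-1,t), g(n,0) = [n == 0].
lam, T, steps, N = 1.5, 2.0, 200_000, 6
dt = T / steps
g = [1.0] + [0.0] * N                       # g[n] at t = 0
for _ in range(steps):
    new = [g[0] - lam * g[0] * dt]          # n = 0 equation (g[-1] = 0)
    for n in range(1, N + 1):
        new.append(g[n] + (-lam * g[n] + lam * g[n - 1]) * dt)
    g = new
exact = [(lam * T) ** n * math.exp(-lam * T) / math.factorial(n)
         for n in range(N + 1)]             # Poisson(lam*T) pmf
print(max(abs(a - b) for a, b in zip(g, exact)))  # small Euler discretization error
```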
(b) If the isolated-jumps property is removed, the argument above is no longer valid. Indeed, the isolated-jumps property is necessary for a Poisson process.
Exercise 7.11. F
In class (also in the lecture notes) we have defined the Poisson process starting from iid exponential random variables. Here is another definition:
The counting process {N(t), t ≥ 0} is said to be a Poisson process having rate λ if:
i. N(0) = 0
ii. The process has independent increments
iii. The number of events in any interval of length t is Poisson distributed with mean λt
Show that the last definition and the one given in class are equivalent.
Solution: Hint
First note that, as we have seen in class, any Poisson process has the 3 properties. Now we want to show that any process with the 3 properties is a Poisson process (in the sense of the definition given in class), i.e. that such a counting process is driven by iid exponential inter-arrival times.
The independence is given by property (ii).
Since N(0) = 0, the probability that the first event occurs after t time units is

    Pr(Poisson(λt) = 0) = e^{−λt},

which implies that the arrival time of the first event is exponential with rate λ.
Given that the first event occurred at time T_1, the probability that the second event occurs after T_1 + t is also e^{−λt} (using (ii) and (iii)), independently of the actual value of T_1.
Repeating this scheme, we see that the inter-arrival times are iid exponential random variables with rate λ.
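The equivalence can be illustrated in the other direction as well: building N(t) from iid Expo(λ) inter-arrival times and checking that the count over [0, t] has the Poisson mean and variance λt (the parameters below are arbitrary):

```python
import random

# Count arrivals in [0, t] of a renewal process with Expo(lam) inter-arrival times.
random.seed(3)
lam, t, reps = 2.0, 3.0, 100_000
counts = []
for _ in range(reps):
    s, n = random.expovariate(lam), 0
    while s <= t:
        n += 1
        s += random.expovariate(lam)   # add the next iid exponential gap
    counts.append(n)
mean = sum(counts) / reps
var = sum((c - mean) ** 2 for c in counts) / reps
print(mean, var)  # both ≈ lam * t = 6.0, as a Poisson(lam*t) count requires
```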