Markov Chain Models


1 Introduction
Let T ⊆ [0, ∞) and let S be a countable set. We usually think of T as a set of
possible times and S as a set of possible states of a system. Let (Ω, A, Pr) be
a probability model, and let {X(t), t ∈ T} be a set of random variables defined
on this model. We think of the event {X(t) = s} as the event that the process
is in state s at time t. We are interested in studying situations where it makes
sense to assume the following. For every finite sequence

  t_0 < t_1 < ... < t_n < t_{n+1}

of elements of T and every finite sequence s_0, s_1, ..., s_n, s_{n+1} of elements of S,
if Pr(X(t_n) = s_n, ..., X(t_0) = s_0) > 0, then

  Pr(X(t_{n+1}) = s_{n+1} | X(t_n) = s_n, ..., X(t_0) = s_0)
      = Pr(X(t_{n+1}) = s_{n+1} | X(t_n) = s_n).                              (1)
The relation (1) is called the Markov Property, named after A. A. Markov
(June 14, 1856 N.S. – July 20, 1922) of Markov inequality fame.
2 Note on notation
If it does not result in double subscripts we will write X_t in place of X(t); for
example, X_5 instead of X(5). Since X(t) is a random variable, it maps Ω into S.
Technically, X : T × Ω → S, but we almost always take this for granted.
3 Interpretation
A good way to think about the Markov Property is that it says that the condi-
tional probability of X(t_{n+1}) depends only on the value of X(t) for the last time
t that you observed the process. In what follows we give a number of examples.
4 Balls in urns
Suppose we have two urns, each containing 10 balls. Altogether there are 10
white balls and 10 black balls. Once a minute, a ball is selected at random from
each urn and placed into the opposite urn. We wish to keep track of the number
of balls of each color in each urn. This is a simplified model of gas dynamics,
where the colored balls represent two different types of gas molecules, and the
urns represent the left and right sides of a near-vacuum chamber.
To model this, paint one urn red, the other green, and observe that if we
know how many white balls are in the red urn, we know everything. Let X_n
be the number of white balls in the red urn after n switches. It makes sense to
assume that the Markov Property holds if we assume that each time we switch
the balls we are choosing the balls at random from their respective urns. This
might not be the case if the balls do not get mixed together each time there is
a switch. Note that assuming the selections are at random allows us to see that
for k ∈ {0, 1, ..., 10}

  Pr(X_{n+1} = k − 1 | X_n = k) = (k/10) · (k/10)

  Pr(X_{n+1} = k | X_n = k) = (k/10) · ((10 − k)/10) + ((10 − k)/10) · (k/10)

  Pr(X_{n+1} = k + 1 | X_n = k) = ((10 − k)/10) · ((10 − k)/10)

and if j − k ∈ {−1, 0, 1}^c then Pr(X_{n+1} = j | X_n = k) = 0. Assuming the Markov
Property allows us, in principle, to find the mass function of X_n for any positive
integer n if we know the mass function of X_0. For example,

  Pr(X_2 = c)
    = Σ_{a=0}^{10} Σ_{b=0}^{10} Pr(X_2 = c, X_1 = b, X_0 = a)
    = Σ_{a=0}^{10} Σ_{b=0}^{10} Pr(X_2 = c | X_1 = b, X_0 = a) Pr(X_1 = b, X_0 = a)
    = Σ_{a=0}^{10} Σ_{b=0}^{10} Pr(X_2 = c | X_1 = b) Pr(X_1 = b, X_0 = a)
    = Σ_{a=0}^{10} Σ_{b=0}^{10} Pr(X_2 = c | X_1 = b) Pr(X_1 = b | X_0 = a) Pr(X_0 = a).
We will see below how to write this more succinctly using matrix algebra.
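As a preview of that matrix point of view, here is a minimal Python sketch that
builds the 11 × 11 transition matrix of the urn chain from the probabilities above
and pushes a starting mass function two steps forward. The function name and the
choice of starting state are illustrative, not part of the model itself.

import numpy as np

def urn_transition_matrix(n_balls=10):
    """P[k, j] = Pr(X_{m+1} = j | X_m = k) for the two-urn model.
    State k = number of white balls in the red urn; each urn holds n_balls."""
    P = np.zeros((n_balls + 1, n_balls + 1))
    for k in range(n_balls + 1):
        w = k / n_balls                 # chance the ball leaving the red urn is white
        g = (n_balls - k) / n_balls     # chance the ball leaving the green urn is white
        if k > 0:
            P[k, k - 1] = w * (1 - g)   # red urn loses a white, gains a black
        P[k, k] = w * g + (1 - w) * (1 - g)   # white for white, or black for black
        if k < n_balls:
            P[k, k + 1] = (1 - w) * g   # red urn keeps its whites and gains one
    return P

P = urn_transition_matrix()
pi0 = np.zeros(11)
pi0[5] = 1.0            # start with 5 white balls in the red urn, say
pi2 = pi0 @ P @ P       # mass function of X_2, matching the double sum above
print(np.round(pi2, 4), pi2.sum())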
4.1 Exercise
Suppose that each urn holds 15 balls, but there are now 20 black balls and 10
white balls. If X_n is the number of white balls in the red urn, find
Pr(X_{n+1} = k | X_n = j) for j, k ∈ {0, 1, ..., 10}.
4.2 Exercise
Suppose each urn holds 2 balls and there are 2 white balls and 2 black balls. Let
X_n be the number of white balls in the red urn. Make a 3 × 3 matrix P where
P_{j,k} = Pr(X_{n+1} = k | X_n = j).
5 Gambler's Ruin
Fred and Ethel engage in a simple game of chance. At each play they toss a
coin. If it comes up heads, Ethel gives Fred one dollar. If it comes up tails,
Fred gives Ethel one dollar. They play until one of them has no money. If the
coin tosses are mutually independent and the probability that the coin comes
up heads is p ∈ (0, 1):

  • Who winds up with all the money?
  • How long does the game go on?

Suppose that Fred starts out with F dollars and Ethel starts out with E dollars.
If we let E_n denote how much money Ethel has after n plays and F_n denote
the amount of money Fred has, then E_n + F_n = E + F. We let the state space
be {0, 1, 2, ..., E + F} and let T be the non-negative integers. We have

  Pr(E_{n+1} = 0 | E_n = 0) = 1,
  Pr(E_{n+1} = E + F | E_n = E + F) = 1,

and if x ∈ {1, ..., E + F − 1} then

  Pr(E_{n+1} = x + 1 | E_n = x) = 1 − p,
  Pr(E_{n+1} = x − 1 | E_n = x) = p.
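Before the exercises, here is a small simulation sketch of this game in Python. The
starting split of the money (5 dollars each) and the fair coin are illustrative choices,
not part of the model; they let us estimate who ends up with the money and how
long the game lasts.

import random

def play_gamblers_ruin(ethel_start, total, p, rng):
    """Simulate one game; return (Ethel's final fortune, number of plays).
    Heads (probability p) costs Ethel a dollar, tails gives her one."""
    e, plays = ethel_start, 0
    while 0 < e < total:
        e += -1 if rng.random() < p else 1
        plays += 1
    return e, plays

rng = random.Random(0)
# Estimate Pr(Ethel ends up with everything) and the mean game length for E = F = 5, p = 1/2.
results = [play_gamblers_ruin(5, 10, 0.5, rng) for _ in range(10_000)]
ethel_wins = sum(1 for e, _ in results if e == 10) / len(results)
mean_length = sum(n for _, n in results) / len(results)
print(ethel_wins, mean_length)   # roughly 0.5 and 25 for a fair coin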
5.1 Exercise
Suppose between them Fred and Ethel have 4 dollars. Make a 5 × 5 matrix P
where

  P_{j,k} = Pr(E_{n+1} = k | E_n = j).
5.2 Exercise
Let W be the event that eventually Ethel has all the money. Suppose that
between them Fred and Ethel have $10. Let f(n) = Pr(W | Ethel has n dollars).
Explain why

  f(0) = 0;
  f(10) = 1;
  f(n) = p f(n − 1) + (1 − p) f(n + 1) if n ∈ {1, 2, ..., 9}.

Now show that if p = 1/2 then f(n) = n/10 is a solution of this problem.
6 Gambler's ruin continued
Suppose Fred and Ethel cannot agree on who should supply the coin. Fred
has a coin with probability p ∈ (0, 1) of heads and Ethel one with probability
q ∈ (0, 1) of heads. They agree to change the game as follows. If Fred's coin
comes up heads and Ethel's comes up tails, then Ethel gives Fred a dollar. If
Ethel's coin comes up heads and Fred's comes up tails, Fred gives Ethel a dollar.
If the coins match, no money changes hands. All other rules are the same, and
E_n is Ethel's fortune after n plays. We still have

  Pr(E_{n+1} = 0 | E_n = 0) = 1,
  Pr(E_{n+1} = E + F | E_n = E + F) = 1,

but now, for x ∈ {1, ..., E + F − 1},

  Pr(E_{n+1} = x + 1 | E_n = x) = (1 − p)q,
  Pr(E_{n+1} = x | E_n = x) = pq + (1 − p)(1 − q),
  Pr(E_{n+1} = x − 1 | E_n = x) = p(1 − q).
6.1 Exercise
Suppose between them Fred and Ethel have 4 dollars. Make a 5 × 5 matrix P
where

  P_{j,k} = Pr(E_{n+1} = k | E_n = j).
7 A branching model
Let N stand for the positive integers. Suppose that {C_{n,k} : n ∈ N, k ∈ N}
is a collection of mutually independent random variables, each taking values
in the non-negative integers. Assume that for each n, the random variables
C_{n,1}, C_{n,2}, ... all have the same mass function. For any sequence x_k we make
the convention that

  Σ_{k=1}^{0} x_k := 0.

Now, put X_0 = 1 and for each n ≥ 0,

  X_{n+1} = Σ_{k=1}^{X_n} C_{n+1,k}.

For example,

  X_1 = Σ_{k=1}^{X_0} C_{1,k} = C_{1,1},    X_2 = Σ_{k=1}^{X_1} C_{2,k},
so that if X_1 = 3 then

  X_2 = C_{2,1} + C_{2,2} + C_{2,3}.

The idea is that we start with X_0 = 1 organism that reproduces asexually.
It has C_{1,1} like organisms, giving us X_1 = C_{1,1} organisms in generation 1. If
X_1 = 3 then each of these 3 organisms has offspring, numbering C_{2,1}, C_{2,2} and
C_{2,3} respectively, giving X_2 = C_{2,1} + C_{2,2} + C_{2,3} organisms in generation 2,
and so on. This was proposed as a model of the male line of descent in families by
Galton and Watson in 1874 in a paper entitled On the probability of extinction
of families.
We can easily compute E[X_n] provided E[C_{n,k}] = c < ∞. For simplicity
of the argument, suppose that Pr(C_{n,k} ≤ m) = 1 for some m > 0. Then
Pr(X_n ≤ m^n) = 1. Next, for any real number x, define I_x : (−∞, ∞) → {0, 1}
by I_x(x) = 1 and I_x(u) = 0 if u ≠ x. Then

  Σ_{N=0}^{m^n} I_N(X_n) = 1.

Therefore

  E[X_{n+1}] = E[ X_{n+1} Σ_{N=0}^{m^n} I_N(X_n) ] = Σ_{N=0}^{m^n} E[ X_{n+1} I_N(X_n) ].

Now, observe that

  X_{n+1} I_N(X_n) = ( Σ_{k=1}^{X_n} C_{n+1,k} ) I_N(X_n) = ( Σ_{k=1}^{N} C_{n+1,k} ) I_N(X_n),

which expresses the left-hand side as the product of independent random vari-
ables. Therefore

  E[ X_{n+1} I_N(X_n) ] = E[ ( Σ_{k=1}^{N} C_{n+1,k} ) I_N(X_n) ]
    = E[ Σ_{k=1}^{N} C_{n+1,k} ] E[ I_N(X_n) ]
    = ( Σ_{k=1}^{N} E[C_{n+1,k}] ) Pr(X_n = N)
    = Nc Pr(X_n = N).

This shows us that

  E[X_{n+1}] = Σ_{N=0}^{m^n} Nc Pr(X_n = N) = c E[X_n]
since Pr(X_n ≤ m^n) = 1. Therefore, by iteration, E[X_{n+1}] = c^{n+1}. Since
E[X_0] = 1 we have shown that

  E[X_n] = c^n.

Hence, from Markov's inequality,

  Pr(X_n ≠ 0) = Pr(X_n ≥ 1) ≤ E[X_n] = c^n.

In particular, if c < 1 then Pr(X_n ≠ 0) → 0 as n → ∞. This makes sense: if
we are on average having fewer than one offspring per parent, we would expect
the family to die out. It remains to be seen what happens if c ≥ 1.
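A short simulation sketch makes the result E[X_n] = c^n concrete. The offspring
distribution below (0, 1 or 2 children with probabilities 0.4, 0.4, 0.2, so c = 0.8)
is our own illustrative choice, not something fixed by the model.

import random

rng = random.Random(1)

def offspring():
    """Offspring count of one parent: 0, 1 or 2 with probabilities 0.4, 0.4, 0.2 (mean c = 0.8)."""
    u = rng.random()
    return 0 if u < 0.4 else (1 if u < 0.8 else 2)

def generation_sizes(n_generations):
    """X_0 = 1 and X_{k+1} is the total number of offspring of the X_k current individuals."""
    x, sizes = 1, [1]
    for _ in range(n_generations):
        x = sum(offspring() for _ in range(x))
        sizes.append(x)
    return sizes

trials = [generation_sizes(10) for _ in range(20_000)]
for n in (1, 3, 6, 10):
    mean_xn = sum(t[n] for t in trials) / len(trials)
    extinct = sum(1 for t in trials if t[n] == 0) / len(trials)
    print(f"n = {n}: E[X_n] ~ {mean_xn:.3f} (c^n = {0.8 ** n:.3f}), Pr(X_n = 0) ~ {extinct:.3f}")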
7.1 Exercise
Let B_n = {X_n = 0}. Explain why B_n ⊆ B_{n+1} and why the sequence
Pr(X_n = 0) is convergent.
8 A queuing model
Let T and S be the non-negative integers. Suppose that the states indicate the
number of customers in a bank. We assume that once a minute one of three
things happens:

  • A customer leaves the bank,
  • A customer enters the bank,
  • No one enters or leaves the bank.

Let C_n be the number of customers at minute n, and for each non-negative
integer k suppose that b_k + r_k + d_k = 1, where b_k and d_k are positive while
r_k is non-negative. Assume

  Pr(C_{n+1} = k + 1 | C_n = k) = b_k
  Pr(C_{n+1} = k | C_n = k) = r_k
  Pr(C_{n+1} = k − 1 | C_n = k) = d_k

We might assume, for example, that b_k = b for all k. Can we compute E[C_n]
or Var[C_n]?
Such a process is also called a birth and death process.
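Here is a rough simulation sketch of such a chain, assuming constant rates b_k = b
and d_k = d and with the convention that a departure attempted at state 0 is simply
ignored; the particular values of b and d are illustrative only, and the Monte Carlo
averages stand in for the E[C_n] and Var[C_n] we have not yet computed.

import random

def simulate_queue(n_minutes, b=0.3, d=0.4, start=0, rng=None):
    """Birth-death chain with constant rates: up one with probability b, down one with
    probability d (when customers are present; at state 0 the chain just stays put),
    and otherwise unchanged.  Returns the path C_0, C_1, ..., C_{n_minutes}."""
    rng = rng or random.Random(2)
    c, path = start, [start]
    for _ in range(n_minutes):
        u = rng.random()
        if u < b:
            c += 1
        elif u < b + d and c > 0:
            c -= 1
        path.append(c)
    return path

# Monte Carlo estimates of E[C_n] and Var[C_n] at n = 200.
rng = random.Random(2)
finals = [simulate_queue(200, rng=rng)[-1] for _ in range(5_000)]
mean = sum(finals) / len(finals)
var = sum((x - mean) ** 2 for x in finals) / len(finals)
print(mean, var)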
9 Queuing model continued
It is unrealistic to suppose that the bank could hold an unlimited number of
people, so we might assume that for some N > 0, b_k = 0 for all k ≥ N. In
biological models, N would be the carrying capacity of the ecosystem.
10 The pure death model
Modify the queuing model to have b_k = 0 for all k. By choosing suitable values
for d_k we can model radioactive decay. What we need is that d_k should be
proportional to k, and the initial state would be the total number of atoms of
the decaying element present at the start of our observations.
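As a sketch, assume a per-step decay probability d_k = μk with μ chosen small
enough that μk ≤ 1 for every reachable state; then the expected count decays
geometrically, which is the usual exponential-decay picture. The starting count
and the value of μ below are illustrative.

import random

def decay_path(n_atoms=100, mu=0.005, n_steps=1000, seed=3):
    """Pure death chain: from state k, step to k - 1 with probability d_k = mu * k
    (mu is chosen so that mu * k <= 1), otherwise stay.  Returns X_0, X_1, ..., X_{n_steps}."""
    rng = random.Random(seed)
    k, path = n_atoms, [n_atoms]
    for _ in range(n_steps):
        if rng.random() < mu * k:
            k -= 1
        path.append(k)
    return path

path = decay_path()
# At most one atom decays per step and E[X_{t+1} | X_t] = (1 - mu) X_t,
# so E[X_t] = 100 (1 - mu)^t: the familiar exponential decay curve.
for t in (0, 200, 500, 1000):
    print(t, path[t], round(100 * (1 - 0.005) ** t, 1))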
11 A catastrophic failure model
Imagine we are walking along the non-negative integers. At each integer there
is a coin, and the coin at n comes up heads with probability p_n ∈ (0, 1). When
we are at any integer, we toss the coin we find there. If it comes up heads, we
go to the next bigger integer. If it comes up tails, we go to 0. If we are at 0 we
just stay there. If we model this as a Markov chain, we have

  Pr(X_{n+1} = 1 | X_n = 0) = p_0
  Pr(X_{n+1} = 0 | X_n = 0) = 1 − p_0
  Pr(X_{n+1} = x + 1 | X_n = x) = p_x      if x > 0
  Pr(X_{n+1} = 0 | X_n = x) = 1 − p_x      if x > 0
In this set-up, one question is whether there is a positive probability that if we
start at 0, we never return there. Notice that if Pr(X_0 = 0) = 1, then

  Pr(X_0 = 0, X_1 = 0)
    = Pr(X_1 = 0 | X_0 = 0) Pr(X_0 = 0) = 1 − p_0,

  Pr(X_0 = 0, X_1 ≠ 0, X_2 = 0)
    = Pr(X_0 = 0, X_1 = 1, X_2 = 0)
    = Pr(X_2 = 0 | X_1 = 1, X_0 = 0) Pr(X_1 = 1 | X_0 = 0) Pr(X_0 = 0)
    = Pr(X_2 = 0 | X_1 = 1) Pr(X_1 = 1 | X_0 = 0) Pr(X_0 = 0)       [use the Markov property]
    = (1 − p_1) p_0,

  Pr(X_0 = 0, X_1 ≠ 0, X_2 ≠ 0, X_3 = 0)
    = Pr(X_0 = 0, X_1 = 1, X_2 = 2, X_3 = 0)
    = Pr(X_3 = 0 | X_2 = 2, X_1 = 1, X_0 = 0) Pr(X_2 = 2 | X_1 = 1, X_0 = 0)
        × Pr(X_1 = 1 | X_0 = 0) Pr(X_0 = 0)
    = Pr(X_3 = 0 | X_2 = 2) Pr(X_2 = 2 | X_1 = 1) Pr(X_1 = 1 | X_0 = 0) Pr(X_0 = 0)
                                                                    [use the Markov property]
    = (1 − p_2) p_1 p_0.

We can see in this fashion that for n ≥ 2,

  Pr(X_0 = 0, X_1 ≠ 0, ..., X_{n−1} ≠ 0, X_n = 0) = (1 − p_{n−1}) p_{n−2} ··· p_0.
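These first-return probabilities settle the question: their sum telescopes to
1 − lim_n p_{n−1} ··· p_0, so we never return to 0 exactly with probability equal to
the infinite product of the p_k. The sketch below checks this numerically for two
illustrative choices of p_n; both choices are ours, not part of the model.

def never_return_probability(p, n_terms=10_000):
    """Pr(starting at 0 we never come back to 0) is the infinite product p_0 p_1 p_2 ...,
    here truncated after n_terms factors; the running total accumulates the first-return
    probabilities (1 - p_n) p_{n-1} ... p_0 derived above, and the two add up to 1."""
    prod, total_return = 1.0, 0.0
    for n in range(n_terms):
        total_return += (1 - p(n)) * prod   # first return to 0 at time n + 1
        prod *= p(n)
    return prod, total_return

# Two illustrative choices of p_n:
print(never_return_probability(lambda n: 0.5))                 # product -> 0: return is certain
print(never_return_probability(lambda n: 1 - 1 / (n + 2)**2))  # product -> 1/2: escape happens half the time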
12 Random walks
Suppose that Y_1, Y_2, ... are mutually independent random variables taking val-
ues in the integers. Let T be the non-negative integers, let S be the integers.
Put X_0 = 0 and X_{n+1} = X_n + Y_{n+1} for n ∈ {0, 1, 2, ...}. This means that

  X_{n+1} = Σ_{k=1}^{n+1} Y_k.
Suppose that Pr(X(t_n) = s_n, ..., X(t_0) = s_0) > 0. We have

  X(t_{n+1}) − X(t_n) = Σ_{k=t_n+1}^{t_{n+1}} Y_k,

so X(t_{n+1}) − X(t_n) is independent of X(t_0), ..., X(t_n). Adopting the notation
from (1), if Pr(X(t_n) = s_n, ..., X(t_0) = s_0) > 0 then

  Pr(X(t_{n+1}) = s_{n+1} | X(t_n) = s_n, ..., X(t_0) = s_0)
    = Pr(X(t_{n+1}) = s_{n+1}, X(t_n) = s_n, ..., X(t_0) = s_0) / Pr(X(t_n) = s_n, ..., X(t_0) = s_0)
    = Pr(X(t_{n+1}) − X(t_n) = s_{n+1} − s_n, X(t_n) = s_n, ..., X(t_0) = s_0) / Pr(X(t_n) = s_n, ..., X(t_0) = s_0)
    = Pr(X(t_{n+1}) − X(t_n) = s_{n+1} − s_n) Pr(X(t_n) = s_n, ..., X(t_0) = s_0) / Pr(X(t_n) = s_n, ..., X(t_0) = s_0)
    = Pr(X(t_{n+1}) − X(t_n) = s_{n+1} − s_n)
    = Pr(X(t_{n+1}) − X(t_n) = s_{n+1} − s_n) Pr(X(t_n) = s_n) / Pr(X(t_n) = s_n)
    = Pr(X(t_{n+1}) − X(t_n) = s_{n+1} − s_n, X(t_n) = s_n) / Pr(X(t_n) = s_n)
    = Pr(X(t_{n+1}) = s_{n+1}, X(t_n) = s_n) / Pr(X(t_n) = s_n)
    = Pr(X(t_{n+1}) = s_{n+1} | X(t_n) = s_n).
If the Y_k all have the same distribution then the random walk is said to be sta-
tionary, meaning that the probability distribution of the increment X(t_{n+1}) − X(t_n)
depends only on t_{n+1} − t_n. Notice that increments of a random walk are
independent if they occur over time intervals that do not overlap. We will have
a lot to say about more general processes with this behaviour.
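The following sketch builds such a walk as a running sum of ±1 steps and looks at
increments over two non-overlapping time intervals; the step distribution, the seed,
and the interval endpoints are illustrative choices of ours.

import random

def random_walk(n_steps, p=0.5, rng=None):
    """X_0 = 0 and X_{n+1} = X_n + Y_{n+1} with Pr(Y = 1) = p, Pr(Y = -1) = 1 - p."""
    rng = rng or random.Random(4)
    x, path = 0, [0]
    for _ in range(n_steps):
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path

rng = random.Random(4)
pairs = []
for _ in range(20_000):
    w = random_walk(30, rng=rng)
    pairs.append((w[10] - w[0], w[30] - w[10]))   # increments over non-overlapping intervals
# If the two increments are independent, E[A B] = E[A] E[B], which is 0 when p = 1/2.
mean_product = sum(a * b for a, b in pairs) / len(pairs)
print(mean_product)   # close to 0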
12.1 Exercise
Suppose that Pr(Y_n = 1) = p ∈ (0, 1) and Pr(Y_n = −1) = 1 − p. Find an
expression for Pr(X_{2n} = 0) for any positive integer n.
12.2 Exercise
Explain why Pr(X_{2n−1} = 0) = 0 for every positive integer n.
12.3 Exercise
Let I_n = 1 if X_n = 0 and I_n = 0 if X_n ≠ 0. Interpret I_2 + I_4 + ··· + I_{2n} in
terms of the number of times our random walk visits 0.
12.4 Exercise
Show that

  lim_{N→∞} E[ Σ_{n=1}^{N} I_{2n} ] < ∞

if p ≠ 1/2.
13 The Poisson process
Suppose that we would like to model the arrival of customers in such a way that
time is measured continuously rather than discretely. If so, we will need that
the time between arrivals has no memory; that is, just because you know how
long it has been since the last arrival, you have no information about how much
longer you will have to wait. In other words, if T is the time for the next arrival
we need Pr(T > s + t | T > t) = Pr(T > s). This is called the memoryless
property and it forces T to have an exponential distribution if the distribution
function of T is to be continuous.

So, suppose that T_1, T_2, ... is a sequence of mutually independent positive
random variables such that for any t ≥ 0,

  Pr(T_n > t) = exp(−t).

For any t ≥ 0, define N_t as follows. Define

  G_0 = 0,
  G_n = G_{n−1} + T_n = T_1 + ··· + T_n     if n ∈ {1, 2, ...}.

We think of G_n as the time of the arrival of the n-th customer. It is known that
for n ≥ 1 and t > 0,

  Pr(G_n > t) = ∫_t^∞ u^{n−1} exp(−u) / (n − 1)! du = Σ_{k=0}^{n−1} (t^k / k!) exp(−t).

With this in mind, for t ≥ 0 we define

  N_t = max{n : G_n ≤ t}.
Since Pr(G_n > 0) = 1 if n > 0, we have Pr(N_0 = 0) = 1. If T_1 > t then
N_t = 0. Furthermore, since G_n is to represent the time of the arrival of the n-th
customer, we should have for any k ∈ {1, 2, ...}

  {N_t ≤ k − 1} = {G_k > t}.

This is indeed the case. Remember that G_0 < G_1 < G_2 < ···. If n is a positive
integer, then

  {N_t = n} = {G_n ≤ t} ∩ {G_{n+1} > t},

so for each positive integer k,

  {N_t ≤ k − 1} = ⋃_{n=0}^{k−1} {N_t = n} = ⋃_{n=0}^{k−1} ({G_n ≤ t} ∩ {G_{n+1} > t}) = {G_k > t}.

This last equality is proven by induction on k. When k = 1, it follows from the
definition of N_t and the G_k that

  {G_0 ≤ t} ∩ {G_1 > t} = {G_1 > t}

since G_0 = 0. Now suppose the equality holds for k = N ≥ 1. Then for
k = N + 1

  ⋃_{n=0}^{N} ({G_n ≤ t} ∩ {G_{n+1} > t})
    = ({G_N ≤ t} ∩ {G_{N+1} > t}) ∪ ⋃_{n=0}^{N−1} ({G_n ≤ t} ∩ {G_{n+1} > t})
    = ({G_N ≤ t} ∩ {G_{N+1} > t}) ∪ {G_N > t}
    = ({G_N ≤ t} ∩ {G_{N+1} > t}) ∪ ({G_N > t} ∩ {G_{N+1} > t})      [since G_N < G_{N+1}]
    = {G_{N+1} > t},

as claimed.

It is technically complicated to show that {N_t, t ≥ 0} has the Markov prop-
erty, and we will approach the Poisson process from a different angle.
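In the meantime, a direct simulation sketch of the construction above (unit-rate
exponential interarrival times T_n, arrival times G_n, counts N_t) shows the counts
matching the mass function Pr(N_t = k) = t^k exp(−t)/k! implied by the formula for
Pr(G_n > t); the time t and sample size are illustrative choices.

import math
import random

rng = random.Random(5)

def n_t(t):
    """Number of arrivals by time t for a unit-rate Poisson process, built from
    independent Exp(1) interarrival times T_1, T_2, ... with G_n = T_1 + ... + T_n."""
    g, count = 0.0, 0
    while True:
        g += rng.expovariate(1.0)
        if g > t:
            return count
        count += 1

t = 3.0
samples = [n_t(t) for _ in range(50_000)]
for k in range(7):
    empirical = sum(1 for s in samples if s == k) / len(samples)
    theoretical = t ** k / math.factorial(k) * math.exp(-t)
    print(k, round(empirical, 4), round(theoretical, 4))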
14 A compound Poisson model for insurance claims
Suppose that Y_1, Y_2, ... and C_1, C_2, ... are mutually independent random vari-
ables. Suppose further that the Y_k's are identically distributed Poisson random
variables with positive expected value and the C_j's are identically distributed
positive integer valued random variables. Put

  N_n = Σ_{k=1}^{n} Y_k,    X_n = Σ_{j=1}^{N_n} C_j.

N_n represents the number of insurance claims filed by the end of the n-th business
day at an insurance company, and C_j is the amount due the insured for the j-th
claim. X_n then represents the amount the insurance company must pay out by
the end of the n-th business day. Both {N_n, n ∈ {0, 1, ...}} and {(X_n, N_n), n ∈
{0, 1, ...}} are Markov chains, but it is not clear whether {X_n, n ∈ {0, 1, ...}}
alone is a Markov chain.
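A simulation sketch of this model is easy to write down. The daily claim rate and
the claim-size distribution (uniform on {1, ..., 5}) below are arbitrary illustrative
choices, not part of the model.

import random

rng = random.Random(6)

def poisson(lam):
    """Sample a Poisson(lam) count by counting how many Exp(1) interarrival times fit in [0, lam]."""
    total, count = 0.0, 0
    while True:
        total += rng.expovariate(1.0)
        if total > lam:
            return count
        count += 1

def simulate_days(n_days, lam=2.0):
    """Return a list of (N_n, X_n): cumulative claim counts and cumulative payouts after each day."""
    claims, payout, history = 0, 0, []
    for _ in range(n_days):
        y = poisson(lam)                                      # Y_k: claims filed on day k
        claims += y                                           # N_n = Y_1 + ... + Y_n
        payout += sum(rng.randint(1, 5) for _ in range(y))    # X_n = C_1 + ... + C_{N_n}
        history.append((claims, payout))
    return history

print(simulate_days(10))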
15 Stationary, Independent Increment Processes
Markov processes can also be approached axiomatically. This can be tricky, as
we have to show that the axioms can actually be satisfied. One important class
of examples are called stationary, independent increments processes. We
hope to exhibit a family of random variables {X_t : t ≥ 0} such that

  • Pr(X_0 = 0) = 1;
  • if 0 ≤ t_0 ≤ t_1 ≤ t_2 ≤ ··· ≤ t_n then the random variables X(t_1) − X(t_0),
    X(t_2) − X(t_1), ..., X(t_n) − X(t_{n−1}) are mutually independent;
  • if 0 ≤ s ≤ t then X(t) − X(s) and X(t − s) have the same distribution
    function.
Some examples are

  Poisson Process: For t > 0 and k ∈ {0, 1, 2, ...},

    Pr(X(t) = k) = (t^k / k!) exp(−t).

  Brownian Motion: For t > 0 and x ∈ (−∞, ∞),

    Pr(X(t) ≤ x) = (1 / √(2πt)) ∫_{−∞}^{x} exp(−u² / (2t)) du.

  If A > 0, B > 0, C is a real number, and X_t is Brownian motion, then

    S(t) = A exp(B X_t + Ct)

  is a simple model of the evolution of the price of a stock and is the basis
  for the Black-Scholes price for a stock option (a simulation sketch of this
  stock model appears after this list).

  Gamma Process: For t > 0 and x > 0,

    Pr(X(t) > x) = (1 / Γ(t)) ∫_x^∞ u^{t−1} exp(−u) du.

  Negative Binomial Process: For t > 0 and k ∈ {0, 1, 2, ...},

    Pr(X(t) = k) = C(t − 1 + k, k) (1 − p)^t p^k,

  where C(a, b) denotes the binomial coefficient.

  Compound Poisson Process: Suppose that {N_t : t ≥ 0} is a Poisson process
  and Y_1, Y_2, ... are mutually independent identically distributed random
  variables which are also independent of the Poisson process. Then

    X(t) = Σ_{k=1}^{N_t} Y_k

  is also a stationary, independent increments process. Such processes are
  used to model insurance claims. The negative binomial process is a special
  case of this (with a suitably chosen rate for the underlying Poisson process)
  where

    Pr(Y_k = n) = −p^n / (n log(1 − p)),   n ∈ {1, 2, ...}.
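Here is the promised sketch of the stock-price model S(t) = A exp(B X_t + Ct),
sampling Brownian motion on a time grid through independent Gaussian increments
with variance equal to the grid spacing. The values of A, B, C and the grid are
illustrative, not calibrated to anything.

import math
import random

rng = random.Random(7)

def stock_path(A=100.0, B=0.2, C=0.05, t_max=1.0, n_steps=250):
    """Sample S(t) = A exp(B X_t + C t) on a time grid, where X_t is Brownian motion
    built from independent N(0, dt) increments (stationary, independent increments)."""
    dt = t_max / n_steps
    x, path = 0.0, [A]
    for k in range(1, n_steps + 1):
        x += rng.gauss(0.0, math.sqrt(dt))     # X(t_k) - X(t_{k-1}) ~ N(0, dt)
        path.append(A * math.exp(B * x + C * k * dt))
    return path

path = stock_path()
print(path[0], path[len(path) // 2], path[-1])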
16 A Markov chain with three states
If we completely understand the following example we will understand the basic
properties of every stationary Markov chain that has a finite state space. Since
the states are just labels, let the state space be {1, 2, 3}.
Given any sequence of states s_0, s_1, ..., the Markov property tells us that

  Pr(X_0 = s_0, X_1 = s_1, ..., X_{n−1} = s_{n−1}, X_n = s_n)
    = Pr(X_0 = s_0) Pr(X_1 = s_1 | X_0 = s_0) ··· Pr(X_n = s_n | X_{n−1} = s_{n−1}),

so the behaviour of this chain is determined by the mass function of X_0 and
the conditional probabilities Pr(X_1 = k | X_0 = j) for j and k in {1, 2, 3}. Let us
suppose for the moment that Pr(X_0 = j) > 0 for every state j.
Let π_n = [π_n(1), π_n(2), π_n(3)] where π_n(j) = Pr(X(n) = j), and let P be the
3 × 3 matrix with P(j, k) = Pr(X_1 = k | X_0 = j).
Note then that

  π_1(k) = Pr(X_1 = k)
    = Σ_{x=1}^{3} Pr(X_1 = k, X_0 = x)
    = Σ_{x=1}^{3} Pr(X_1 = k | X_0 = x) Pr(X_0 = x)
    = Σ_{x=1}^{3} π_0(x) P(x, k),
which we recognize in matrix form as

  [π_1(1), π_1(2), π_1(3)] = [π_0(1), π_0(2), π_0(3)] [ P(1,1)  P(1,2)  P(1,3) ]
                                                      [ P(2,1)  P(2,2)  P(2,3) ]
                                                      [ P(3,1)  P(3,2)  P(3,3) ]
In fact, for any non-negative integer n,

  π_{n+1}(k) = Pr(X_{n+1} = k)
    = Σ_{x=1}^{3} Pr(X_{n+1} = k, X_n = x)
    = Σ_{x=1}^{3} Pr(X_{n+1} = k | X_n = x) Pr(X_n = x)
    = Σ_{x=1}^{3} π_n(x) P(x, k),

so

  [π_{n+1}(1), π_{n+1}(2), π_{n+1}(3)]
    = [π_n(1), π_n(2), π_n(3)] [ P(1,1)  P(1,2)  P(1,3) ]
                               [ P(2,1)  P(2,2)  P(2,3) ]
                               [ P(3,1)  P(3,2)  P(3,3) ]
More succinctly,

  π_{n+1} = π_n P.

By iterating,

  π_n = π_0 P^n.

This shows us that we can understand the long-time behaviour of X_n if we
understand P^n. For the purposes of the rest of this example, we will take

      [ 1/3  1/2  1/6 ]
  P = [ 1/3  1/3  1/3 ]
      [ 1/4  1/2  1/4 ].
P has some interesting properties. The most important of these are:

  • The elements of P are all between 0 and 1.

  • The rows of P each sum to 1:

      P(j, 1) + P(j, 2) + P(j, 3)
        = Pr(X_1 = 1 | X_0 = j) + Pr(X_1 = 2 | X_0 = j) + Pr(X_1 = 3 | X_0 = j)
        = [Pr(X_1 = 1, X_0 = j) + Pr(X_1 = 2, X_0 = j) + Pr(X_1 = 3, X_0 = j)] / Pr(X_0 = j)
        = Pr(X_0 = j) / Pr(X_0 = j)        [since the state space is {1, 2, 3}]
        = 1.

Now, since the rows of P sum to 1, the column vector [1, 1, 1]^t is a right
eigenvector for P with eigenvalue 1. This means that P has a left eigenvector
π for the eigenvalue 1 as well, and π can be normalized so that its entries sum
to 1. By direct calculation we can see that in fact the entries of π are
non-negative as well, so if we choose π_0 = π then π_n = π for all n; that is,
the X_n all have the same distribution!
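The following numerical sketch checks this for the matrix P above: the rows of P^n
settle down quickly, and the left eigenvector for eigenvalue 1, normalized to sum to
1, is non-negative and satisfies πP = π. The use of numpy here is simply a
convenient way to do the direct calculation.

import numpy as np

P = np.array([[1/3, 1/2, 1/6],
              [1/3, 1/3, 1/3],
              [1/4, 1/2, 1/4]])

# The rows of P^n settle down quickly; every row approaches the same vector.
print(np.round(np.linalg.matrix_power(P, 20), 6))

# Left eigenvector of P for the eigenvalue 1, normalized so its entries sum to 1.
eigenvalues, eigenvectors = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigenvalues - 1.0))
pi = np.real(eigenvectors[:, idx])
pi = pi / pi.sum()
print(np.round(pi, 6))
print(np.round(pi @ P, 6))   # equals pi: choosing pi_0 = pi keeps the distribution fixed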
