
Markov Processes

and
Birth-Death Processes

J. M. Akinpelu
Exponential Distribution
Definition. A continuous random variable X has an
exponential distribution with parameter λ > 0 if its
probability density function is given by

f(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0. \end{cases}

Its distribution function is given by

F(x) = \begin{cases} 1 - e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0. \end{cases}
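A quick numerical sketch (not part of the original slides): sampling X by inverse transform and checking the empirical distribution function against F(x) = 1 - e^{-λx}. The rate value and sample size are arbitrary choices.

```python
# Sketch: empirical check of F(x) = 1 - exp(-lam*x) via inverse-transform
# sampling. The rate `lam` is an assumed value for illustration.
import math
import random

lam = 1.5
n = 100_000
samples = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

for x in (0.5, 1.0, 2.0):
    empirical = sum(s <= x for s in samples) / n
    exact = 1.0 - math.exp(-lam * x)
    print(f"F({x}) ~ {empirical:.4f}  (exact {exact:.4f})")
```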
Exponential Distribution
Theorem 1. A continuous R.V. X is exponentially
distributed if and only if for s, t ≥ 0,

P\{X > s + t \mid X > t\} = P\{X > s\}, \qquad (1)

or equivalently,

P\{X > s + t\} = P\{X > s\}\, P\{X > t\}.

A random variable with this property is said to be
memoryless.
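A Monte Carlo sketch of the memoryless property (the values of λ, s, t below are illustrative): the conditional tail P{X > s+t | X > t}, the unconditional tail P{X > s}, and e^{-λs} should all agree.

```python
# Sketch: Monte Carlo check of P{X > s+t | X > t} = P{X > s} for an
# exponential X; lam, s, t are illustrative values, not from the slides.
import math
import random

lam, s, t, n = 2.0, 0.4, 0.7, 200_000
xs = [random.expovariate(lam) for _ in range(n)]

survive_t = [x for x in xs if x > t]
cond = sum(x > s + t for x in survive_t) / len(survive_t)
uncond = sum(x > s for x in xs) / n
print(cond, uncond, math.exp(-lam * s))   # all three should be close
```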
Exponential Distribution
Proof: If X is exponentially distributed, (1) follows readily.
Now assume (1). Define F(x) = P{X ≤ x}, f(x) = F'(x),
and G(x) = P{X > x}. It follows that G'(x) = -f(x). Now
fix x. For h > 0,

G(x + h) = G(x)\, G(h).

Taking the derivative with respect to x, this implies

\frac{dG(x+h)}{dx} = G'(x)\, G(h) = -f(x)\, G(h).
Exponential Distribution
Letting x = 0 and integrating both sides from 0 to t
gives

\frac{dG(h)}{dh} = -f(0)\, G(h), \qquad \int_0^t \frac{dG(h)}{G(h)} = -\int_0^t f(0)\, dh,

so that

\log(G(t)) - \log(G(0)) = -f(0)\, t,

and since G(0) = 1,

G(t) = e^{-f(0)\, t}.

Thus X is exponentially distributed with rate λ = f(0).
Exponential Distribution
Theorem 2. A R.V. X is exponentially distributed
if and only if for h > 0,

P\{X \le h\} = \lambda h + o(h),

P\{X > h\} = 1 - \lambda h + o(h).
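Numerically, (1 - e^{-λh})/h → λ as h → 0, which is a quick way to see the first statement. A sketch with an arbitrary rate:

```python
# Sketch: illustrating P{X <= h} = lam*h + o(h), i.e.
# (1 - exp(-lam*h))/h -> lam as h -> 0. lam is an assumed value.
import math

lam = 3.0
for h in (0.1, 0.01, 0.001, 0.0001):
    print(h, (1.0 - math.exp(-lam * h)) / h)   # approaches lam = 3.0
```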
Exponential Distribution
Proof: Let X be exponentially distributed; then
for h > 0,

P\{X \le h\} = 1 - e^{-\lambda h}
             = 1 - \left(1 + \sum_{n=1}^{\infty} \frac{(-\lambda h)^n}{n!}\right)
             = \lambda h - \sum_{n=2}^{\infty} \frac{(-\lambda h)^n}{n!}
             = \lambda h + o(h).

The converse is left as an exercise.
Exponential Distribution
[Figure: plot of the distribution function F(x) = 1 - e^{-λx}; the slope of F at the origin is the rate λ.]
Markov Process
A continuous time stochastic process {X_t, t ≥ 0}
with state space E is called a Markov process provided
that

P\{X_{s+t} = j \mid X_s = i,\; X_u = x_u,\; 0 \le u < s\} = P\{X_{s+t} = j \mid X_s = i\}

for all states i, j ∈ E and all s, t ≥ 0.
Markov Process
We restrict ourselves to Markov processes for which the
state space E = {0, 1, 2, ...}, and such that the
conditional probabilities

P_{ij}(t) = P\{X_{s+t} = j \mid X_s = i\}

are independent of s. Such a Markov process is called
time-homogeneous.

P_{ij}(t) is called the transition function of the Markov
process X.
Markov Process - Example
Let X be a Markov process with

P(t) = \begin{pmatrix}
r_0(t) & r_1(t) & r_2(t) & \cdots \\
0      & r_0(t) & r_1(t) & \cdots \\
0      & 0      & r_0(t) & \cdots \\
\vdots & \vdots & \vdots & \ddots
\end{pmatrix}

where

P_{ij}(t) = r_{j-i}(t) = \frac{e^{-\lambda t}(\lambda t)^{j-i}}{(j-i)!}, \qquad 0 \le i \le j,

for some λ > 0. X is a Poisson process.
Chapman-Kolmogorov Equations
Theorem 3. For i, j ∈ E and t, s ≥ 0,

P_{ij}(t + s) = \sum_{k \in E} P_{ik}(s)\, P_{kj}(t).
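A numerical sketch (not from the slides) verifying the Chapman-Kolmogorov equations for the Poisson transition function of the previous example. The state space is truncated at N states, so the identity holds up to truncation error in the last rows; λ, s, t, N are illustrative choices.

```python
# Sketch: check P(s+t) = P(s) P(t) for P_ij(t) = e^{-lam t}(lam t)^{j-i}/(j-i)!
import math
import numpy as np

lam, s, t, N = 1.0, 0.3, 0.5, 40

def P(tau):
    M = np.zeros((N, N))
    for i in range(N):
        for j in range(i, N):
            k = j - i
            M[i, j] = math.exp(-lam * tau) * (lam * tau) ** k / math.factorial(k)
    return M

lhs, rhs = P(s + t), P(s) @ P(t)
print(np.abs(lhs - rhs)[:20, :20].max())   # ~0 away from the truncation edge
```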
Realization of a Markov Process
[Figure: a sample path X_t(ω) of a Markov process; T_0, T_1, ..., T_5 mark the successive transition times, and S_0, S_1, ..., S_5 the successive states visited.]
Time Spent in a State
Theorem 4. Let t ≥ 0, let n satisfy T_n ≤ t < T_{n+1}, and let
W_t = T_{n+1} - t. Let i ∈ E, u ≥ 0, and define

G(u) = P\{W_t > u \mid X_t = i\}.

Then

G(u + v) = G(u)\, G(v).

Note: This implies that the distribution of time remaining in a
state is exponentially distributed, regardless of the time
already spent in that state.

[Figure: timeline showing T_n ≤ t < t + u < T_{n+1}, with W_t = T_{n+1} - t.]
Time Spent in a State
Proof: We first note that due to the time homogeneity of X, G(u)
is independent of t. If we fix i, then we have

G(u + v) = P\{W_t > u + v \mid X_t = i\}
         = P\{W_t > u,\; W_{t+u} > v \mid X_t = i\}
         = P\{W_t > u \mid X_t = i\}\, P\{W_{t+u} > v \mid W_t > u,\; X_t = i\}
         = P\{W_t > u \mid X_t = i\}\, P\{W_{t+u} > v \mid X_{t+u} = i\}
         = G(u)\, G(v).
An Alternative Characterization of a
Markov Process
Theorem 5. Let X = {X_t, t ≥ 0} be a Markov process. Let T_0, T_1, ...
be the successive state transition times and let S_0, S_1, ... be
the successive states visited by X. There exists some number ν_i
such that for any non-negative integer n, for any j ∈ E, and t ≥ 0,

P\{S_{n+1} = j,\; T_{n+1} - T_n > t \mid S_0, S_1, \ldots, S_n = i;\; T_0, \ldots, T_n\} = Q(i, j)\, e^{-\nu_i t},

where

Q_{ij} \ge 0, \qquad Q_{ii} = 0, \qquad \sum_{j \in E} Q_{ij} = 1.
An Alternative Characterization of a
Markov Process
This implies that the successive states visited by
a Markov process form a Markov chain with
transition matrix Q.

A Markov process is irreducible recurrent if its
underlying Markov chain is irreducible recurrent.
Kolmogorov Equations
Theorem 6.

P_{ij}'(t) = \nu_i \sum_{k \ne i} Q_{ik}\, P_{kj}(t) - \nu_i P_{ij}(t),

and, under suitable regularity conditions,

P_{ij}'(t) = \sum_{k \ne j} \nu_k Q_{kj}\, P_{ik}(t) - \nu_j P_{ij}(t).

These are Kolmogorov's Backward and Forward
Equations, respectively.
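To make the forward equation concrete, here is a small check (my own sketch, not from the slides) for a two-state process with ν_0 = a, ν_1 = b and Q_01 = Q_10 = 1, whose transition function has the known closed form P_00(t) = b/(a+b) + a/(a+b)·e^{-(a+b)t}. A central finite difference of P_00 is compared with the right side of the forward equation; a, b are assumed values.

```python
# Sketch: forward equation P'_ij = sum_{k != j} v_k Q_kj P_ik - v_j P_ij
# for a two-state process, checked against the closed-form P_00(t).
import math

a, b, t, h = 1.0, 2.0, 0.8, 1e-6

def P00(t):
    return b / (a + b) + a / (a + b) * math.exp(-(a + b) * t)

def P01(t):
    return 1.0 - P00(t)

numeric = (P00(t + h) - P00(t - h)) / (2 * h)   # central difference for P'_00
forward = b * P01(t) - a * P00(t)               # v_1 Q_10 P_01 - v_0 P_00
print(numeric, forward)                         # should agree
```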
Kolmogorov Equations
Proof (Forward Equation): For t, h ≥ 0,

P_{ij}(t + h) = \sum_{k \ne j} P_{ik}(t)\, \nu_k Q_{kj}\, h + P_{ij}(t)\,(1 - \nu_j h) + o(h).

Hence

\frac{P_{ij}(t + h) - P_{ij}(t)}{h} = \sum_{k \ne j} \nu_k Q_{kj}\, P_{ik}(t) - \nu_j P_{ij}(t) + \frac{o(h)}{h}.

Taking the limit as h → 0, we get our result.
Limiting Probabilities
Theorem 7. If a Markov process is irreducible recurrent, then
limiting probabilities

P_j = \lim_{t \to \infty} P_{ij}(t)

exist independent of i, and satisfy

\nu_j P_j = \sum_{k \ne j} \nu_k Q_{kj}\, P_k

for all j. These are referred to as balance equations. Together
with the condition

\sum_j P_j = 1,

they uniquely determine the limiting distribution.
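A sketch of how the balance equations can be solved numerically; the 3-state rates ν and jump matrix Q below are invented for illustration. Writing A = diag(ν)(Q - I), the balance equations say P A = 0, and one of the redundant equations is replaced by the normalization Σ_j P_j = 1.

```python
# Sketch: solve v_j P_j = sum_{k != j} v_k Q_kj P_k with sum_j P_j = 1.
import numpy as np

v = np.array([1.0, 2.0, 3.0])                 # exponential rates v_i (assumed)
Q = np.array([[0.0, 0.5, 0.5],                # jump-chain matrix (assumed)
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])

A = np.diag(v) @ (Q - np.eye(3))              # generator: balance <=> P A = 0
M = np.vstack([A.T[:-1], np.ones(3)])         # drop one equation, add sum = 1
rhs = np.array([0.0, 0.0, 1.0])
P = np.linalg.solve(M, rhs)
print(P, P @ A)                               # P @ A should be ~0
```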
Birth-Death Processes
Definition. A birth-death process {X(t), t ≥ 0} is a Markov
process such that, if the process is in state j, then the only
transitions allowed are to state j + 1 or to state j - 1 (if j > 0).

It follows that there exist non-negative values λ_j and μ_j,
j = 0, 1, 2, ... (called the birth rates and death rates) so that

P\{X_{t+h} = j + 1 \mid X_t = j\} = \lambda_j h + o(h),
P\{X_{t+h} = j - 1 \mid X_t = j\} = \mu_j h + o(h),
P\{X_{t+h} = j \mid X_t = j\} = 1 - (\lambda_j + \mu_j) h + o(h),
P\{X_{t+h} = i \mid X_t = j\} = o(h) \quad \text{if } |i - j| > 1.
Birth and Death Rates


[State-transition diagram: states j-1, j, j+1 in a row; birth rates λ_{j-1}, λ_j point right and death rates μ_j, μ_{j+1} point left.]

Note:
1. The expected time in state j before entering state j+1 is 1/λ_j;
   the expected time in state j before entering state j-1 is 1/μ_j.
2. The rate corresponding to state j is ν_j = λ_j + μ_j.
Differential-Difference Equations
for a Birth-Death Process
It follows that, if P_j(t) = P{X(t) = j}, then

\frac{d}{dt} P_j(t) = \lambda_{j-1} P_{j-1}(t) + \mu_{j+1} P_{j+1}(t) - (\lambda_j + \mu_j) P_j(t), \qquad j \ge 1,

\frac{d}{dt} P_0(t) = \mu_1 P_1(t) - \lambda_0 P_0(t).

Together with the state distribution at time 0, this
completely describes the behavior of the birth-
death process.
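A minimal sketch integrating these equations with Euler steps, assuming constant rates λ_j = λ, μ_j = μ and a truncated state space of N + 1 states (both assumptions for illustration). For λ/μ < 1 the solution should approach the geometric limiting distribution derived later.

```python
# Sketch: Euler integration of the birth-death differential-difference
# equations on a truncated state space {0, ..., N}.
import numpy as np

lam, mu, N = 1.0, 2.0, 30          # illustrative birth/death rates, truncation
dt, T = 1e-3, 20.0

P = np.zeros(N + 1)
P[0] = 1.0                          # start in state 0

def dPdt(P):
    d = np.zeros_like(P)
    for j in range(N + 1):
        inflow = (lam * P[j - 1] if j > 0 else 0.0) + \
                 (mu * P[j + 1] if j < N else 0.0)
        outflow = (lam if j < N else 0.0) + (mu if j > 0 else 0.0)
        d[j] = inflow - outflow * P[j]
    return d

for _ in range(int(T / dt)):
    P += dt * dPdt(P)

rho = lam / mu                      # compare with geometric limit (1-rho)rho^j
print(P[:5], [(1 - rho) * rho**j for j in range(5)])
```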
Birth-Death Processes - Example
Pure birth process with constant birth rate:
λ_j = λ > 0 and μ_j = 0 for all j. Assume that

P_j(0) = \begin{cases} 1 & \text{if } j = 0 \\ 0 & \text{if } j \ne 0. \end{cases}

Then solving the difference-differential equations for this
process gives

P_j(t) = \frac{(\lambda t)^j e^{-\lambda t}}{j!}.
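This can be checked by simulation (a sketch, with illustrative parameter values): the holding time in each state is exponential with rate λ, so we count how many holding times fit into [0, t] and compare with the Poisson formula above.

```python
# Sketch: simulate the pure birth process by summing i.i.d. exponential
# holding times; compare the empirical law of X(t) with (lam t)^j e^{-lam t}/j!.
import math
import random

lam, t, runs = 2.0, 1.5, 100_000
counts = {}
for _ in range(runs):
    j, clock = 0, random.expovariate(lam)
    while clock <= t:               # count births up to time t
        j += 1
        clock += random.expovariate(lam)
    counts[j] = counts.get(j, 0) + 1

for j in range(6):
    exact = (lam * t) ** j * math.exp(-lam * t) / math.factorial(j)
    print(j, counts.get(j, 0) / runs, exact)
```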
Birth-Death Processes - Example
Pure death process with proportional death rate:
λ_j = 0 for all j; μ_j = jμ > 0 for 1 ≤ j ≤ N, μ_j = 0 otherwise;
and

P_j(0) = \begin{cases} 1 & \text{if } j = N \\ 0 & \text{otherwise.} \end{cases}

Then solving the difference-differential equations for this
process gives

P_j(t) = \binom{N}{j} e^{-j\mu t} \left(1 - e^{-\mu t}\right)^{N-j}, \qquad 0 \le j \le N.
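A simulation sketch of this result: since μ_j = jμ, each of the N individuals alive at time 0 dies independently at rate μ, so the number alive at time t is Binomial(N, e^{-μt}), which is exactly the formula above. Parameter values are illustrative.

```python
# Sketch: survivors at time t among N independent rate-mu lifetimes
# should follow Binomial(N, exp(-mu*t)), matching P_j(t) above.
import math
import random

N, mu, t, runs = 10, 0.5, 1.0, 100_000
counts = [0] * (N + 1)
for _ in range(runs):
    alive = sum(random.expovariate(mu) > t for _ in range(N))
    counts[alive] += 1

p = math.exp(-mu * t)
for j in range(N + 1):
    exact = math.comb(N, j) * p**j * (1 - p) ** (N - j)
    print(j, counts[j] / runs, exact)
```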

Limiting Probabilities
Now assume that limiting probabilities P_j exist.
They must satisfy:

0 = \lambda_{j-1} P_{j-1} + \mu_{j+1} P_{j+1} - (\lambda_j + \mu_j) P_j, \quad j \ge 1, \qquad (*)

0 = \mu_1 P_1 - \lambda_0 P_0,

or

(\lambda_j + \mu_j) P_j = \lambda_{j-1} P_{j-1} + \mu_{j+1} P_{j+1}, \quad j \ge 1,

\lambda_0 P_0 = \mu_1 P_1.
Limiting Probabilities
These are the balance equations for a birth-death
process. Together with the condition

\sum_{j=0}^{\infty} P_j = 1,

they uniquely define the limiting probabilities.
Limiting Probabilities
From (*), one can prove by induction that

P_j = P_0 \prod_{i=0}^{j-1} \frac{\lambda_i}{\mu_{i+1}}, \qquad j = 0, 1, 2, \ldots

When Do Limiting Probabilities Exist?
Define

S = 1 + \sum_{j=1}^{\infty} \prod_{i=0}^{j-1} \frac{\lambda_i}{\mu_{i+1}}.

It is easy to show that

P_0 = S^{-1}

if S < ∞. (This is equivalent to the condition P_0 > 0.)
Furthermore, all of the states are positive recurrent, i.e.,
ergodic. If S = ∞, then either all of the states are
null recurrent or all of the states are transient, and
limiting probabilities do not exist.
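A small sketch computing S and the limiting probabilities for the constant-rate case λ_i = λ, μ_i = μ (an M/M/1-type chain, chosen for illustration): the products become ρ^j with ρ = λ/μ, so S = 1/(1 - ρ) and P_j = (1 - ρ)ρ^j when ρ < 1.

```python
# Sketch: S and P_j from the product formula with constant rates lam, mu.
lam, mu = 1.0, 2.0                        # illustrative rates, rho = 0.5
terms = 200                               # truncate the infinite sum

prods = [1.0]
for j in range(1, terms):
    prods.append(prods[-1] * lam / mu)    # prod_{i=0}^{j-1} lam_i / mu_{i+1}

S = sum(prods)
P = [p / S for p in prods]                # P_j = P_0 * prod, with P_0 = 1/S
print(S, P[:5])                           # S ~ 2, P ~ [0.5, 0.25, 0.125, ...]
```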
Flow Balance Method
Draw a closed boundary around state j:

[Diagram: states j-1, j, j+1 in a row; birth-rate arrows λ_{j-1} and λ_j point right, death-rate arrows μ_j and μ_{j+1} point left; a closed boundary encircles state j.]

Setting flow in = flow out across the boundary gives the
global balance equation:

(\lambda_j + \mu_j) P_j = \lambda_{j-1} P_{j-1} + \mu_{j+1} P_{j+1}.
Flow Balance Method
Draw a closed boundary between state j and state j-1:

[Diagram: the same chain of states, with the boundary now cutting the arcs between states j-1 and j.]

Setting flow in = flow out across the boundary gives the
detailed balance equation:

\lambda_{j-1} P_{j-1} = \mu_j P_j.
Example
Machine repair problem. Suppose there are m machines
serviced by one repairman. Each machine runs without
failure, independently of all others, for an exponential time
with mean 1/λ. When it fails, it waits until the
repairman can come to repair it, and the repair itself
takes an exponentially distributed amount of time with
mean 1/μ. Once repaired, the machine is as good as
new.

What is the probability that j machines are failed?
Let P_j be the steady-state probability of j failed
machines.
Example
[State-transition diagram: around state j the birth rates are λ_{j-1} = (m-j+1)λ and λ_j = (m-j)λ, and the death rates are μ_j = μ_{j+1} = μ.]

The balance equations are

[(m-j)\lambda + \mu]\, P_j = (m-j+1)\lambda\, P_{j-1} + \mu P_{j+1}, \qquad 1 \le j \le m-1,

m\lambda P_0 = \mu P_1,

\lambda P_{m-1} = \mu P_m.
Example
Applying the product formula for the limiting probabilities:

P_j = P_0 \prod_{i=0}^{j-1} \frac{\lambda_i}{\mu_{i+1}}
    = P_0 \prod_{i=0}^{j-1} \frac{(m-i)\lambda}{\mu}
    = P_0\, m(m-1)\cdots(m-j+1)\,(\lambda/\mu)^j,

P_0 = \left[1 + \sum_{j=1}^{m} m(m-1)\cdots(m-j+1)\,(\lambda/\mu)^j\right]^{-1}.
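A sketch evaluating this solution numerically and checking it against the detailed balance equations; the values of m, λ, and μ are illustrative.

```python
# Sketch: machine-repair steady state via the product formula, with a
# detailed-balance check (m-j+1) lam P_{j-1} = mu P_j for j = 1..m.
import numpy as np

m, lam, mu = 5, 0.2, 1.0

prods = [1.0]
for j in range(1, m + 1):                 # prod_{i=0}^{j-1} (m-i) lam / mu
    prods.append(prods[-1] * (m - j + 1) * lam / mu)

P = np.array(prods) / sum(prods)          # P_j, j = 0..m

for j in range(1, m + 1):
    assert abs((m - j + 1) * lam * P[j - 1] - mu * P[j]) < 1e-12
print(P)
```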

Example
How would this example change if there were m
(or more) repairmen?
Homework
No homework this week due to test next week.
References
1. Erhan Cinlar, Introduction to Stochastic Processes,
Prentice-Hall, Inc., 1975.

2. Leonard Kleinrock, Queueing Systems, Volume I:
Theory, John Wiley & Sons, 1975.

3. Sheldon M. Ross, Introduction to Probability
Models, Ninth Edition, Elsevier Inc., 2007.
