
4. Poisson Processes

4.1 Definition
4.2 Derivation of exponential distribution
4.3 Properties of exponential distribution
    a. Normalized spacings
    b. Campbell's Theorem
    c. Minimum of several exponential random variables
    d. Relation to Erlang and Gamma Distribution
    e. Guarantee Time
    f. Random Sums of Exponential Random Variables
4.4 Counting processes and the Poisson distribution
4.5 Superposition of Counting Processes
4.6 Splitting of Poisson Processes
4.7 Non-homogeneous Poisson Processes
4.8 Compound Poisson Processes
4 Poisson Processes

4.1 Definition

Consider a series of events occurring over time:

    0 ---X------X--X--------X---> Time

Define T_i as the time between the (i-1)st and i-th event. Then

    S_n = T_1 + T_2 + ⋯ + T_n = time to the n-th event.

Define N(t) = number of events in (0, t]. Then

    P{S_n > t} = P{N(t) < n}

If the time to the n-th event exceeds t, then the number of events in (0, t]
must be less than n.
Hence

    p_t(n) = P{N(t) = n} = P{N(t) < n + 1} - P{N(t) < n}
           = P{S_{n+1} > t} - P{S_n > t}

where S_n = T_1 + T_2 + ⋯ + T_n.

Define Q_{n+1}(t) = P{S_{n+1} > t} and Q_n(t) = P{S_n > t}. Then we can write

    p_t(n) = Q_{n+1}(t) - Q_n(t)

and, taking Laplace transforms,

    p*_s(n) = Q*_{n+1}(s) - Q*_n(s)
If q_{n+1}(t) and q_n(t) are the respective pdfs,

    Q*_{n+1}(s) = [1 - q*_{n+1}(s)]/s,   Q*_n(s) = [1 - q*_n(s)]/s

and

    p*_s(n) = [1 - q*_{n+1}(s)]/s - [1 - q*_n(s)]/s = [q*_n(s) - q*_{n+1}(s)]/s

Recall T_1 is the time between 0 and the first event, T_2 is the time between
the first and second event, etc.
Assume {T_i}, i = 1, 2, . . ., are independent and, with the exception of
i = 1, identically distributed with pdf q(t). Also assume T_1 has pdf
q_1(t). Then

    q*_{n+1}(s) = q*_1(s)[q*(s)]^n,   q*_n(s) = q*_1(s)[q*(s)]^{n-1}

and

    p*_s(n) = [q*_n(s) - q*_{n+1}(s)]/s = q*_1(s) q*(s)^{n-1} [1 - q*(s)]/s

Note that q_1(t) is a forward recurrence time. Hence

    q_1(t) = Q(t)/m   and   q*_1(s) = [1 - q*(s)]/(sm)

so that

    p*_s(n) = [q*(s)^{n-1}/m] { [1 - q*(s)]/s }²
Assume q(t) = λe^{-λt} for t > 0 (m = 1/λ) and q(t) = 0 otherwise.
Then q*(s) = λ/(λ + s) and [1 - q*(s)]/s = 1/(λ + s), so

    p*_s(n) = λ [λ/(λ + s)]^{n-1} [1/(λ + s)]² = (1/λ) [λ/(λ + s)]^{n+1}.

However [λ/(λ + s)]^{n+1} is the Laplace transform of a gamma distribution
with parameters (λ, n + 1), i.e.

    f(t) = λe^{-λt}(λt)^{(n+1)-1}/Γ(n + 1)   for t > 0
∴ p_t(n) = L^{-1}{p*_s(n)} = e^{-λt}(λt)^n/n!

which is the Poisson distribution. Hence N(t) follows a Poisson
distribution and

    P{N(t) < n} = Σ_{r=0}^{n-1} p_t(r) = P{S_n > t}

    P{S_n > t} = Σ_{r=0}^{n-1} e^{-λt}(λt)^r/r!.

We have shown that if the times between events are iid following an
exponential distribution, then N(t) is Poisson with E[N(t)] = λt.
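This is easy to check by simulation. The sketch below (Python with NumPy; the
choices λ = 2, t = 5, and the seed are arbitrary) draws iid exponential
inter-event times, counts events in (0, t], and compares the empirical mean
and variance of N(t) with λt = 10.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, reps = 2.0, 5.0, 100_000

counts = np.empty(reps, dtype=int)
for i in range(reps):
    # Draw more gaps than could plausibly be needed, then count
    # how many partial sums S_n fall in (0, t].
    gaps = rng.exponential(1 / lam, size=int(5 * lam * t) + 50)
    counts[i] = np.searchsorted(np.cumsum(gaps), t)

# For a Poisson(lam * t) variable, mean and variance both equal lam * t.
print(counts.mean(), counts.var(), lam * t)  # all approximately 10
```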
Alternatively, if N(t) follows a Poisson distribution, then S_n has a
gamma distribution with pdf f(t) = λe^{-λt}(λt)^{n-1}/Γ(n) for t > 0.
This implies the times between events are exponential.

Since P{S_n > t} = P{N(t) < n} we have proved the identity

    P{S_n > t} = ∫_t^∞ λe^{-λx}(λx)^{n-1}/Γ(n) dx = Σ_{r=0}^{n-1} e^{-λt}(λt)^r/r!.

This identity is usually proved by using integration by parts.
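The identity can also be checked numerically; a minimal sketch using
scipy.stats, with arbitrary values λ = 1.5, n = 4, t = 2:

```python
from scipy import stats

lam, n, t = 1.5, 4, 2.0

# Left side: P{S_n > t} where S_n ~ Gamma(shape=n, rate=lam).
lhs = stats.gamma.sf(t, a=n, scale=1 / lam)

# Right side: P{N(t) < n} where N(t) ~ Poisson(lam * t).
rhs = stats.poisson.cdf(n - 1, mu=lam * t)

print(lhs, rhs)  # both approximately 0.6472
```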
When N(t) follows a Poisson distribution with E[N(t)] = λt, the
set {N(t), t > 0} is called a Poisson Process.
4.2 Derivation of Exponential Distribution

Define P_n(h) = probability of n events in a time interval of length h.
Assume

    P_0(h) = 1 - λh + o(h);   P_1(h) = λh + o(h);   P_n(h) = o(h) for n > 1

where o(h) means a term ψ(h) such that lim_{h→0} ψ(h)/h = 0. Consider a
finite time interval (0, t). Divide the interval into n sub-intervals of
length h; then t = nh:

    |--h--|--h--|--h--| ⋯ |--h--|
    0                        t = nh

The probability of no events in (0, t) is equivalent to no events in each
sub-interval; i.e.

    P_n{T > t} = P{no events in (0, t)},   T = time to 1st event
Suppose the probabilities of events in the sub-intervals are independent of
each other. (Assumption of independent increments.) Then

    P_n{T > t} = [1 - λh + o(h)]^n = [1 - λt/n + o(h)]^n
               = (1 - λt/n)^n + n·o(h)·(1 - λt/n)^{n-1} + ⋯

Since

    lim_{n→∞} (1 - λt/n)^n = e^{-λt}

and

    lim_{n→∞} n·o(h) = lim_{h→0} (t/h)·o(h) = 0,

we have P{T > t} = lim_{h→0} P_n{T > t} = e^{-λt}.

∴ The pdf of T is -(d/dt)P{T > t} = λe^{-λt}. (Exponential Distribution)
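The limiting argument can be illustrated by simulation. In the sketch below
(λ = 2 and h = 0.001 are arbitrary), each sub-interval of width h
independently holds an event with probability λh, so the index of the first
occupied cell is geometric and the waiting time is approximately exponential:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, h, reps = 2.0, 0.001, 100_000

# Each sub-interval of width h holds an event with probability lam * h.
# The first event falls in cell K ~ Geometric(lam * h); the waiting
# time is approximately K * h.
k = rng.geometric(lam * h, size=reps)
waits = k * h

print(waits.mean(), 1 / lam)               # both approximately 0.5
print(np.mean(waits > 1.0), np.exp(-lam))  # both approximately 0.135
```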
4.3 Properties of Exponential Distribution

    q(t) = λe^{-λt},  t > 0
    E(T) = 1/λ = m,   V(T) = 1/λ² = m²
    q*(s) = λ/(λ + s)

Consider r < t. Then the conditional distribution is

    P{T > r + t | T > r} = Q(r + t)/Q(r) = e^{-λ(r+t)}/e^{-λr} = e^{-λt}

i.e. P{T > r + t | T > r} = P{T > t} for all r and t.

Also P{T > r + t} = e^{-λ(r+t)}, so Q(r + t) = Q(r)Q(t).

The exponential distribution is the only distribution satisfying
Q(r + t) = Q(r)Q(t).
Proof:

    Q(2/n) = Q(1/n)²   and in general   Q(m/n) = Q(1/n)^m

    Q(1) = Q(1/n + 1/n + ⋯ + 1/n) = [Q(1/n)]^n   (the case m = n)

∴ Q(m/n) = [Q(1/n)^n]^{m/n} = Q(1)^{m/n}.

If Q(·) is continuous, or left or right continuous, we can write

    Q(t) = Q(1)^t.

Since Q(1)^t = e^{t log Q(1)}, the quantity -log Q(1) is the rate
parameter. Hence

    Q(t) = e^{-λt}   where λ = -log Q(1).
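The memoryless property is easy to verify by simulation (a sketch; λ = 1.5,
r = 1.0, t = 0.8 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
lam, r, t = 1.5, 1.0, 0.8
T = rng.exponential(1 / lam, size=1_000_000)

survivors = T[T > r]
cond = np.mean(survivors > r + t)   # P{T > r + t | T > r}
uncond = np.mean(T > t)             # P{T > t}
print(cond, uncond, np.exp(-lam * t))  # all approximately 0.301
```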
a. Normalized Spacings

Let {T_i}, i = 1, 2, . . . , n, be iid following an exponential distribution
with E(T_i) = 1/λ.

Define the order statistics T_(1) ≤ T_(2) ≤ ⋯ ≤ T_(n).

Then the joint distribution of the order statistics is

    f(t_(1), t_(2), . . . , t_(n)) dt_(1) ⋯ dt_(n)
        = P{t_(1) < T_(1) ≤ t_(1) + dt_(1), . . . }
        = (n!/(1! 1! ⋯ 1!)) λe^{-λt_(1)} · λe^{-λt_(2)} ⋯ λe^{-λt_(n)} dt_(1) ⋯ dt_(n)

so

    f(t_(1), . . . , t_(n)) = n! λ^n e^{-λ Σ_1^n t_(i)} = n! λ^n e^{-λS}

where S = Σ_1^n t_(i) = Σ_1^n t_i and 0 ≤ t_(1) ≤ ⋯ ≤ t_(n).
    f(t_(1), . . . , t_(n)) = n! λ^n e^{-λS},   0 ≤ t_(1) ≤ ⋯ ≤ t_(n),   S = Σ_1^n t_(i)

Consider

    Z_1 = nT_(1),   Z_2 = (n - 1)(T_(2) - T_(1)),   . . . ,
    Z_i = (n - i + 1)(T_(i) - T_(i-1)),   . . . ,   Z_n = T_(n) - T_(n-1).

We shall show that the {Z_i} are iid exponential.

    f(z_1, z_2, . . . , z_n) = f(t_(1), . . . , t_(n)) |∂(t_(1), . . . , t_(n))/∂(z_1, . . . , z_n)|

where |∂(t_(1), . . . , t_(n))/∂(z_1, . . . , z_n)| is the determinant of the Jacobian.
We shall find the Jacobian by making use of the relation

    |∂(t_(1), . . . , t_(n))/∂(z_1, . . . , z_n)| = |∂(z_1, . . . , z_n)/∂(t_(1), . . . , t_(n))|^{-1}

Since Z_i = (n - i + 1)(T_(i) - T_(i-1)) with T_(0) = 0,

    ∂Z_i/∂T_(j) = n - i + 1      if j = i
                = -(n - i + 1)   if j = i - 1
                = 0              otherwise
    ∂(z_1, . . . , z_n)/∂(t_(1), . . . , t_(n)) =

        |    n        0        0      0  ⋯   0 |
        | -(n-1)    (n-1)      0      0  ⋯   0 |
        |    0     -(n-2)    (n-2)    0  ⋯   0 |
        |    ⋮                                 |
        |    0        0        ⋯        -1   1 |
Note: The determinant of a triangular matrix is the product of the main
diagonal terms.

∴ |∂(z_1, . . . , z_n)/∂(t_(1), . . . , t_(n))| = n(n - 1)(n - 2) ⋯ 2·1 = n!

and

    f(z_1, z_2, . . . , z_n) = n! λ^n e^{-λS} · (1/n!) = λ^n e^{-λ Σ_1^n z_i} = λ^n e^{-λS}

as S = Σ_{i=1}^n t_(i) = z_1 + ⋯ + z_n.

The spacings Z_i = (n - i + 1)(T_(i) - T_(i-1)) are sometimes called
normalized spacings.
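A simulation sketch (n = 5 and λ = 2 are arbitrary) that sorts iid
exponential samples, forms the normalized spacings, and checks that the Z_i
are uncorrelated with common mean 1/λ:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 2.0, 5, 200_000

T = np.sort(rng.exponential(1 / lam, size=(reps, n)), axis=1)
gaps = np.diff(T, axis=1, prepend=0.0)   # T_(i) - T_(i-1), with T_(0) = 0
Z = gaps * np.arange(n, 0, -1)           # Z_i = (n - i + 1) * gap_i

print(Z.mean(axis=0))    # each column approximately 1/lam = 0.5
print(np.corrcoef(Z.T))  # off-diagonal entries approximately 0
```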
Homework:

1. Suppose there are n observations which are iid
exponential (E(T_i) = 1/λ). However there are r non-censored observations
and (n - r) censored observations, all censored at t_(r).
Show Z_i = (n - i + 1)(T_(i) - T_(i-1)) for i = 1, 2, . . . , r are iid
exponential.

2. Show that

    T_(i) = Z_1/n + Z_2/(n - 1) + ⋯ + Z_i/(n - i + 1)

and prove

    E(T_(i)) = (1/λ) Σ_{j=1}^{i} 1/(n - j + 1)

Find the variances and covariances of the {T_(i)}.
b. Campbell's Theorem

Let {N(t), t > 0} be a Poisson Process. Assume n events occur in the
interval (0, t]. Note that N(t) = n is the realization of a random variable
and has probability P{N(t) = n} = e^{-λt}(λt)^n/n!.

Define W_n = waiting time for the n-th event.

If {T_i}, i = 1, 2, . . . , n, are the random variables representing the
times between events,

    f(t_1, . . . , t_n) = Π_1^n λe^{-λt_i} = λ^n e^{-λ Σ_1^n t_i}

But Σ_1^n t_i = W_n, hence

    f(t_1, . . . , t_n) = λ^n e^{-λW_n}
Now consider the transformation

    W_1 = t_1,   W_2 = t_1 + t_2,   . . . ,   W_n = t_1 + t_2 + ⋯ + t_n

The distribution of W = (W_1, W_2, . . . , W_n) is

    f(W) = f(t) |∂(t)/∂(W)|

where |∂(t)/∂(W)| is the determinant of the Jacobian.
Note:

    ∂(W)/∂(t) =

        | 1 0 0 0 ⋯ 0 |
        | 1 1 0 0 ⋯ 0 |
        | 1 1 1 0 ⋯ 0 |
        | 1 1 1 1 ⋯ 0 |
        | ⋮           |
        | 1 1 1 1 ⋯ 1 |

and

    |∂(t)/∂(W)| = |∂(W)/∂(t)|^{-1} = 1

∴ f(w_1, . . . , w_n) = λ^n e^{-λw_n},   0 < w_1 ≤ ⋯ ≤ w_n < t.

But there are no events in the interval (w_n, t]. This carries probability
e^{-λ(t-w_n)}. Hence the joint distribution of the W is

    f(W) = λ^n e^{-λw_n} e^{-λ(t-w_n)} = λ^n e^{-λt}
    f(W) = λ^n e^{-λt},   0 ≤ w_1 ≤ w_2 ≤ ⋯ ≤ w_n < t

Consider

    f(W | N(t) = n) = λ^n e^{-λt} / [e^{-λt}(λt)^n/n!] = n!/t^n.

This is the joint distribution of the order statistics from a uniform
(0, t) distribution; i.e., f(x) = 1/t, 0 < x ≤ t.

Hence

    E(W_i | N(t) = n) = it/(n + 1),   i = 1, 2, . . . , n

We can consider the unordered waiting times, conditional on N(t) = n,
as following a uniform (0, t) distribution.
Since w_1 = t_1, w_2 = t_1 + t_2, . . . , w_n = t_1 + t_2 + ⋯ + t_n,

    t_i = w_i - w_{i-1}   (w_0 = 0)

The differences between the waiting times are the original times t_i. These
times follow the distribution conditional on N(t) = n; i.e.

    f(t_1, . . . , t_n | N(t) = n) = n!/t^n

Note that if f(t_i) = 1/t for 0 < t_i < t, the joint distribution of n
independent uniform (0, t) random variables (i = 1, 2, . . . , n) is
f(t) = 1/t^n. If 0 < t_(1) ≤ t_(2) ≤ ⋯ ≤ t_(n) < t, the distribution of the
order statistics is

    f(t_(1), . . . , t_(n)) = n!/t^n

which is the same as f(t_1, . . . , t_n | N(t) = n).
c. Minimum of Several Exponential Random Variables

Let T_i (i = 1, . . . , n) be independent exponential r.v.s with parameters
λ_i, and let

    T = min(T_1, . . . , T_n)

    P{T > t} = P{T_1 > t, T_2 > t, . . . , T_n > t} = Π_{i=1}^n P{T_i > t}
             = Π_{i=1}^n e^{-λ_i t} = e^{-λt},   λ = Σ_1^n λ_i

T is exponential with parameter λ = Σ_{i=1}^n λ_i.

If all λ_i = λ_0, then λ = nλ_0 and P{T > t} = e^{-nλ_0 t}.
Define N as the index of the random variable which is the smallest failure
time. For example, if T_r ≤ T_i for all i, then N = r.

Consider P{T > t, T_r ≤ T_i for all i} = P{N = r, T > t}:

    P{N = r, T > t} = P{T > t, T_i ≥ T_r, i ≠ r}

        = ∫_t^∞ P{T_i > t_r, i ≠ r | t_r} f(t_r) dt_r

        = ∫_t^∞ e^{-(λ-λ_r)t_r} λ_r e^{-λ_r t_r} dt_r

        = λ_r ∫_t^∞ e^{-λt_r} dt_r = (λ_r/λ) e^{-λt}
    P{N = r, T > t} = (λ_r/λ) e^{-λt}

    P{N = r, T > 0} = P{N = r} = λ_r/λ,   λ = Σ_1^n λ_i

    P{N = r, T > t} = P{N = r} P{T > t}

so N (the index of the smallest) and T are independent.

If all λ_i = λ_0,

    P{N = r} = λ_0/(nλ_0) = 1/n

(All populations have the same probability of being the smallest.)
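A sketch checking both results at once (the rates (0.5, 1.0, 2.5) are
arbitrary): the minimum is exponential with rate λ = Σλ_i, P{N = r} = λ_r/λ,
and the index is independent of the minimum:

```python
import numpy as np

rng = np.random.default_rng(5)
rates = np.array([0.5, 1.0, 2.5])
lam = rates.sum()
reps = 200_000

T = rng.exponential(1 / rates, size=(reps, len(rates)))
tmin = T.min(axis=1)
idx = T.argmin(axis=1)

print(tmin.mean(), 1 / lam)                  # both approximately 0.25
print(np.bincount(idx) / reps, rates / lam)  # approximately [0.125, 0.25, 0.625]

# Independence: P{N = r | T > c} should not depend on c.
c = 0.5
print(np.bincount(idx[tmin > c]) / (tmin > c).sum())
```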
d. Relation to Erlang and Gamma Distribution

Consider T = T_1 + ⋯ + T_n. Since q*_i(s) = λ_i/(λ_i + s),

    q*_T(s) = Π_{i=1}^n λ_i/(λ_i + s)

which is the Laplace transform of the Erlang distribution. If the λ_i are
all distinct,

    q(t) = Σ_{i=1}^n A_i λ_i e^{-λ_i t},   A_i = Π_{j≠i} λ_j/(λ_j - λ_i)

If λ_i = λ for all i,

    q*_T(s) = [λ/(λ + s)]^n,   q(t) = λ(λt)^{n-1}e^{-λt}/Γ(n)   (Gamma Distribution)
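A sketch comparing the partial-fraction density for distinct rates against
simulation (the rates (1.0, 2.0, 3.5) are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
rates = np.array([1.0, 2.0, 3.5])
reps = 500_000

# Simulate T = T_1 + ... + T_n with distinct rates.
T = rng.exponential(1 / rates, size=(reps, len(rates))).sum(axis=1)

def q(t):
    # q(t) = sum_i A_i * lam_i * exp(-lam_i * t),
    # A_i = prod_{j != i} lam_j / (lam_j - lam_i).
    total = 0.0
    for i, li in enumerate(rates):
        others = np.delete(rates, i)
        A = np.prod(others / (others - li))
        total += A * li * np.exp(-li * t)
    return total

hist, edges = np.histogram(T, bins=60, range=(0, 10), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - q(mids))))  # small (sampling error only)
```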
e. Guarantee Time

Consider the r.v. following the distribution having pdf

    q(t) = λe^{-λ(t-G)}   for t > G
         = 0              for t ≤ G

The parameter G is called a guarantee time.

If the transformation Y = T - G is made, then f(y) = λe^{-λy} for y > 0.

∴ E(Y) = 1/λ, V(Y) = 1/λ², . . .

Since T = Y + G, E(T) = 1/λ + G, and the central moments of T and Y are the
same.
f. Random Sums of Exponential Random Variables

Let {T_i}, i = 1, 2, . . . , N, be iid with f(t) = λe^{-λt} and consider

    S_N = T_1 + T_2 + ⋯ + T_N   with P{N = n} = p_n.

The Laplace transform of S_N for fixed N = n is [λ/(λ + s)]^n. Hence

    f*(s) = E[λ/(λ + s)]^N

resulting in a pdf which is a mixture of gamma distributions:

    f(t) = Σ_{n=1}^∞ [λ(λt)^{n-1}e^{-λt}/Γ(n)] p_n
Suppose p_n = p^{n-1}q, n = 1, 2, . . . (a geometric distribution). Then

    f*(s) = Σ_{n=1}^∞ [λ/(λ + s)]^n p^{n-1}q = Σ_{n=1}^∞ (q/p)[pλ/(λ + s)]^n

          = (q/p) · [pλ/(λ + s)] / [1 - pλ/(λ + s)]

          = (q/p) · pλ/[s + λ(1 - p)] = λq/(s + λq)

Since f*(s) = λq/(s + λq), the random sum

    S_N = T_1 + T_2 + ⋯ + T_N,   P{N = n} = p^{n-1}q,

has an exponential distribution with parameter (λq).
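A sketch verifying the geometric random sum result (λ = 3 and q = 0.25 are
arbitrary): S_N should be exponential with rate λq = 0.75:

```python
import numpy as np

rng = np.random.default_rng(7)
lam, q, reps = 3.0, 0.25, 100_000

N = rng.geometric(q, size=reps)   # P{N = n} = (1 - q)^(n-1) q
S = np.array([rng.exponential(1 / lam, n).sum() for n in N])

print(S.mean(), 1 / (lam * q))                   # both approximately 1.333
print(np.mean(S > 2.0), np.exp(-lam * q * 2.0))  # both approximately 0.223
```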
4.4 Counting Processes and the Poisson Distribution

Definition: A stochastic process {N(t), t ≥ 0} is said to be a counting
process when N(t) denotes the number of events that have occurred in
the interval (0, t]. It has the properties:

(i) N(t) is integer-valued
(ii) N(t) ≥ 0
(iii) If s < t, then N(s) ≤ N(t) and N(t) - N(s) = number of events
occurring in (s, t].
A counting process has independent increments if the numbers of events in
disjoint intervals are independent; i.e. N(s) and N(t) - N(s) are
independent random variables.

A counting process has stationary increments if the probability of the
number of events in any interval depends only on the length of the
interval; i.e.

    N(t) and N(s + t) - N(s)

have the same probability distribution for all s. A Poisson process
is a counting process having independent and stationary
increments.
TH. Assume {N(t), t ≥ 0} is a Poisson Process. Then the distribution of
N_s(t) = N(s + t) - N(s) is independent of s and depends only on the
length of the interval, i.e.

    P{N(t + s) - N(s) | N(s)} = P{N(t + s) - N(s)}

for all s. This implies that knowledge of N(u) for 0 < u ≤ s is also
irrelevant:

    P{N(t + s) - N(s) | N(u), 0 < u ≤ s} = P{N(t + s) - N(s)}.

This feature defines a stationary process.
TH. A Poisson Process has independent increments.

Consider 0 ≤ t_1 < t_2 < t_3 < t_4:

    0 ----X------X------X------X----> Time
         t_1    t_2    t_3    t_4

Consider events in (t_3, t_4], i.e. N(t_4) - N(t_3):

    P{N(t_4) - N(t_3) | N(u), 0 < u ≤ t_3} = P{N(t_4) - N(t_3)}.

The distribution is independent of what happened prior to t_3. Hence if the
intervals (t_1, t_2] and (t_3, t_4] are non-overlapping,

    N(t_2) - N(t_1) and N(t_4) - N(t_3) are independent.
TH. Cov(N(t), N(s + t)) = λt (Poisson Process)

Proof: N(s + t) - N(t) is independent of N(t), so

    Cov(N(s + t) - N(t), N(t)) = 0
    ⇒ Cov(N(s + t), N(t)) - V(N(t)) = 0
    ∴ Cov(N(s + t), N(t)) = V(N(t)) = λt

as the variance of N(t) is λt.

An alternative statement of the theorem is Cov(N(s), N(t)) = λ min(s, t).
TH. A counting process {N(t), t ≥ 0} is a Poisson Process if and only if

(i) it has stationary and independent increments;
(ii) N(0) = 0 and

    P{N(h) = 0} = 1 - λh + o(h)
    P{N(h) = 1} = λh + o(h)
    P{N(h) = j} = o(h),   j > 1

Note: The notation o(h), "little o of h", refers to some function ψ(h) for
which lim_{h→0} ψ(h)/h = 0.

Divide the interval (0, t] into n sub-intervals of length h, i.e. nh = t.
By stationarity,

    P{N(kh) - N((k - 1)h) = j} = P{N(h) = j}
Let T = time to the first event beginning at t = 0.

    P{T > t} = P{N(t) = 0} = P{no events in each sub-interval}

    P{N(t) = 0} = P{T > t} = [1 - λh + o(h)]^n

        = (1 - λh)^n + n(1 - λh)^{n-1} o(h) + o(h²)

        = (1 - λh)^n {1 + n·o(h)/(1 - λh) + ⋯ }

        = (1 - λt/n)^n {1 + [t/(1 - λt/n)]·[o(h)/h] + ⋯ }

        → e^{-λt}   as n → ∞, h → 0

    P{T > t} = e^{-λt}

Hence T is exponential; i.e. the time between events is exponential, and
{N(t), t ≥ 0} is a Poisson Process.
4.5 Superposition of Counting Processes

Suppose there are k counting processes which merge into a single
counting process; e.g. k = 3:

    Process 1:      ---X---------X--------X---
    Process 2:      -------X--------X---------
    Process 3:      -----X------X-------------
    Merged Process: ---X-X-X----X-X-X-----X---

The merged process is called the superposition of the individual counting
processes:

    N(t) = N_1(t) + N_2(t) + ⋯ + N_k(t)
A. Superposition of Poisson Processes

    N(t) = N_1(t) + ⋯ + N_k(t)

Suppose {N_i(t), t ≥ 0}, i = 1, 2, . . . , k, are independent Poisson
Processes with E[N_i(t)] = λ_i t.

Note that each of the counting processes has stationary and independent
increments; hence {N(t), t ≥ 0} has stationary and independent increments.
Also N(t) is Poisson with parameter

    E(N(t)) = Σ_{i=1}^k λ_i t = λt,   λ = Σ_{i=1}^k λ_i

∴ N(t) is a Poisson Process.
B. General Case of Merged Process

Consider the merged process from k individual processes:

    Merged Process: --X-X--X-X-X--X--|<-- V_k -->|--X----

The random variable V_k is the forward recurrence time of the merged
process. We will show that as k → ∞, the asymptotic distribution of V_k is
exponential, and hence the merged process is asymptotically a Poisson
Process.

Assume that for each of the processes:
 - the process is stationary,
 - multiple occurrences have 0 probability,
 - the pdf of the time between events is q(t).
If q(t) is the pdf of the time between events for a single process, then
each process has the same forward recurrence time distribution with pdf

    q_f(x) = Q(x)/m

With k independent processes there will be k forward recurrence time
random variables T_f(1), T_f(2), . . . , T_f(k), one per process, and

    V_k = min(T_f(1), T_f(2), . . . , T_f(k))

    P{V_k > v} = G_k(v) = P{T_f(1) > v, T_f(2) > v, . . . , T_f(k) > v} = Q_f(v)^k
    G_k(v) = P{V_k > v} = Q_f(v)^k

where

    Q_f(v) = ∫_v^∞ q_f(x) dx,   q_f(x) = Q(x)/m

Let g_k(v) be the pdf of V_k. Then

    G_k(v) = ∫_v^∞ g_k(x) dx = Q_f(v)^k

    -(d/dv)G_k(v) = g_k(v) = k Q_f(v)^{k-1} q_f(v)

    g_k(v) = k Q_f(v)^{k-1} Q(v)/m
Consider the transformation z = V_k/(m/k) = kV_k/m, dz/dv = k/m:

    g_k(z) = g_k(v) |dv/dz| = [k Q_f(mz/k)^{k-1} Q(mz/k)/m] (m/k)

    g_k(z) = Q(mz/k) [ 1 - ∫_0^{mz/k} Q(x)/m dx ]^{k-1}

as

    Q_f(mz/k) = ∫_{mz/k}^∞ Q(x)/m dx = 1 - ∫_0^{mz/k} Q(x)/m dx

For fixed z, as k → ∞, mz/k → 0 and Q(mz/k) → 1.
Also

    ∫_0^{mz/k} Q(x)/m dx ≈ Q(mz/k)·(1/m)·(mz/k) = Q(mz/k)·(z/k) ≈ z/k

∴ as k → ∞,

    g_k(z) ≈ (1 - z/k)^{k-1} → e^{-z}

Thus as k → ∞, the scaled forward recurrence time z = (k/m)V_k is
distributed as a unit exponential. Hence for large k, V_k = (m/k)z has an
asymptotic exponential distribution with parameter λ = k/m. Since the
asymptotic forward recurrence time is exponential, the time between events
(of the merged process) is asymptotically exponential.
Note: A forward recurrence time is exponential if and only if the time
between events is exponential; i.e.

    q_f(x) = Q(x)/m = λe^{-λx}   if Q(x) = e^{-λx}

and if q_f(x) = λe^{-λx}, then Q(x) = e^{-λx}.

Additional Note: The merged process is N(t) = Σ_{i=1}^k N_i(t). Suppose
E(N_i(t)) = λt. The units of λ are number of events per unit time, and the
units of m are time per event. Thus E(N(t)) = (kλ)t, and kλ is the mean
number of events per unit time. The units of 1/(kλ) (or 1/λ) are mean time
per event. Hence m = 1/λ for an individual process, and the mean time
between events of the merged process is 1/(kλ).

Ex. λ = 6 events per year gives m = 12/6 = 2 months (mean time between
events).
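A simulation sketch of the limiting behavior, using k = 50 renewal processes
with Uniform(0, 2m) gaps and m = 1 (all choices arbitrary); the forward
recurrence time is measured far from the origin so each process is
effectively stationary, and the merged value should be approximately
exponential with rate k/m:

```python
import numpy as np

rng = np.random.default_rng(8)
k, m, t0, reps = 50, 1.0, 100.0, 5_000

V = np.empty(reps)
for r in range(reps):
    # k independent renewal processes with Uniform(0, 2m) gaps (mean m).
    times = np.cumsum(rng.uniform(0, 2 * m, size=(k, 160)), axis=1)
    # Forward recurrence time of each process at t0, then the merged one.
    first = np.where(times > t0, times, np.inf).min(axis=1) - t0
    V[r] = first.min()

print(V.mean(), m / k)                             # both approximately 0.02
print(np.mean(V > 0.05), np.exp(-(k / m) * 0.05))  # both approximately 0.082
```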
4.6 Splitting of Poisson Processes

Example: Times between births (in a family) follow an exponential
distribution. The births are categorized by gender.

Example: Times between episodes of back pain follow an exponential
distribution. However the degree of pain may be categorized, as the
required medication depends on the degree of pain.

Consider a Poisson Process {N(t), t ≥ 0} where, in addition to observing
an event, the event can be classified as belonging to one of r possible
categories. Define

    N_i(t) = number of events of type i during (0, t], for i = 1, 2, . . . , r

    N(t) = N_1(t) + N_2(t) + ⋯ + N_r(t)
This process is referred to as splitting the process.

Bernoulli Splitting Mechanism

Suppose an event takes place in the interval (t, t + dt]. Define the
indicator random variable Z(t) = i (i = 1, 2, . . . , r) such that

    P{Z(t) = i | event at (t, t + dt]} = p_i.

Note p_i is independent of time.

Then if N(t) = Σ_{i=1}^r N_i(t), the counting processes {N_i(t), t ≥ 0} are
Poisson processes with parameters (λp_i) for i = 1, 2, . . . , r.
Proof: Suppose over time (0, t], n events are observed of which s_i are
classified as of type i, with Σ_{i=1}^r s_i = n.

    P{N_1(t) = s_1, N_2(t) = s_2, . . . , N_r(t) = s_r | N(t) = n}

        = [n!/(s_1! s_2! ⋯ s_r!)] p_1^{s_1} p_2^{s_2} ⋯ p_r^{s_r}

Hence

    P{N_i(t) = s_i, i = 1, . . . , r, and N(t) = n}

        = [n!/Π_{i=1}^r s_i!] p_1^{s_1} p_2^{s_2} ⋯ p_r^{s_r} · e^{-λt}(λt)^n/n!

        = Π_{i=1}^r (λp_i t)^{s_i} e^{-λp_i t}/s_i!

        = Π_{i=1}^r P{N_i(t) = s_i}

which shows that the {N_i(t)} are independent and follow Poisson
distributions with parameters {λp_i}. Hence the {N_i(t), t ≥ 0} are Poisson
Processes.
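A simulation sketch of Bernoulli splitting (λ = 4, t = 3, and
p = (0.5, 0.3, 0.2) are arbitrary): the type counts should be independent
Poisson(λp_i t) variables:

```python
import numpy as np

rng = np.random.default_rng(9)
lam, t, p, reps = 4.0, 3.0, np.array([0.5, 0.3, 0.2]), 100_000

counts = np.empty((reps, len(p)), dtype=int)
for i in range(reps):
    n = rng.poisson(lam * t)
    counts[i] = rng.multinomial(n, p)  # label each event type 1, 2, or 3

print(counts.mean(axis=0), lam * t * p)  # means approximately [6.0, 3.6, 2.4]
print(counts.var(axis=0))                # variances match the means (Poisson)
print(np.corrcoef(counts.T))             # off-diagonals approximately 0
```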
Example of Nonhomogeneous Splitting

Suppose a person is subject to serious migraine headaches. Some of these
are so serious that medical attention is required. Define

    N(t) = number of migraine headaches in (0, t]
    N_m(t) = number of migraine headaches requiring medical attention
    p(τ) = probability of requiring medical attention if a headache
           occurs at (τ, τ + dτ).

Suppose an event occurs at (τ, τ + dτ); then the probability of requiring
attention is p(τ).

Note that, conditional on a single event taking place in (0, t], the event
time τ is uniform over (0, t]; i.e.

    f(τ | N(t) = 1) = 1/t,  0 < τ ≤ t,   and   α = (1/t) ∫_0^t p(τ) dτ
    α = (1/t) ∫_0^t p(τ) dτ

    [Diagram: a single event at time τ in (0, t); no event in (τ, t)]

∴ P{N_m(t) = k | N(t) = n} = (n choose k) α^k (1 - α)^{n-k}

    P{N_m(t) = k, N(t) = n} = (n choose k) α^k (1 - α)^{n-k} e^{-λt}(λt)^n/n!
    P{N_m(t) = k} = Σ_{n=k}^∞ (n choose k) α^k (1 - α)^{n-k} e^{-λt}(λt)^n/n!

        = (α^k/k!) e^{-λt} Σ_{n=k}^∞ (λt)^n (1 - α)^{n-k}/(n - k)!

        = (α^k/k!) e^{-λt} (λt)^k Σ_{n=k}^∞ [λt(1 - α)]^{n-k}/(n - k)!

        = (α^k/k!) (λt)^k e^{-λt} e^{λt(1-α)}

    P{N_m(t) = k} = e^{-λαt}(λαt)^k/k!
4.7 Non-homogeneous Poisson Processes

Preliminaries

Let N(t) follow a Poisson distribution; i.e.

    P{N(t) = k} = e^{-λt}(λt)^k/k!

Holding t fixed, the generating function of the distribution is

    φ_{N(t)}(s) = E[e^{-sN(t)}] = Σ_{k=0}^∞ e^{-λt} [(λt)^k/k!] e^{-sk}

        = e^{-λt} Σ_{k=0}^∞ (e^{-s}λt)^k/k! = e^{-λt} e^{e^{-s}λt}

    φ_{N(t)}(s) = e^{λt[e^{-s}-1]} = e^{λt(z-1)}   if z = e^{-s}

The mean is E[N(t)] = λt.
Consider the counting process {N(t), t ≥ 0} having the transform

    (*)   φ_{N(t)}(s) = e^{Λ(t)[e^{-s}-1]} = e^{Λ(t)[z-1]}

    E[N(t)] = Λ(t),   P{N(t) = k} = e^{-Λ(t)}[Λ(t)]^k/k!

For the Poisson Process, Λ(t) = λt and the mean is proportional to t.
However, when E[N(t)] ≠ λt we call the process {N(t), t ≥ 0} a
non-homogeneous Poisson Process with E[N(t)] = Λ(t).

Λ(t) can be assumed to be continuous and differentiable with

    (d/dt)Λ(t) = Λ′(t) = λ(t).

The quantity λ(t) is called the intensity function, and Λ(t) can be
represented by

    Λ(t) = ∫_0^t λ(x) dx
If N(t) has the transform given by (*), then

    P{N(t) = k} = e^{-Λ(t)}Λ(t)^k/k!

Since P{S_n > t} = P{N(t) < n}, we have

    P{S_1 > t} = P{N(t) < 1} = P{N(t) = 0} = e^{-Λ(t)}

Thus the pdf of the time to the first event is

    f(t) = λ(t) e^{-∫_0^t λ(x)dx},   Λ(t) = ∫_0^t λ(x) dx

Note that if H = Λ(T), where T is the time to the first event, then H is a
random variable following a unit exponential distribution.
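This is the basis of the standard time-transformation method for simulating
a non-homogeneous Poisson process. A sketch with the arbitrary choice
λ(x) = 2x, so that Λ(t) = t² and Λ^{-1}(u) = √u: generate unit-rate Poisson
event times and map them through Λ^{-1}:

```python
import numpy as np

rng = np.random.default_rng(10)
t_end, reps = 2.0, 100_000

counts = np.empty(reps, dtype=int)
for i in range(reps):
    # Unit-rate Poisson event times on (0, Lambda(t_end)].
    u = np.cumsum(rng.exponential(1.0, size=40))
    # Lambda_inv(u) = sqrt(u): mapped event times in (0, t_end].
    w = np.sqrt(u[u <= t_end ** 2])
    counts[i] = len(w)

# N(t_end) should be Poisson with mean Lambda(t_end) = 4.
print(counts.mean(), counts.var())  # both approximately 4
```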
Assume independent increments; i.e. N(t + u) - N(u) and N(u) are
independent. Using the transform φ(z, t) = e^{Λ(t)[z-1]}, z = e^{-s}
(the generating function E[e^{-sN(t)}] = E[z^{N(t)}]):

    e^{Λ(t+u)(z-1)} = E[z^{N(t+u)}] = E[z^{N(t+u)-N(u)+N(u)}]

        = E[z^{N(t+u)-N(u)}] · E[z^{N(u)}] = E[z^{N(t+u)-N(u)}] · e^{Λ(u)(z-1)}

∴ E[z^{N(t+u)-N(u)}] = e^{Λ(t+u)(z-1)}/e^{Λ(u)(z-1)} = e^{[Λ(t+u)-Λ(u)][z-1]}

where Λ(t + u) - Λ(u) = ∫_u^{t+u} λ(x) dx.

∴ P{N(t + u) - N(u) = k} = e^{-[Λ(t+u)-Λ(u)]} [Λ(t + u) - Λ(u)]^k/k!
Axiomatic Derivation of the Non-Homogeneous Poisson Distribution

Assume a counting process {N(t), t ≥ 0} with

(i) N(0) = 0;
(ii) {N(t), t ≥ 0} has independent increments, i.e. N(t + s) - N(s) and
N(s) are independent;
(iii) P{N(t + h) = k | N(t) = k} = 1 - λ(t)h + o(h)
      P{N(t + h) = k + 1 | N(t) = k} = λ(t)h + o(h)
      P{N(t + h) = k + j | N(t) = k} = o(h),   j ≥ 2

Then

    P{N(t + s) - N(s) = k} = e^{-[Λ(t+s)-Λ(s)]} [Λ(t + s) - Λ(s)]^k/k!
4.8 Compound Poisson Process

Example: Consider a single hypodermic needle which is shared. The times
between uses follow a Poisson Process. However, at each use several people
use it. What is the distribution of total use?

Let {N(t), t ≥ 0} be a Poisson process and {Z_n, n ≥ 1} be iid random
variables which are independent of N(t). Define

    Z(t) = Σ_{n=1}^{N(t)} Z_n

The process Z(t) is called a Compound Poisson Process. It will be assumed
that {Z_n} takes on integer values.
Define A*(s) = E[e^{-sZ_n}]. Then

    φ(s | N(t) = r) = E[e^{-sZ(t)} | N(t) = r] = A*(s)^r

    φ(s) = Σ_{r=0}^∞ φ(s | N(t) = r) P{N(t) = r}

         = Σ_{r=0}^∞ A*(s)^r e^{-λt}(λt)^r/r! = e^{-λt} Σ_{r=0}^∞ [A*(s)λt]^r/r!

    φ(s) = e^{-λt} e^{A*(s)λt} = e^{-λt(1-A*(s))}

Expanding A*(s) = E(e^{-sZ_n}) = 1 - sm_1 + (s²/2)m_2 - ⋯ with m_i = E(Z_n^i),

    λt(1 - A*(s)) = λt[sm_1 - (s²/2)m_2 + ⋯ ]
The cumulant function is K(s) = log φ(s). In general

    K(s) = -sm + (s²/2)σ² - ⋯

where (m, σ²) refer to Z(t). Here

    K(s) = -λt[sm_1 - (s²/2)m_2 + ⋯ ],   m_i = E(Z_n^i)

Matching coefficients,

    E[Z(t)] = λt m_1,   V[Z(t)] = λt m_2
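A sketch checking the mean and variance formulas (λ = 2, t = 3, and Z_n
taking values 1, 2, 3 with probabilities 0.5, 0.3, 0.2, all arbitrary):

```python
import numpy as np

rng = np.random.default_rng(11)
lam, t, reps = 2.0, 3.0, 200_000
vals, probs = np.array([1, 2, 3]), np.array([0.5, 0.3, 0.2])
m1 = np.sum(vals * probs)        # E(Z_n)   = 1.7
m2 = np.sum(vals ** 2 * probs)   # E(Z_n^2) = 3.5

N = rng.poisson(lam * t, size=reps)
Z = np.array([rng.choice(vals, size=n, p=probs).sum() for n in N])

print(Z.mean(), lam * t * m1)  # both approximately 10.2
print(Z.var(), lam * t * m2)   # both approximately 21.0
```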
