
7 Introduction to counting processes

7.1 Some de…nitions


In this chapter we study the stochastic process {N_t}_{t≥0}, where N_t represents the number of events happening in the interval (0, t]. It must be a counting process.

Definition 1 A counting process is a stochastic process, in discrete or continuous time, with state space {0, 1, 2, ...} and with the property that N_t is a non-decreasing function of t.

Most of the processes of interest are Markovian counting processes, i.e.

Definition 2 A counting Markov process is a counting process in which N_t satisfies

Pr{N_t = k | N_{t_1} = x_1, N_{t_2} = x_2, ..., N_{t_n} = x_n} = Pr{N_t = k | N_{t_n} = x_n}, for t_1 < t_2 < ... < t_n < t.

This means that if you know the total number of accidents in the last s years, you may forecast the future as well as if you knew exactly when the accidents happened.
For a Markovian counting process, the probabilities of greatest interest
are the transition probabilities, given by

p_{k,k+n}(s, t) = Pr{N_t − N_s = n | N_s = k}, 0 ≤ s < t, k, n = 0, 1, 2, ...

When these probabilities depend only on t − s the process is called homogeneous.
Assuming that N_0 = 0, the marginal distribution of N_t is

Pr{N_t = n} = p_n(t) = p_{0,n}(0, t).

The marginal distribution of the increment N_t − N_s is

Pr{N_t − N_s = n} = Σ_{k=0}^∞ Pr{N_t − N_s = n | N_s = k} Pr{N_s = k} = Σ_{k=0}^∞ p_{k,k+n}(s, t) p_k(s).

When the last probability depends only on t − s the process has stationary increments.

Definition 3 A nonhomogeneous birth process is a counting Markov process such that, as h → 0⁺,

i. p_{k,k+1}(t, t + h) = λ_k(t)h + o(h), k = 0, 1, ...

ii. p_{k,k+n}(t, t + h) = o(h), k = 0, 1, ..., n = 2, 3, ...

The homogeneous birth process is a special case of the above in which λ_k(t) is independent of t. A very special case is the Poisson process, which arises when λ_k(t) is a constant λ.

7.2 The (Homogeneous) Poisson Process


There are many possible ways of defining a homogeneous Poisson process. One is the one just seen (a particular case of a homogeneous birth process). An equivalent definition is the following:

Definition 4 A counting process {N(t)}_{t≥0}, with N(0) = 0, is a homogeneous Poisson process, or just a Poisson process, with intensity λ, if it satisfies the following postulates:

(i) {N(t)}_{t≥0} has independent increments;

(ii) {N(t)}_{t≥0} has stationary increments;

(iii) as h → 0⁺, Pr{N(h) ≥ 1} = λh + o(h);

(iv) as h → 0⁺, Pr{N(h) ≥ 2} = o(h).

Theorem 1 If {N(t)}_{t≥0} is a Poisson process, then the random variable N(t) is Poisson distributed with mean λt, for all t > 0.

Proof. Let
p_k(t) = Pr{N(t) = k}, k = 0, 1, 2, ....
Using (i) and (ii),

p_0(t + h) = Pr{N(t + h) = 0} = Pr{N(t) = 0, N(t + h) − N(t) = 0}
= p_0(t) p_0(h) = p_0(t)(1 − p(h)),     (1)

for all t, h > 0, with p(h) = Pr{N(h) ≥ 1}. (Recall that a function f is an infinitesimal with h, denoted o(h), when lim_{h→0} f(h)/h = 0.) Subtracting p_0(t) from both sides of (1) and dividing by h, we get

(p_0(t + h) − p_0(t))/h = −p_0(t) p(h)/h.
Letting h → 0⁺ we get the differential equation

p_0'(t) = −λ p_0(t).     (2)

In a similar way, for k ≥ 1,

p_k(t + h) = p_k(t) p_0(h) + p_{k−1}(t) p_1(h) + Σ_{i=2}^k p_{k−i}(t) p_i(h).     (3)

By definition p_0(h) = 1 − p(h). Postulate (iv) implies that

p_1(h) = p(h) + o(h) and

Σ_{i=2}^k p_{k−i}(t) p_i(h) ≤ Σ_{i=2}^k p_i(h) = o(h),     (4)

because p_{k−i}(t) ≤ 1. Then, using (4) in (3) we get

p_k(t + h) − p_k(t) = p_k(t)[p_0(h) − 1] + p_{k−1}(t) p_1(h) + Σ_{i=2}^k p_{k−i}(t) p_i(h)
= −p_k(t) p(h) + p_{k−1}(t) p_1(h) + Σ_{i=2}^k p_{k−i}(t) p_i(h)
= −λ p_k(t) h + λ p_{k−1}(t) h + o(h),

hence
(p_k(t + h) − p_k(t))/h → −λ p_k(t) + λ p_{k−1}(t) as h → 0
and we obtain

p_k'(t) = −λ p_k(t) + λ p_{k−1}(t), k = 1, 2, ....     (5)

Let

P_{N(t)}(z) = E[z^{N(t)}] = Σ_{k=0}^∞ z^k p_k(t),

the probability generating function of N(t). Then

∂/∂t P_{N(t)}(z) = Σ_{k=0}^∞ z^k p_k'(t),

i.e., using (2) and (5),

∂/∂t P_{N(t)}(z) = −λ Σ_{k=0}^∞ z^k p_k(t) + λ Σ_{k=1}^∞ z^k p_{k−1}(t) = λ(z − 1) P_{N(t)}(z).

But the solution of the differential equation

∂/∂t P_{N(t)}(z) = λ(z − 1) P_{N(t)}(z)

is of the type
P_{N(t)}(z) = C(z) e^{λt(z−1)}.
Attending to the initial conditions p_0(0) = 1 and p_k(0) = 0 for k ≥ 1, we have P_{N(0)}(z) = 1, which implies that C(z) = 1. So

P_{N(t)}(z) = e^{λt(z−1)},     (6)

which is the probability generating function of a Poisson r.v. with mean λt. In other words, for each t, N(t) follows a Poisson distribution with parameter λt, i.e.

p_k(t) = e^{−λt} (λt)^k / k!, k = 0, 1, ....     (7)

Note that

p_{k,k+n}(s, t) = [λ(t − s)]^n e^{−λ(t−s)} / n!, n = 0, 1, ....

The parameter λ is called the mean rate or intensity of the process and represents the mean number of events per unit of time.
When we withdraw postulate (ii) and replace (iii) in such a way that the intensity of the process depends on t, we obtain the nonhomogeneous Poisson process. When (iv) is taken out we obtain the generalised Poisson process.

Discussion of the postulates

(i) Excludes chain reactions. A fire can originate another fire. This difficulty may sometimes be overcome by redefining the risk unit. This is the case of fire insurance. But not the case for contagious diseases or epidemics.

(ii/iii) There are situations where they are not verified. An example is when seasonality is involved. In some cases time may be divided into subintervals, to obtain subprocesses with different intensities. If we are only interested in the number of claims over a finite time interval, the Poisson distribution remains valid even when there is a deterministic tendency in the claim frequency.

(iv) This difficulty may be overcome. For example, an accident involving two cars, insured in the same company, and where both drivers are considered responsible, may be counted as just one claim.

Characteristics of the Poisson process

As the Poisson process has independent and stationary increments, the law of N(t) for all t describes the process entirely. As we have proved that N(t) is a Poisson random variable with mean λt, we already know the generating functions of N(t):

M_{N(t)}(r) = e^{λt(e^r − 1)},
P_{N(t)}(z) = e^{λt(z − 1)}.     (8)
The factorial moments are

E[N(t)(N(t) − 1)···(N(t) − k + 1)] = (λt)^k, k = 1, 2, ...,

from where

E[N(t)] = λt,
E[N²(t)] = λt + (λt)²,     (9)
E[N³(t)] = λt + 3(λt)² + (λt)³,

and

E[N(t)] = μ_{N(t)} = λt,
Var[N(t)] = σ²_{N(t)} = λt,     (10)
γ_{N(t)} = 1/√(λt).

Considering that for s < t, in a process with independent increments,

Cov[N(s), N(t)] = Cov[N(s), N(t) − N(s) + N(s)]
= Cov[N(s), N(t) − N(s)] + Cov[N(s), N(s)]
= Var[N(s)],

i.e.

Cov[N(s), N(t)] = Var[N(min(s, t))], for all s, t,     (11)

then for the Poisson process we must have

Cov[N(s), N(t)] = λ min(s, t), for all s, t.     (12)
We can easily show that the sum of two independent Poisson processes is still a Poisson process.
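This superposition property can be verified numerically for the marginal distributions: convolving the laws of N_1(t) and N_2(t) reproduces a Poisson law with mean (λ_1 + λ_2)t. A small sketch (intensities chosen for illustration):

```python
import math

def pois(k, m):
    """Poisson pmf with mean m."""
    return math.exp(-m) * m ** k / math.factorial(k)

lam1, lam2, t = 1.5, 2.5, 2.0

# N1(t) + N2(t) should be Poisson with mean (lam1 + lam2) * t.
for n in range(10):
    conv = sum(pois(k, lam1 * t) * pois(n - k, lam2 * t) for k in range(n + 1))
    assert abs(conv - pois(n, (lam1 + lam2) * t)) < 1e-12
print("ok")
```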
Let W_k be the time of the kth event, k = 1, 2, .... The difference T_k = W_{k+1} − W_k represents the time between events k and k + 1. The variables T_k are the times of permanence of the process in state k.

Theorem 2 The interarrival times T_k in a Poisson process are i.i.d. random variables, exponentially distributed with mean 1/λ.
Proof.

Pr{T_k > z | W_k = s} = Pr{N(W_k + z) − N(W_k) = 0 | W_k = s}
= Pr{N(s + z) − N(s) = 0} = Pr{N(z) = 0} = e^{−λz}.

On the other hand,

Pr{T_{k+1} > r, T_k > z | W_k = s} = Pr{T_{k+1} > r | T_k > z, W_k = s} Pr{T_k > z | W_k = s}
= Pr{T_{k+1} > r | W_{k+1} > z + s} Pr{T_k > z | W_k = s}
= Pr{T_{k+1} > r} Pr{T_k > z},

which implies that T_k and T_{k+1} are independent. The proof for T_k and T_{k+l} is analogous.

Hence W_n is a Gamma random variable with parameters (n, 1/λ), which allows us to calculate the distribution function of a Gamma when the shape parameter is an integer as

Pr{W_n ≤ t} = Pr{N(t) ≥ n} = Σ_{k=n}^∞ (λt)^k e^{−λt} / k! = 1 − Σ_{k=0}^{n−1} (λt)^k e^{−λt} / k!.
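This identity can be checked numerically by comparing the truncated Poisson series against a direct integration of the Gamma density with integer shape n. A sketch, with λ, n and t chosen for illustration:

```python
import math

lam, n, t = 1.2, 3, 2.0

# The series from Pr{W_n <= t} = Pr{N(t) >= n}.
cdf_series = 1 - sum((lam * t) ** k * math.exp(-lam * t) / math.factorial(k)
                     for k in range(n))

# Trapezoidal integration of the Gamma(n, 1/lam) density on (0, t].
def dens(x):
    return lam ** n * x ** (n - 1) * math.exp(-lam * x) / math.factorial(n - 1)

m = 200000
h = t / m
cdf_num = h * (0.5 * dens(0.0) + sum(dens(i * h) for i in range(1, m)) + 0.5 * dens(t))
assert abs(cdf_series - cdf_num) < 1e-6
print(round(cdf_series, 4))  # prints 0.4303
```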

The Poisson process is also related to the binomial distribution, since for s < t and k ≤ n,

Pr{N(s) = k | N(t) = n} = Pr{N(s) = k, N(t) − N(s) = n − k} / Pr{N(t) = n}
= [e^{−λs}(λs)^k/k!] [e^{−λ(t−s)}(λ(t−s))^{n−k}/(n−k)!] / [e^{−λt}(λt)^n/n!]
= C(n, k) (s/t)^k (1 − s/t)^{n−k}.

In a similar way, for two independent Poisson processes {N_1(t), t ≥ 0} and {N_2(t), t ≥ 0} with intensities λ_1 and λ_2,

Pr{N_1(t) = k | N_1(t) + N_2(t) = n} = C(n, k) (λ_1/(λ_1 + λ_2))^k (λ_2/(λ_1 + λ_2))^{n−k}.
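The first of these binomial identities is exact and easy to verify term by term, cancelling the exponentials as in the derivation above. A small sketch with arbitrary illustration values:

```python
import math

def pois(k, m):
    """Poisson pmf with mean m."""
    return math.exp(-m) * m ** k / math.factorial(k)

lam, s, t, n = 2.0, 1.0, 3.0, 5

# Conditional probability computed directly vs. the Binomial(n, s/t) pmf.
for k in range(n + 1):
    direct = pois(k, lam * s) * pois(n - k, lam * (t - s)) / pois(n, lam * t)
    binom = math.comb(n, k) * (s / t) ** k * (1 - s / t) ** (n - k)
    assert abs(direct - binom) < 1e-12
print("ok")
```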

Another distribution related to the Poisson process is the uniform. For s < t,

Pr{W_1 ≤ s | N(t) = 1} = Pr{N(s) = 1, N(t) − N(s) = 0} / Pr{N(t) = 1}
= Pr{N(s) = 1} Pr{N(t − s) = 0} / Pr{N(t) = 1} = s/t.
It can be proved that (see Parzen, page 140):

Theorem 3 If in a Poisson process N(t) = n, then the times at which the events happen (arrival times) have the same distribution as the order statistics corresponding to n independent random variables uniformly distributed on the interval (0, t].
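For n = 1 this reduces to the uniform result just derived, which a Monte Carlo sketch can illustrate: simulating the process through its exponential interarrival times (Theorem 2) and keeping the runs with exactly one event, W_1 should behave as a Uniform(0, t] variable (seed and parameters are illustration choices):

```python
import random
import statistics

random.seed(0)
lam, t, trials = 3.0, 1.0, 20000

# Collect the first arrival time W1 in runs where exactly one event fell in (0, t].
w1 = []
while len(w1) < trials:
    s, times = 0.0, []
    while True:
        s += random.expovariate(lam)  # interarrival times are Exp(lam)
        if s > t:
            break
        times.append(s)
    if len(times) == 1:
        w1.append(times[0])

# Given N(t) = 1, W1 should be Uniform(0, t]: mean t/2, variance t^2/12.
assert abs(statistics.mean(w1) - t / 2) < 0.02
assert abs(statistics.variance(w1) - t ** 2 / 12) < 0.02
print("ok")
```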

7.3 The nonhomogeneous Poisson process

Definition 5 A nonhomogeneous Poisson process {N(t)}_{t≥0}, with N(0) = 0 and intensity function λ(t), is a counting process satisfying the following postulates:

(i) The process has independent increments;

(ii) as h → 0⁺, Pr{N(t + h) − N(t) = 1} = λ(t)h + o(h), with λ(t) ≥ 0;

(iii) as h → 0⁺, Pr{N(t + h) − N(t) ≥ 2} = o(h).

Theorem 4 In a nonhomogeneous Poisson process, the number of events that happen in the interval (s, s + t] is Poisson distributed with mean m(s, s + t) = ∫_s^{s+t} λ(τ) dτ.

Proof. Let p_k(s, s + t) = Pr{N(s + t) − N(s) = k} be the probability that k events happen in the interval s to s + t. We get

p_0(s, s + t + h) = p_0(s, s + t)[1 − λ(s + t)h + o(h)].     (13)

Subtracting p_0(s, s + t) from both sides of (13), dividing by h and letting h → 0⁺, we get

∂/∂t p_0(s, s + t) = −λ(s + t) p_0(s, s + t),     (14)

subject to p_0(s, s) = 1. In a similar way, for k ≥ 1 we get

∂/∂t p_k(s, s + t) = −λ(s + t) p_k(s, s + t) + λ(s + t) p_{k−1}(s, s + t),     (15)

under the conditions p_k(s, s) = 0 for k ≥ 1.
Multiplying equation k by z^k and summing over k, we obtain

∂/∂t P_{N(t)}(z; s, t) = λ(s + t)(z − 1) P_{N(t)}(z; s, t),     (16)

under the condition P_{N(t)}(z; s, s) = 1, whose solution is

P_{N(t)}(z; s, t) = e^{m(s, s+t)(z−1)},

which is the probability generating function of a Poisson with mean m(s, s + t).
Of course, the homogeneous Poisson process can be obtained from this one with λ(τ) = λ for all τ. A common intensity function used is λ(t) = αe^{βt}, with α and β positive and to be estimated.
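A nonhomogeneous Poisson process can be simulated by thinning (the Lewis-Shedler method, not covered in the text): candidate events are generated at a constant rate that dominates λ(t) on the horizon, and a candidate at time s is kept with probability λ(s)/rate_max. The sketch below uses an exponentially increasing intensity with illustrative parameter values and checks the mean count against Theorem 4:

```python
import math
import random
import statistics

random.seed(1)

def sample_nhpp(rate, rate_max, horizon):
    """Lewis-Shedler thinning: candidates at constant rate rate_max,
    a candidate at time s is kept with probability rate(s)/rate_max."""
    events, s = [], 0.0
    while True:
        s += random.expovariate(rate_max)
        if s > horizon:
            return events
        if random.random() < rate(s) / rate_max:
            events.append(s)

# Illustrative increasing intensity lambda(t) = a*exp(b*t) on (0, T].
a, b, T = 2.0, 0.5, 1.0
rate = lambda s: a * math.exp(b * s)
counts = [len(sample_nhpp(rate, rate(T), T)) for _ in range(20000)]

# Theorem 4: N(T) is Poisson with mean m(0, T) = a*(exp(b*T) - 1)/b.
m = a * (math.exp(b * T) - 1) / b
assert abs(statistics.mean(counts) - m) < 0.1
print("ok")
```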

7.4 The mixed Poisson process

Instead of assuming that the number of claims is a Poisson process with intensity λ, we suppose that λ is the result of the observation of a nonnegative random variable Λ. Let

U(λ) = Pr{Λ ≤ λ}

be the cumulative distribution function of Λ. The r.v. Λ is called the structure r.v. and U(λ) the structure distribution.

Definition 6 The unconditional counting process {N(t), t ≥ 0}, with N(0) = 0 and such that

Pr{N(t + s) − N(s) = k} = ∫_0^∞ Pr{N(t + s) − N(s) = k | λ} dU(λ)
= ∫_0^∞ e^{−λt} (λt)^k / k! dU(λ),     (17)

is called a mixed Poisson process.

From (17) it is clear that the increments of the process are stationary and

p_k(t) = Pr{N(t) = k} = ∫_0^∞ e^{−λt} (λt)^k / k! dU(λ),     (18)

i.e. N(t) has a mixed Poisson distribution.


Let us calculate the transition probabilities:

p_{k,k+n}(s, t) = Pr{N(t) − N(s) = n | N(s) = k}
= Pr{N(t) − N(s) = n, N(s) = k} / Pr{N(s) = k}
= (1/p_k(s)) ∫_0^∞ Pr{N(t) − N(s) = n, N(s) = k | λ} dU(λ)
= (1/p_k(s)) ∫_0^∞ Pr{N(t) − N(s) = n | λ} Pr{N(s) = k | λ} dU(λ)
= (1/p_k(s)) ∫_0^∞ [e^{−λ(t−s)} (λ(t−s))^n / n!] [e^{−λs} (λs)^k / k!] dU(λ)
= (1/p_k(s)) [(t−s)^n s^k / (n! k!)] ∫_0^∞ e^{−λt} λ^{n+k} dU(λ)
= (1/p_k(s)) [(t−s)^n s^k (n+k)! / (n! k! t^{n+k})] ∫_0^∞ e^{−λt} (λt)^{n+k} / (n+k)! dU(λ)
= C(k+n, n) (s/t)^k (1 − s/t)^n p_{k+n}(t) / p_k(s).

From the last equation and the distribution of the increments we can conclude that the increments are not independent. The mixed Poisson process is a process with stationary, but not independent, increments. Although it has stationary increments it is not homogeneous.

And the conditional probabilities, for 0 < s < t and 0 ≤ k ≤ n,

Pr{N(s) = k | N(t) = n + k} = Pr{N(t) = n + k | N(s) = k} Pr{N(s) = k} / Pr{N(t) = n + k}
= p_{k,n+k}(s, t) p_k(s) / p_{n+k}(t)
= C(k+n, n) (s/t)^k (1 − s/t)^n,

which is again a binomial. This result is shared with the Poisson process. Theorem 3 also remains valid.
It is natural to consider the mixed Poisson process as the Bayesian version of the Poisson process, where U is the a priori distribution of the intensity of the process. The a posteriori distribution of the intensity is

U(x | N(t) = k) = Pr{Λ ≤ x | N(t) = k} = ∫_0^x λ^k e^{−λt} dU(λ) / ∫_0^∞ λ^k e^{−λt} dU(λ),     (19)

and the Bayes estimator for λ is the a posteriori expected value, i.e.

E[Λ | N(t) = k] = ∫_0^∞ λ^{k+1} e^{−λt} dU(λ) / ∫_0^∞ λ^k e^{−λt} dU(λ).     (20)

It can be shown that the mixed Poisson process is a nonhomogeneous birth process with

λ_k(t) = E[Λ | N(t) = k].
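For a discrete structure distribution the ratio in (20) is a finite sum, so the birth rate is easy to compute. The sketch below uses a hypothetical two-point distribution for Λ and also checks the standard mixed-Poisson identity E[Λ | N(t) = k] = (k+1) p_{k+1}(t) / (t p_k(t)), which follows from writing both integrals in (20) in terms of the pmf (18):

```python
import math

# Hypothetical two-point structure distribution:
# Lambda = 1 with probability 0.6, Lambda = 3 with probability 0.4.
lams, probs = [1.0, 3.0], [0.6, 0.4]
t, k = 2.0, 4

def pk(j):
    """Mixed Poisson pmf (18) for this discrete structure distribution."""
    return sum(p * math.exp(-lam * t) * (lam * t) ** j / math.factorial(j)
               for lam, p in zip(lams, probs))

# Posterior mean (20): E[Lambda | N(t) = k].
num = sum(p * lam ** (k + 1) * math.exp(-lam * t) for lam, p in zip(lams, probs))
den = sum(p * lam ** k * math.exp(-lam * t) for lam, p in zip(lams, probs))
post_mean = num / den

# The birth rate lambda_k(t) also equals (k+1) p_{k+1}(t) / (t p_k(t)).
assert abs(post_mean - (k + 1) * pk(k + 1) / (t * pk(k))) < 1e-12
print(round(post_mean, 4))  # prints 1.9945
```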

When t is fixed, N(t) is a mixed Poisson random variable. Then

P_{N(t)}(z) = E[z^{N(t)}] = E[E[z^{N(t)} | Λ]] = E[e^{Λt(z−1)}] = M_Λ(t(z − 1)),

M_{N(t)}(r) = M_Λ(t(e^r − 1)),     (21)

E[N(t)(N(t) − 1)···(N(t) − k + 1)] = t^k E[Λ^k], k = 1, 2, ...,

E[N(t)] = t E[Λ],
E[N²(t)] = t E[Λ] + t² E[Λ²],     (22)
E[N³(t)] = t E[Λ] + 3t² E[Λ²] + t³ E[Λ³].

Hence

E[N(t)] = t E[Λ],
Var[N(t)] = t E[Λ] + t² Var[Λ],     (23)
γ_{N(t)} = (t E[Λ] + 3t² Var[Λ] + t³ μ_3(Λ)) / (Var[N(t)])^{3/2},

where μ_3(Λ) denotes the third central moment of Λ.

7.4.1 The Polya process

A particular case of the mixed Poisson process is the Polya process, which is obtained when Λ is Gamma distributed, i.e.

u(λ) = (1/Γ(r)) β^{−r} λ^{r−1} e^{−λ/β}, λ > 0,

where r > 0 is the shape parameter and β > 0 is the scale parameter. Then

p_k(t) = ∫_0^∞ [e^{−λt} (λt)^k / k!] (1/Γ(r)) β^{−r} λ^{r−1} e^{−λ/β} dλ
= C(r+k−1, k) (1/(1+βt))^r (βt/(1+βt))^k,

which is a negative binomial with parameters r and βt.
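This gamma-mixture identity can be checked numerically by integrating the Poisson pmf against the Gamma density and comparing with the negative binomial pmf (shape, scale and horizon below are illustration values):

```python
import math

r, beta, t = 2.5, 0.8, 1.5

def nb_pk(k):
    """Negative binomial pmf with parameters r and beta*t."""
    coef = math.gamma(r + k) / (math.gamma(r) * math.factorial(k))
    return coef * (1 / (1 + beta * t)) ** r * (beta * t / (1 + beta * t)) ** k

def mixture_pk(k, m=200000, upper=40.0):
    """Trapezoidal integration of e^{-lam t}(lam t)^k/k! against the
    Gamma(r, beta) density; the tail beyond `upper` is negligible here."""
    h = upper / m
    def f(lam):
        if lam == 0:
            return 0.0
        return (math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
                * lam ** (r - 1) * math.exp(-lam / beta)
                / (math.gamma(r) * beta ** r))
    return h * (0.5 * f(0.0) + sum(f(i * h) for i in range(1, m)) + 0.5 * f(upper))

for k in range(4):
    assert abs(nb_pk(k) - mixture_pk(k)) < 1e-5
print("ok")
```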

Its generating functions are

P_{N(t)}(z) = (1 − βt(z − 1))^{−r},     (24)

and

M_{N(t)}(s) = (1 − βt(e^s − 1))^{−r}.     (25)

For the transition probabilities,

p_{k,k+n}(s, t) = Pr{N(t) − N(s) = n | N(s) = k}
= C(k+n, n) (s/t)^k (1 − s/t)^n p_{k+n}(t) / p_k(s)
= C(k+n, n) (s/t)^k (1 − s/t)^n [C(r+k+n−1, k+n) (1/(1+βt))^r (βt/(1+βt))^{k+n}] / [C(r+k−1, k) (1/(1+βs))^r (βs/(1+βs))^k]
= C(r+k+n−1, n) (β(t−s)/(1+βt))^n ((1+βs)/(1+βt))^{r+k},

which is a negative binomial with r replaced by r + k and βt replaced by β(t−s)/(1+βs) = (t−s)/(s+1/β).

Note that in the Polya process, as

∫_0^∞ λ^k e^{−λt} dU(λ) = ∫_0^∞ λ^k e^{−λt} (1/Γ(r)) β^{−r} λ^{r−1} e^{−λ/β} dλ
= (1/(Γ(r) β^r)) ∫_0^∞ λ^{r+k−1} e^{−λ(1+tβ)/β} dλ
= (Γ(r+k)/(Γ(r) β^r)) (β/(1+tβ))^{r+k} ∫_0^∞ λ^{r+k−1} e^{−λ(1+tβ)/β} / [Γ(r+k) (β/(1+tβ))^{r+k}] dλ
= (Γ(r+k)/Γ(r)) β^k / (1+tβ)^{r+k},

where the last integral equals one because the integrand is a Gamma(r+k, β/(1+tβ)) density,

and attending to (20) we have that

λ_k(t) = E[Λ | N(t) = k] = ∫_0^∞ λ^{k+1} e^{−λt} dU(λ) / ∫_0^∞ λ^k e^{−λt} dU(λ)
= [Γ(r+k+1) β^{k+1} / (Γ(r) (1+tβ)^{r+k+1})] / [Γ(r+k) β^k / (Γ(r) (1+tβ)^{r+k})]
= (r + k) β/(1 + tβ) = (r + k) · 1/(β^{−1} + t).

This is a nonhomogeneous birth process with positive linear contagion.

