
ECE 534 RANDOM PROCESSES FALL 2011

PROBLEM SET 4 Due Tuesday, October 18


4. Random processes, including Poisson, Wiener, Markov, and martingale processes
Assigned Reading: Chapter 4 of the notes.
Reminder: Exam 1, covering the lectures, reading, and homework for problem sets 2 and 3, will be held on
Monday, October 10, 7:00-8:15 p.m., in Room 269 Everitt Lab. You may bring one two-sided sheet of notes,
font size 10 or larger (or equivalent handwriting size), to consult during the exam. Otherwise the exam is
closed notes.
Problems to be handed in:
1 A randomly scaled logistic function
Let $X_t = \Lambda\!\left(\frac{t-U}{V}\right)$, where $\Lambda(u) = \frac{1}{1+e^{-u}}$, and $U$ and $V$ are independent random variables such that $U$ has
the $N(0,1)$ distribution and $V$ has the exponential distribution with parameter one.
(a) Briefly describe the possible sample paths of $X$.
(b) In which of the sense(s), a.s., m.s., p., or d., does $\lim_{t\to\infty} X_t$ exist?
(c) Show that $E[X_t]$ is strictly increasing in $t$, with $E[X_0] = 0.5$, $\lim_{t\to\infty} E[X_t] = 1$, and $\lim_{t\to-\infty} E[X_t] = 0$.
(d) Is $(X_t : t \geq 0)$ a stationary random process? Justify your answer.
(e) Is $(X_t : t \geq 0)$ a Markov process? Justify your answer.
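For intuition, the behavior claimed in part (c) can be checked by simulation. The sketch below is a minimal Monte Carlo illustration in Python, not part of the assignment; it assumes the reading $X_t = \Lambda((t-U)/V)$ used above, and the helper name `logistic` is our own.

```python
import numpy as np

def logistic(u):
    # Lambda(u) = 1 / (1 + exp(-u))
    return 1.0 / (1.0 + np.exp(-u))

rng = np.random.default_rng(0)
n_paths = 100_000
U = rng.standard_normal(n_paths)               # U ~ N(0, 1)
V = rng.exponential(scale=1.0, size=n_paths)   # V ~ Exp(1)

t_grid = np.linspace(-10.0, 10.0, 41)
# X_t = Lambda((t - U) / V); rows index realizations of (U, V), columns index t.
# Clip the argument to avoid harmless overflow warnings in exp().
Z = np.clip((t_grid[None, :] - U[:, None]) / V[:, None], -50.0, 50.0)
X = logistic(Z)

est_mean = X.mean(axis=0)                      # Monte Carlo estimate of E[X_t]
for t, m in zip(t_grid[::10], est_mean[::10]):
    print(f"t = {t:6.1f}   E[X_t] ~= {m:.3f}")
# The estimates should be near 0.5 at t = 0, approach 1 as t -> +infinity,
# and approach 0 as t -> -infinity, consistent with part (c).
```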
2 A simple model of a neural spike train
Consider a counting process $(N_t : t \geq 0)$ that models the spike times of a neuron. The sodium ions of a
neuron need to be replenished between spikes, so one spike suppresses the intensity of additional spikes for
a short time. The particular example we use is the following. The intensity of spikes at time $t$, denoted by
$\lambda_t$, is given by
$$\lambda_t = \begin{cases} 1 & \text{if } N(t) = N(t-1) \\ \tfrac{1}{4} & \text{if } N(t) > N(t-1). \end{cases}$$
In words, the intensity is one if there have been no spikes within the last unit of time, and the intensity
is 0.25 otherwise. For initialization purposes, suppose there were no spikes in the interval $[-1, 0]$ (i.e. set
$N(t) = 0$ for $-1 \leq t \leq 0$). Some thought shows that, for this particular model, the times between spikes are
independent and identically distributed.
(a) Find the distribution of the time between two consecutive spikes. (Hint: The theory of failure rate
functions is helpful; see the ECE 313 notes.)
(b) Find the long-term average rate at which spikes occur (by the law of large numbers, it is equal to one over
the mean time between spikes).
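The inter-spike distribution can also be explored by simulation. The sketch below is a minimal Python illustration, assuming the reading of the model given above (hazard rate 1/4 for one unit of time after a spike, and 1 thereafter); it estimates the long-run spike rate of part (b) empirically.

```python
import numpy as np

rng = np.random.default_rng(1)

def interspike_time(rng):
    # Time from one spike to the next.  Right after a spike the intensity
    # (hazard rate) is 1/4 for one unit of time, then 1 until the next spike.
    # Simulate by inverting the cumulative hazard H(t):
    #   H(t) = t/4            for 0 <= t <= 1
    #   H(t) = 1/4 + (t - 1)  for t > 1
    e = rng.exponential(1.0)   # standard exponential "hazard budget"
    if e <= 0.25:
        return 4.0 * e
    return 1.0 + (e - 0.25)

n_spikes = 200_000
gaps = np.array([interspike_time(rng) for _ in range(n_spikes)])
print("empirical mean inter-spike time:", gaps.mean())
print("empirical long-run spike rate  :", 1.0 / gaps.mean())
```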
3 A compound Poisson process with mean zero
Let $J_1, J_2, \ldots$ be independent and identically distributed with mean zero, variance $\sigma_J^2$, and characteristic
function $\Phi_J(u)$, and let $(N_t : t \geq 0)$ be a Poisson counting process with rate $\lambda > 0$, which is statistically
independent of the $J$'s. Define $Y_t$ for $t \geq 0$ by $Y_t = \sum_{i=1}^{N(t)} J_i$.
(a) Show that $E[Y_t] = 0$ for $t \geq 0$. (Hint: $E[Y_t]$ is the average of $E[Y_t \mid N_t = n]$ over $n$, using the pmf of $N_t$.)
(b) Briefly explain why $Y$ has independent increments. (Note: (a), (b), and the fact $Y_0 = 0$ imply $(Y_t : t \geq 0)$
is a martingale.)
(c) Express $E[Y_t^2]$ and the characteristic function $E[e^{juY_t}]$ for $t \geq 0$ in terms of $\lambda$, $\sigma_J^2$, and $\Phi_J$. (Hint: See the
hint for part (a).)
(d) Show that $(Z_t : t \geq 0)$ defined by $Z_t = Y_t^2 - ct$ is a martingale for some constant $c$, and identify $c$. (Hint:
For $s < t$, $Y_t = Y_s + (Y_t - Y_s)$. It suffices to show that $E[Z_{t_{n+1}} \mid Y_{t_1}, \ldots, Y_{t_n}] = Z_{t_n}$ whenever $t_1 < \cdots < t_{n+1}$.)
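To sanity-check the moment formulas in (a) and (c), one can simulate $Y_t$ for a concrete jump distribution. The sketch below is a minimal Python check; the choices $J_i = \pm 1$ with probability 1/2, $\lambda = 2$, and $t = 3$ are illustrative values, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, t = 2.0, 3.0          # Poisson rate and time horizon (illustrative values)
n_trials = 200_000

# For concreteness take J_i = +/-1 with probability 1/2 each,
# so E[J] = 0 and sigma_J^2 = 1.
N = rng.poisson(lam * t, size=n_trials)        # N_t ~ Poisson(lambda * t)
Y = np.array([rng.choice([-1.0, 1.0], size=n).sum() for n in N])

print("empirical E[Y_t]   :", Y.mean())        # should be close to 0 (part (a))
print("empirical E[Y_t^2] :", (Y**2).mean())   # compare with your answer to (c)
```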
(d) Show that (Z
t
: t 0) dened by Z
t
= Y
2
t
ct is a martingale for some constant c, and identify c. (Hint:
For s < t, Y
t
= Y
s
+(Y
t
Y
s
). It suces to show that E[Z
tn+1
|Y
t1
, . . . , Y
tn
] = Z
tn
whenever t
1
< < t
n+1
.)
4 Hitting the corners of a triangle
Consider a discrete-time Markov process $(X_k : k \geq 0)$, with state space $\{1, 2, 3, 4, 5, 6\}$. Suppose the states
are arranged in the triangle shown,
[Figure: the six states placed around a triangle, with vertex 1 at the apex, vertices 3 and 5 at the base corners, state 4 at the midpoint of the base, and states 2 and 6 at the midpoints of the two sides, so the states appear in cyclic order 1, 2, 3, 4, 5, 6 around the triangle.]
and given $X_k = i$, the next state $X_{k+1}$ is one of the two neighbors of $i$, selected with probability 0.5 each.
Suppose $P\{X_0 = 1\} = 1$.
(a) Let $\tau_B = \min\{k : X_k \in \{3, 4, 5\}\}$. So $\tau_B$ is the time the base of the triangle is first reached. Find $E[\tau_B]$.
(b) Let $\tau_3 = \min\{k : X_k = 3\}$. Find $E[\tau_3]$.
(c) Let $\tau_C$ be the first time $k \geq 1$ such that both states 3 and 5 have been visited by time $k$. Find $E[\tau_C]$.
(Hint: Use the results of (a) and (b) and symmetry.)
(d) Let $\tau_R$ denote the first time $k \geq \tau_C$ such that $X_k = 1$. That is, $\tau_R$ is the first time the process returns to
vertex 1 of the triangle after reaching both of the other vertices. Find $E[\tau_R]$. (Hint: Use the results of (c) and
(b) and symmetry.)
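Answers to Problem 4 can be checked by direct simulation of the walk. The sketch below is a minimal Python check; it assumes the neighbor structure is the cycle 1-2-3-4-5-6-1 suggested by the figure, so adjust the `neighbors` table if your reading of the diagram differs.

```python
import numpy as np

rng = np.random.default_rng(3)
# Neighbors on the assumed cycle 1-2-3-4-5-6-1.
neighbors = {1: (2, 6), 2: (1, 3), 3: (2, 4), 4: (3, 5), 5: (4, 6), 6: (5, 1)}

def run_once(rng):
    x, k = 1, 0
    tau_B = tau_3 = tau_C = tau_R = None
    visited_3 = visited_5 = False
    while tau_R is None:
        x = neighbors[x][rng.integers(2)]   # step to a uniformly chosen neighbor
        k += 1
        visited_3 |= (x == 3)
        visited_5 |= (x == 5)
        if tau_B is None and x in (3, 4, 5):
            tau_B = k
        if tau_3 is None and x == 3:
            tau_3 = k
        if tau_C is None and visited_3 and visited_5:
            tau_C = k
        if tau_C is not None and x == 1:
            tau_R = k
    return tau_B, tau_3, tau_C, tau_R

samples = np.array([run_once(rng) for _ in range(20_000)])
for name, col in zip(("E[tau_B]", "E[tau_3]", "E[tau_C]", "E[tau_R]"), samples.T):
    print(name, "~=", col.mean())
```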
5 Marginal distributions for a continuous-time three-state Markov process
Consider a continuous-time Markov process with the transition rate diagram shown, for positive constants
$\alpha$ and $\beta$.
[Transition rate diagram: three states, labeled 1, 0, and 2, with transition rates given in terms of $\alpha$ and $\beta$; the diagram is not reproduced here.]
(a) Give the generator matrix $Q$.
(b) Find the equilibrium probability distribution, $\pi(\infty)$.
(c) Find the marginal distribution $\pi(t)$ for $t \geq 0$, for the initial distribution $\pi(0) = (1, 0, 0)$. (Hint: Focus on
differential equations for $\pi_1(t)$ and $\pi_0(t) - \pi_2(t)$.)
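Once you have written down $Q$ in part (a), the answer to (c) can be checked numerically via $\pi(t) = \pi(0)e^{Qt}$. The sketch below uses `scipy.linalg.expm`; the matrix shown is only a placeholder generator (the problem's rate diagram is not reproduced above), so substitute your own $Q$.

```python
import numpy as np
from scipy.linalg import expm

# Placeholder generator matrix for a 3-state chain: replace with the Q from
# part (a).  Rows must sum to zero; off-diagonal entries are transition rates.
alpha, beta = 1.0, 2.0
Q = np.array([[-alpha,          alpha,    0.0],
              [  beta, -(alpha + beta),  alpha],
              [   0.0,           beta,  -beta]])

pi0 = np.array([1.0, 0.0, 0.0])    # initial distribution pi(0) = (1, 0, 0)

for t in (0.5, 1.0, 5.0, 50.0):
    pi_t = pi0 @ expm(Q * t)       # pi(t) = pi(0) exp(Qt)
    print(f"t = {t:5.1f}   pi(t) = {np.round(pi_t, 4)}")
# For large t, pi(t) should approach the equilibrium distribution from (b).
```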
6 Conditioning a Gauss-Markov process
Let $X = (X_t : t \in \mathbb{R})$ denote a mean zero, stationary Gaussian process with $R_X(\tau) = e^{-|\tau|}$. Note that $X$
is a Markov process. This problem focuses on the conditional distribution of $X$ given $X_0 = 0$. Since all the
coordinates of $X$ are jointly Gaussian, the conditional distribution of $X$ is also the distribution of a Gaussian
random process, and it is uniquely determined by the mean function $\mu^o_X(t) := E[X_t \mid X_0 = 0]$ and correlation
function $R^o_X(s, t) := \mathrm{Cov}(X_s, X_t \mid X_0)$. Here, $\mathrm{Cov}(X_s, X_t \mid X_0)$ represents the covariance between $X_s$ and $X_t$
computed under the joint conditional distribution of $X_s$ and $X_t$ given $X_0$, and by an important property of
joint Gaussian distributions it does not depend on the value of $X_0$.
(a) Find $\mu^o_X(t)$ for $t \in \mathbb{R}$.
(b) Find $R^o_X(s, t)$ for $s, t \in \mathbb{R}$.
(c) Is the conditional distribution Markovian? That is, is a Gaussian random process with mean function
$\mu^o_X$ and correlation function $R^o_X$ Markov? Justify your answer.
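Parts (a) and (b) can be checked numerically: for fixed $s$ and $t$, the conditional mean and covariance follow from the joint Gaussian distribution of $(X_s, X_t, X_0)$ and the standard Gaussian conditioning formulas. The sketch below is a minimal Python check using $R_X(\tau) = e^{-|\tau|}$; compare its output with your formulas for $\mu^o_X$ and $R^o_X$.

```python
import numpy as np

def R(tau):
    # Autocorrelation of the stationary process: R_X(tau) = exp(-|tau|).
    return np.exp(-np.abs(tau))

def conditional_moments(s, t):
    # Joint covariance matrix of (X_s, X_t, X_0); the process is mean zero.
    Sigma = np.array([[R(0.0),   R(s - t), R(s)],
                      [R(t - s), R(0.0),   R(t)],
                      [R(s),     R(t),     R(0.0)]])
    Sigma_aa = Sigma[:2, :2]          # Cov of (X_s, X_t)
    Sigma_ab = Sigma[:2, 2:]          # Cov of (X_s, X_t) with X_0
    Sigma_bb = Sigma[2:, 2:]          # Var of X_0
    # Standard Gaussian conditioning: given X_0 = 0 (and zero means), the
    # conditional mean is Sigma_ab Sigma_bb^{-1} * 0, and the conditional
    # covariance is Sigma_aa - Sigma_ab Sigma_bb^{-1} Sigma_ab^T.
    cond_mean = Sigma_ab @ np.linalg.solve(Sigma_bb, np.array([0.0]))
    cond_cov = Sigma_aa - Sigma_ab @ np.linalg.solve(Sigma_bb, Sigma_ab.T)
    return cond_mean, cond_cov

for s, t in [(0.5, 1.0), (-1.0, 2.0), (1.0, 3.0)]:
    m, C = conditional_moments(s, t)
    print(f"s={s}, t={t}:  conditional means = {np.round(m, 4)},  "
          f"Cov(X_s, X_t | X_0) = {np.round(C[0, 1], 4)}")
```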