
UC Berkeley

Department of Electrical Engineering and Computer Science


EE 126: Probability and Random Processes
Problem Set 8
Fall 2007
Issued: Thursday, October 25, 2007

Due: Friday, November 2, 2007

Reading: Bertsekas & Tsitsiklis, Sections 4.4-4.7


Problem 8.1
Let X_1, X_2, ... be independent and identically distributed random variables, where each X_i
is distributed according to the logarithmic PMF with parameter p, i.e.,

    p_X(k) = \frac{(1-p)^k}{k \ln(1/p)},    k = 1, 2, 3, ...,

where 0 < p < 1. Discrete random variable N has the Poisson PMF with parameter \lambda, i.e.,

    p_N(k) = e^{-\lambda} \frac{\lambda^k}{k!},    k = 0, 1, 2, ...,

where \lambda > 0.

(a) Determine the transform M_X(s) associated with each random variable X_i. Hint: You
may find the following identity useful: provided -1 < a \le 1,

    \ln(1 + a) = a - \frac{a^2}{2} + \frac{a^3}{3} - \frac{a^4}{4} + ...

(b) Defining Y = \sum_{i=1}^{N} X_i, determine the transform M_Y(s).

Solution:
1. The transform of p_X(k) = \frac{(1-p)^k}{k \ln(1/p)} is

    M_X(s) = \sum_{k=1}^{\infty} e^{sk} \frac{(1-p)^k}{k \ln(1/p)}
           = \frac{1}{\ln(1/p)} \sum_{k=1}^{\infty} \frac{(e^s (1-p))^k}{k}
           = \frac{\ln(1 - e^s(1-p))}{\ln p},

where we used

    -\ln(1 - \alpha) = \alpha + \frac{\alpha^2}{2} + \frac{\alpha^3}{3} + ... = \sum_{k=1}^{\infty} \frac{\alpha^k}{k},    for -1 \le \alpha < 1.

2. The transform of a Poisson random variable with PMF p_N(k) = e^{-\lambda} \frac{\lambda^k}{k!} is given by

    M_N(s) = E[e^{sN}] = e^{\lambda(e^s - 1)}.

Using the law of iterated expectations, the transform of Y = \sum_{i=1}^{N} X_i is

    M_Y(s) = E[e^{sY}] = E[E[e^{sY} | N]]
           = E[E[e^{s \sum_{i=1}^{N} X_i} | N]]
           = E[M_X(s)^N]
           = M_N(s)|_{e^s = M_X(s)}
           = e^{\lambda(e^s - 1)}|_{e^s = M_X(s)}
           = e^{\lambda\left(\frac{\ln(1 - e^s(1-p))}{\ln p} - 1\right)}
           = e^{\frac{\lambda}{\ln p}\left(\ln(1 - e^s(1-p)) - \ln p\right)}
           = \left[e^{\ln(1 - e^s(1-p)) - \ln p}\right]^{\frac{\lambda}{\ln p}}
           = \left[\frac{p}{1 - e^s(1-p)}\right]^{-\frac{\lambda}{\ln p}}.
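As a quick sanity check, the closed form for M_Y(s) can be compared against a Monte Carlo
estimate of E[e^{sY}]. The Python sketch below is one way to do this; it assumes numpy's
logseries sampler, whose PMF -q^k/(k ln(1-q)) matches the logarithmic PMF above when q = 1 - p,
and the values of p, \lambda and s are arbitrary (s only needs to satisfy e^s(1-p) < 1).

    import numpy as np

    rng = np.random.default_rng(0)
    p, lam, s = 0.4, 2.5, -0.3          # need e^s (1-p) < 1 so that M_X(s) exists
    n_trials = 100_000

    # Draw N ~ Poisson(lam), then Y as a sum of N logarithmic(p) variables.
    N = rng.poisson(lam, size=n_trials)
    Y = np.array([rng.logseries(1 - p, size=n).sum() if n > 0 else 0 for n in N])

    mc_estimate = np.mean(np.exp(s * Y))
    closed_form = (p / (1 - np.exp(s) * (1 - p))) ** (-lam / np.log(p))
    print(mc_estimate, closed_form)     # the two values should agree to a few decimals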

Problem 8.2
N male-female couples at a dinner party play the following game. Each member of a couple
writes his/her name on a piece of paper (so there are a total of 2N pieces of paper). Then the
men throw their papers into hat A, while the women throw their papers into hat B. Once all
the papers are in, the host draws a piece of paper from each hat, and the two people chosen
are dance partners for the rest of the night. The host continues in this fashion until all 2N
people are paired up for the night. Let M be the number of couples that are reunited by this
game. Find E[M] and Var(M). Hint: set up indicator variables and use covariance.
Solution:
We will use indicator variables to solve this problem. Define variables {X_i} such that
X_i = 1 if couple i is reunited, and X_i = 0 otherwise. Clearly M = \sum_{i=1}^{N} X_i. We know that:

    E[M] = \sum_{i=1}^{N} E[X_i] = N E[X_i].

Now we easily see that E[X_i] = 1/N, and therefore E[M] = 1. Finding the variance is a bit
more tricky since the X_i are not independent. Thus we have:

    Var(M) = \sum_{i=1}^{N} Var(X_i) + 2 \sum_{i<j} Cov(X_i, X_j).

Now,

    Var(X_i) = E[X_i^2] - E[X_i]^2 = E[X_i] - E[X_i]^2 = \frac{1}{N} - \frac{1}{N^2} = \frac{N-1}{N^2}.

Furthermore,

    Cov(X_i, X_j) = E[X_i X_j] - E[X_i] E[X_j] = \frac{1}{N} \cdot \frac{1}{N-1} - \frac{1}{N^2} = \frac{1}{N^2(N-1)}.

Thus we find:

    Var(M) = N \cdot \frac{N-1}{N^2} + 2 \binom{N}{2} \frac{1}{N^2(N-1)}
           = \frac{N-1}{N} + 2 \cdot \frac{N(N-1)}{2} \cdot \frac{1}{N^2(N-1)}
           = \frac{N-1}{N} + \frac{1}{N}
           = 1.
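The answers E[M] = 1 and Var(M) = 1 are easy to confirm by simulation. A minimal sketch,
with an arbitrary choice of N and number of trials:

    import numpy as np

    rng = np.random.default_rng(1)
    N, n_trials = 8, 100_000

    # Keep the men of hat A in order 0..N-1; a random permutation of hat B decides
    # which woman is drawn with each man.  Couple i is reunited when perm[i] == i.
    matches = np.array([np.sum(rng.permutation(N) == np.arange(N)) for _ in range(n_trials)])
    print(matches.mean(), matches.var())   # both should be close to 1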

Problem 8.3
Suppose that the random variables X_1, X_2 have the joint density function

    f_{X_1,X_2}(x_1, x_2) = \frac{1}{2\pi} \exp\left\{-\frac{1}{2}\left(x_1^2 - 6 x_1 x_2 + 10 x_2^2\right)\right\},    for -\infty < x_1, x_2 < \infty.

(a) Find \mu_{x_1}, \mu_{x_2}, Cov(x_1, x_2) and the correlation coefficient \rho.

(b) Find E[x_2 | x_1 = 3].

(c) For y_1 = a x_1 + b x_2 and y_2 = c x_2, find a, b, c such that \rho_{y_1,y_2} = 0 and \sigma^2_{y_1} = \sigma^2_{y_2} = 1.
Solution:
1. To find the means, the covariance, and the correlation coefficient, we need to compare the
given PDF to the generic bivariate normal PDF:

    \frac{1}{2\pi \sigma_{x_1} \sigma_{x_2} \sqrt{1-\rho^2}} \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(x_1-\mu_{x_1})^2}{\sigma_{x_1}^2} - \frac{2\rho(x_1-\mu_{x_1})(x_2-\mu_{x_2})}{\sigma_{x_1}\sigma_{x_2}} + \frac{(x_2-\mu_{x_2})^2}{\sigma_{x_2}^2}\right]\right\}

and we find \mu_{x_1} = \mu_{x_2} = 0. (For more background on the bivariate normal random
variable, please refer to Chapter 4 of the course notes.) Therefore:

    \frac{1}{2(1-\rho^2)}\left[\frac{x_1^2}{\sigma_{x_1}^2} - \frac{2\rho x_1 x_2}{\sigma_{x_1}\sigma_{x_2}} + \frac{x_2^2}{\sigma_{x_2}^2}\right] = \frac{1}{2}\left(x_1^2 - 6 x_1 x_2 + 10 x_2^2\right),

so that

    \frac{1}{(1-\rho^2)\sigma_{x_1}^2} = 1,    \frac{2\rho}{(1-\rho^2)\sigma_{x_1}\sigma_{x_2}} = 6,    \frac{1}{(1-\rho^2)\sigma_{x_2}^2} = 10,

which gives

    \sigma_{x_1}^2 = 10,    \sigma_{x_2}^2 = 1,    \rho = \frac{3}{\sqrt{10}},

and Cov(x_1, x_2) = \rho \sigma_{x_1} \sigma_{x_2} = 3.

2. To find the conditional mean of x_2 given x_1 = 3 we use the formula

    E[x_2 | x_1] = \mu_{x_2} + \rho \frac{\sigma_{x_2}}{\sigma_{x_1}} (x_1 - \mu_{x_1})

and we find

    E[x_2 | x_1 = 3] = \frac{3}{\sqrt{10}} \cdot \frac{1}{\sqrt{10}} \cdot 3 = \frac{9}{10}.

3. We want to find a, b, c such that \rho_{y_1,y_2} = 0 and \sigma^2_{y_1} = \sigma^2_{y_2} = 1. We can see that

    E[y_1] = E[y_2] = 0.

Thus we have \sigma^2_{y_2} = E[y_2^2] = c^2 E[x_2^2] = c^2, and for this to equal 1 we need c = \pm 1. We also have:

    \sigma^2_{y_1} = E[y_1^2] = E[a^2 x_1^2 + 2ab x_1 x_2 + b^2 x_2^2] = 10a^2 + 6ab + b^2 = 1,    and

    Cov(y_1, y_2) = c\left(a E[x_1 x_2] + b E[x_2^2]\right) = c(3a + b) = 0    \Rightarrow    b = -3a.

Substituting, 10a^2 - 18a^2 + 9a^2 = a^2 = 1, so a = \pm 1. The solutions are:

    a = 1, b = -3, c = 1
    a = 1, b = -3, c = -1
    a = -1, b = 3, c = 1
    a = -1, b = 3, c = -1
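A compact way to double-check part 1 is to note that the exponent of the given density is
-(1/2) x^T Q x with Q = [[1, -3], [-3, 10]], so the covariance matrix is Q^{-1}. A short
numerical check of this (and of the conditional mean in part 2), assuming numpy:

    import numpy as np

    Q = np.array([[1.0, -3.0], [-3.0, 10.0]])      # precision matrix from the exponent
    Sigma = np.linalg.inv(Q)
    print(Sigma)                                   # [[10. 3.] [3. 1.]]
    rho = Sigma[0, 1] / np.sqrt(Sigma[0, 0] * Sigma[1, 1])
    print(rho, 3 / np.sqrt(10))                    # both ~0.9487
    # Conditional mean E[x2 | x1 = 3] = (Sigma[0,1] / Sigma[0,0]) * 3 = 9/10
    print(Sigma[0, 1] / Sigma[0, 0] * 3)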

Problem 8.4
The receiver in an optical communications system uses a photodetector that counts the number
of photons that arrive during the communication session. (The duration of the communication
session is 1 time unit.)

The sender conveys information by either transmitting or not transmitting photons to the
photodetector. The probability of transmitting is p. If she transmits, the number of photons
X that she transmits during the session has a Poisson PMF with mean \lambda per time unit. If
she does not transmit, she generates no photons.

Unfortunately, regardless of whether or not she transmits, there may still be photons
arriving at the photodetector because of a phenomenon called shot noise. The number N of
photons that arrive because of the shot noise has a Poisson PMF with mean \mu. N and X
are independent. The total number of photons counted by the photodetector is equal to the
sum of the transmitted photons and the photons generated by the shot noise effect.
(a) What is the probability that the sender transmitted if the photodetector counted k photons?
(b) Before you know anything else about a particular communication session, what is your
least squares estimate of the number of photons transmitted?
(c) What is the least squares estimate of the number of photons transmitted by the sender
if the photodetector counted k photons?
(d) What is the best linear predictor for the number of photons transmitted by the sender
as a function of k, the number of the detected photons?
Solution:

1. Let A be the event that the sender transmitted, and K be the number of photons
counted by the photodetector. Using Bayes' rule,

    P(A | K = k) = \frac{p_{K|A}(k) P(A)}{p_K(k)} = \frac{p_{X+N}(k) \cdot p}{p_N(k) \cdot (1-p) + p_{X+N}(k) \cdot p}.

The discrete random variables X and N are given by the following PMFs:

    p_X(x) = \frac{\lambda^x e^{-\lambda}}{x!},    x \ge 0,        p_N(n) = \frac{\mu^n e^{-\mu}}{n!},    n \ge 0.

The sum of two independent Poisson random variables is also Poisson, with mean equal
to the sum of the means of each of the random variables. This fact can be derived by
looking at the product of the transforms of X and N. Therefore:

    p_{X+N}(k) = \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!},    k \ge 0.

Thus,

    P(A | K = k) = \frac{p \, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!}}{p \, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!} + (1-p) \, \frac{\mu^k e^{-\mu}}{k!}} = \frac{1}{1 + \frac{1-p}{p} e^{\lambda} \left(\frac{\mu}{\lambda+\mu}\right)^k}.
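The posterior found above is easy to evaluate numerically. The snippet below is a direct
transcription of the formula; the values of p, lam (\lambda), mu (\mu) and k are purely
illustrative and not part of the problem statement.

    import math

    def posterior_transmitted(k, p, lam, mu):
        """P(sender transmitted | photodetector counted k photons)."""
        ratio = ((1 - p) / p) * math.exp(lam) * (mu / (lam + mu)) ** k
        return 1.0 / (1.0 + ratio)

    for k in (0, 2, 5, 10):
        print(k, posterior_transmitted(k, p=0.5, lam=4.0, mu=1.0))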

2. Let S be the number of photons transmitted by the sender. Then with probability p,
S = X, and with probability 1 - p, S = 0. The least squares estimate of the number of
photons transmitted by the sender is simply the mean, in the absence of any additional
information:

    \hat{S}_1 = E[S] = p\lambda + (1-p) \cdot 0 = p\lambda.
3. The least squares predictor has the form \hat{S}_2(k) = E[S | K = k]. Using Bayes' rule,

    \hat{S}_2(k) = \sum_{s=0}^{k} s \, p_{S|K}(s|k) = \sum_{s=0}^{k} s \, \frac{p_{K|S}(k|s) p_S(s)}{p_K(k)} = \frac{1}{p_K(k)} \sum_{s=0}^{k} s \, p_{K|S}(k|s) p_S(s).

From the definitions of S and K, the following are true:

    p_S(s) = (1-p) + p e^{-\lambda}    for s = 0,        p_S(s) = p \, \frac{\lambda^s e^{-\lambda}}{s!}    for s = 1, 2, ...,

    p_{K|S}(k|s) = p_N(k - s) = \frac{\mu^{k-s} e^{-\mu}}{(k-s)!},

    p_K(k) = p \, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!} + (1-p) \, \frac{\mu^k e^{-\mu}}{k!}.

In order to obtain the last expression, we observe that K = S + N with probability p,
and K = N with probability 1 - p. Substituting into the formula above,

    \hat{S}_2(k) = \frac{1}{p_K(k)} \left[0 \cdot \left((1-p) + p e^{-\lambda}\right) \frac{\mu^k e^{-\mu}}{k!} + \sum_{s=1}^{k} s \, p \, \frac{\lambda^s e^{-\lambda}}{s!} \, \frac{\mu^{k-s} e^{-\mu}}{(k-s)!}\right]

                 = \frac{p e^{-\lambda} e^{-\mu} (\lambda+\mu)^k}{p_K(k) \, k!} \sum_{s=0}^{k} s \, \frac{k!}{s!(k-s)!} \left(\frac{\lambda}{\lambda+\mu}\right)^s \left(\frac{\mu}{\lambda+\mu}\right)^{k-s}

                 = \frac{1}{p_K(k)} \, p e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^k}{k!} \, k \, \frac{\lambda}{\lambda+\mu}

                 = k \, \frac{\lambda}{\lambda+\mu} \cdot \frac{p \, \frac{(\lambda+\mu)^k e^{-(\lambda+\mu)}}{k!}}{p_K(k)},

where the sum over s is recognized as the mean of a binomial(k, \frac{\lambda}{\lambda+\mu}) PMF. Thus

    \hat{S}_2(k) = \frac{k \, \frac{\lambda}{\lambda+\mu}}{1 + \frac{1-p}{p} e^{\lambda} \left(\frac{\mu}{\lambda+\mu}\right)^k}.

Note that as k increases, the estimator can be approximated by k \frac{\lambda}{\lambda+\mu}.

4. The linear least squares predictor has the form

    \hat{S}_3(k) = E[S] + \frac{Cov(S, K)}{\sigma_K^2} (k - E[K]).        (1)

Note that since X and N are independent, S and N are also independent.

    E[S] = p\lambda
    E[S^2] = p E[X^2] + (1-p)(0) = p(\lambda^2 + \lambda)
    \sigma_S^2 = E[S^2] - (E[S])^2 = p(\lambda^2 + \lambda) - (p\lambda)^2 = p(1-p)\lambda^2 + p\lambda.

    E[K] = E[S] + E[N] = p\lambda + \mu
    \sigma_K^2 = \sigma_S^2 + \sigma_N^2 = p(1-p)\lambda^2 + p\lambda + \mu.

Finally, we need to find Cov(S, K).

    Cov(S, K) = E[(S - E[S])(K - E[K])]
              = E[(S - E[S])(S - E[S] + N - E[N])]
              = E[(S - E[S])(S - E[S])] + E[(S - E[S])(N - E[N])]
              = \sigma_S^2 + E[(S - E[S])(N - E[N])]
              = \sigma_S^2
              = p(1-p)\lambda^2 + p\lambda.

Note that we have used the fact that (S - E[S]) and (N - E[N]) are independent, and
E[S - E[S]] = 0 = E[N - E[N]].

Therefore, substituting all of these quantities into equation (1), we get the linear
predictor:

    \hat{S}_3(k) = p\lambda + \frac{p(1-p)\lambda^2 + p\lambda}{p(1-p)\lambda^2 + p\lambda + \mu} \left(k - p\lambda - \mu\right).
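A Monte Carlo sketch can be used to compare the estimators of parts 3 and 4 with the
empirical conditional mean E[S | K = k]; all parameter values below are illustrative choices,
not part of the problem statement.

    import numpy as np

    rng = np.random.default_rng(2)
    p, lam, mu = 0.6, 5.0, 2.0
    n = 500_000

    transmit = rng.random(n) < p
    S = np.where(transmit, rng.poisson(lam, n), 0)        # photons sent
    K = S + rng.poisson(mu, n)                            # photons counted

    def s2(k):                                            # least squares estimate E[S | K = k]
        ratio = ((1 - p) / p) * np.exp(lam) * (mu / (lam + mu)) ** k
        return k * lam / (lam + mu) / (1 + ratio)

    ES, EK = p * lam, p * lam + mu
    var_S = p * (1 - p) * lam**2 + p * lam

    def s3(k):                                            # linear least squares estimate
        return ES + var_S / (var_S + mu) * (k - EK)

    for k in (3, 6, 10):
        print(k, S[K == k].mean(), s2(k), s3(k))          # s2 should track the empirical mean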
Problem 8.5
Let X and Y be two random variables with positive variances, associated with the same
experiment.

(a) Show that \rho, the correlation coefficient of X and Y, equals -1 if and only if there
exists a constant b and a negative constant a such that Y = aX + b.

(b) Let \hat{X}_L be the linear least mean squares estimator of X based on Y. Show that

    E[(X - \hat{X}_L)Y] = 0.

Use this property to show that the correlation of the estimation error X - \hat{X}_L with Y is
zero.

(c) Let \hat{X} = E[X|Y] be the least mean squares estimator of X given Y. Show that

    E[(X - \hat{X})h(Y)] = 0,

for any function h.

(d) Is it true that the estimation error X - E[X|Y] is independent of Y?
Solution:
(a) First suppose there exists a constant b and a negative constant a such that Y = aX + b.
Writing U = (X - E[X])/\sigma_X and V = (Y - E[Y])/\sigma_Y, we have

    \rho = E[UV] = E\left[\frac{X - E[X]}{\sigma_X} \cdot \frac{Y - E[Y]}{\sigma_Y}\right].

Now, Y = aX + b, E[Y] = aE[X] + b and var(Y) = a^2 var(X). Therefore \sigma_Y = -a\sigma_X
(since a < 0). We obtain

    \rho = E\left[\frac{X - E[X]}{\sigma_X} \cdot \frac{aX + b - aE[X] - b}{-a\sigma_X}\right] = -E\left[\frac{X - E[X]}{\sigma_X} \cdot \frac{X - E[X]}{\sigma_X}\right] = -E[U^2] = -1.

Now we do the proof in the other direction. Suppose that \rho = E[UV] = -1. First
we will show that E[(U - E[UV]V)^2] = 0. We observe that E[U] = E[V] = 0 and
E[U^2] = E[V^2] = 1. Then,

    E[(U - E[UV]V)^2] = E[U^2] - 2(E[UV])^2 + (E[UV])^2 E[V^2] = 1 - (E[UV])^2 = 0.

However, by the definition of expectation, we have E[(U - E[UV]V)^2] = \int\int (u - E[UV]v)^2 f_{U,V}(u, v) \, du \, dv.
The integrand is \ge 0 because it is a squared term. Therefore, the integrand must be zero
in order for the integral to be zero. Therefore U = E[UV]V = -V. Thus

    \frac{X - E[X]}{\sigma_X} = -\frac{Y - E[Y]}{\sigma_Y},

which means Y = aX + b where a = -\frac{\sigma_Y}{\sigma_X} and b = E[Y] + \frac{\sigma_Y}{\sigma_X} E[X]. This completes the proof.
(b) \hat{X}_L = E[X] + \frac{cov(X, Y)}{\sigma_Y^2}(Y - E[Y]), so we have

    E[(X - \hat{X}_L)Y] = E\left[XY - \left(E[X] + \frac{cov(X, Y)}{\sigma_Y^2}(Y - E[Y])\right)Y\right]
                        = E\left[XY - E[X]Y - \frac{cov(X, Y)}{\sigma_Y^2}\left(Y^2 - Y E[Y]\right)\right]
                        = E[XY] - E[X]E[Y] - \frac{cov(X, Y) E[Y^2]}{\sigma_Y^2} + \frac{cov(X, Y) E[Y]^2}{\sigma_Y^2}
                        = cov(X, Y)\left[1 - \frac{E[Y^2] - E[Y]^2}{\sigma_Y^2}\right]
                        = cov(X, Y)\left[1 - \frac{\sigma_Y^2}{\sigma_Y^2}\right] = 0.

Now, let us examine the correlation of the estimation error X - \hat{X}_L with Y. Since

    \rho_{X - \hat{X}_L, Y} = \frac{cov(X - \hat{X}_L, Y)}{\sigma_{X - \hat{X}_L} \, \sigma_Y},

we must show that cov(X - \hat{X}_L, Y) = 0.

    cov(X - \hat{X}_L, Y) = E[(X - \hat{X}_L)Y] - E[X - \hat{X}_L]E[Y]
                          = -E[X - \hat{X}_L]E[Y]

    E[X - \hat{X}_L] = E\left[X - E[X] - \frac{cov(X, Y)}{\sigma_Y^2}(Y - E[Y])\right]
                     = E[X] - E[X] - \frac{cov(X, Y)}{\sigma_Y^2} E[Y - E[Y]]
                     = 0.

Hence, we have proven the desired property.
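The orthogonality property just proved is easy to illustrate numerically for an arbitrary
dependent pair (X, Y); the particular joint distribution below is only an example.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    Y = rng.exponential(2.0, n)
    X = 0.5 * Y + rng.normal(0.0, 1.0, n)            # some dependence between X and Y

    cov_XY = np.mean((X - X.mean()) * (Y - Y.mean()))
    X_L = X.mean() + cov_XY / Y.var() * (Y - Y.mean())   # linear LMS estimator of X from Y

    print(np.mean((X - X_L) * Y))                    # ~0 (up to floating point error)
    print(np.corrcoef(X - X_L, Y)[0, 1])             # ~0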

(c) E[(X - \hat{X})h(Y)] = E[(X - E[X|Y])h(Y)], by definition of \hat{X}. Now applying the linearity
of expectations, we get

    E[Xh(Y)] - E[E[X|Y]h(Y)] = E[Xh(Y)] - E[E[Xh(Y)|Y]]
                             = E[Xh(Y)] - E[Xh(Y)] = 0,

where the first equality is obtained by noticing that E[X|Y] is an expectation taken with respect
to X, thus allowing h(Y) to be pulled into the conditional expectation. The second equality
results from the law of iterated expectations.

(d) No. Consider the following counterexample (as in Problem 8.3). Let X and Y be discrete
random variables with the following joint PMF:

    p_{X,Y}(x, y) = 1/4    for (x, y) = (1, 0), (0, 1), (-1, 0), (0, -1),
    p_{X,Y}(x, y) = 0      otherwise.

Notice that E[X|Y = y] = 0 for all feasible y. So, E[X|Y] = 0. More precisely, E[X|Y]
is a random variable that equals zero with probability one. So, we have X - E[X|Y] = X
(with probability one). X and Y are not independent, so X - E[X|Y] is not independent of Y.
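The counterexample can also be checked directly: the conditional mean of X is zero for every
value of Y, yet X and Y are clearly dependent (for instance, P(X = 1, Y = 1) = 0 while
P(X = 1)P(Y = 1) = 1/16).

    # Enumerate the joint PMF of the counterexample and verify both claims.
    support = [(1, 0), (0, 1), (-1, 0), (0, -1)]
    pmf = {xy: 0.25 for xy in support}

    for y in (-1, 0, 1):
        py = sum(p for (xv, yv), p in pmf.items() if yv == y)
        cond_mean = sum(xv * p for (xv, yv), p in pmf.items() if yv == y) / py
        print("E[X | Y =", y, "] =", cond_mean)              # always 0

    p_x1 = sum(p for (xv, _), p in pmf.items() if xv == 1)   # 1/4
    p_y1 = sum(p for (_, yv), p in pmf.items() if yv == 1)   # 1/4
    print(pmf.get((1, 1), 0.0), p_x1 * p_y1)                 # 0.0 vs 0.0625 -> dependent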
Problem 8.6
Let random variables X and Y have the bivariate normal PDF

    f_{X,Y}(x, y) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left\{-\frac{x^2 - 2\rho xy + y^2}{2(1-\rho^2)}\right\},    -\infty < x, y < \infty,

where \rho denotes the correlation coefficient between X and Y.

(a) Determine the numerical values of E[X], var(X), E[Y] and var(Y).

(b) Show that X and Z = (Y - \rho X)/\sqrt{1-\rho^2} are independent normal random variables, and
determine the numerical values of E[Z] and var(Z).

(c) Deduce that

    P\left(\{X > 0\} \cap \{Y > 0\}\right) = \frac{1}{4} + \frac{1}{2\pi}\sin^{-1}\rho.

Solution
1. Two random variables X and Y are said to be jointly normal if they can be expressed
in the form

    X = aU + bV,    Y = cU + dV,

where U and V are independent normal random variables (Section 4.7 in the text). The
bivariate normal PDF of X and Y has the form

    f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(x-E[X])^2}{\sigma_X^2} - \frac{2\rho(x-E[X])(y-E[Y])}{\sigma_X\sigma_Y} + \frac{(y-E[Y])^2}{\sigma_Y^2}\right]\right\},
        -\infty < x, y < \infty,

where E[X], E[Y], \sigma_X^2, \sigma_Y^2 are the means and variances of X and Y.

By inspection, E[X] = E[Y] = 0 and var(X) = var(Y) = 1 for

    f_{X,Y}(x, y) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\left\{-\frac{x^2 - 2\rho xy + y^2}{2(1-\rho^2)}\right\},    -\infty < x, y < \infty.

2. From part (a), X is normal since it is a linear combination of two independent normal
random variables. Since Z is a linear combination of X and Y, it can also be expressed
as a linear combination of the independent random variables U and V. Therefore, Z is also
normal. To show that X and Z are independent, we need to show that cov(X, Z) = 0.
(Independence implies zero covariance, but the converse is not true except for jointly normal
random variables.)

    Z = \frac{Y}{\sqrt{1-\rho^2}} - \frac{\rho X}{\sqrt{1-\rho^2}}

    E[Z] = \frac{E[Y]}{\sqrt{1-\rho^2}} - \frac{\rho E[X]}{\sqrt{1-\rho^2}} = 0

    \sigma_Z^2 = \frac{\sigma_Y^2}{1-\rho^2} + \frac{\rho^2\sigma_X^2}{1-\rho^2} - \frac{2\rho \, cov(X, Y)}{1-\rho^2} = \frac{1}{1-\rho^2} + \frac{\rho^2}{1-\rho^2} - \frac{2\rho^2}{1-\rho^2} = \frac{1-\rho^2}{1-\rho^2} = 1.

Solving for the covariance of X and Z, we have

    cov(X, Z) = E[(X - E[X])(Z - E[Z])]
              = E[XZ] = E\left[X\left(\frac{Y}{\sqrt{1-\rho^2}} - \frac{\rho X}{\sqrt{1-\rho^2}}\right)\right]
              = \frac{E[XY]}{\sqrt{1-\rho^2}} - \frac{\rho E[X^2]}{\sqrt{1-\rho^2}}
              = \frac{E[XY]}{\sqrt{1-\rho^2}} - \frac{\rho}{\sqrt{1-\rho^2}}
              = \frac{E[XY]}{\sqrt{1-\rho^2}} - \frac{E[XY]}{\sqrt{1-\rho^2}}
              = 0.

Therefore, X and Z are independent.
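These properties of Z are easy to confirm by simulation for any particular \rho; the value
\rho = 0.6 below is an arbitrary choice.

    import numpy as np

    rng = np.random.default_rng(4)
    rho, n = 0.6, 500_000
    cov = np.array([[1.0, rho], [rho, 1.0]])
    X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    Z = (Y - rho * X) / np.sqrt(1 - rho**2)
    print(Z.mean(), Z.var(), np.cov(X, Z)[0, 1])   # ~0, ~1, ~0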
3. Since Z = \frac{Y - \rho X}{\sqrt{1-\rho^2}}, if Y > 0 then Z > -\frac{\rho X}{\sqrt{1-\rho^2}}. Therefore we have

    P(X > 0, Y > 0) = P\left(X > 0, \; Z > -\frac{\rho X}{\sqrt{1-\rho^2}}\right)
                    = \int_{x > 0} \int_{z > -\rho x/\sqrt{1-\rho^2}} \frac{1}{2\pi} \exp\left(-\frac{x^2 + z^2}{2}\right) dz \, dx.

Changing to polar coordinates yields

    P(X > 0, Y > 0) = \int_{\theta = -\alpha}^{\pi/2} \int_{r=0}^{\infty} \frac{1}{2\pi} \exp\left(-\frac{r^2}{2}\right) r \, dr \, d\theta
                    = \int_{\theta = -\alpha}^{\pi/2} \frac{1}{2\pi} \, d\theta,        where \alpha = \tan^{-1}\left(\frac{\rho}{\sqrt{1-\rho^2}}\right) = \sin^{-1}\rho,
                    = \frac{1}{4} + \frac{1}{2\pi}\sin^{-1}\rho.
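Finally, the orthant probability of part 3 can be checked against the closed form
1/4 + \sin^{-1}(\rho)/(2\pi) by simulation; again \rho = 0.6 is an arbitrary choice.

    import numpy as np

    rng = np.random.default_rng(5)
    rho, n = 0.6, 1_000_000
    cov = np.array([[1.0, rho], [rho, 1.0]])
    X, Y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

    mc = np.mean((X > 0) & (Y > 0))
    closed = 0.25 + np.arcsin(rho) / (2 * np.pi)
    print(mc, closed)                              # both ~0.3524 for rho = 0.6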
