
UCSD ECE153 Handout #26

Prof. Young-Han Kim Thursday, May 1, 2014

Solutions to Homework Set #4


(Prepared by Fatemeh Arbabjolfaei)

1. Two envelopes. An amount A is placed in one envelope and the amount 2A is placed in
another envelope. The amount A is fixed but unknown to you. The envelopes are shuffled
and you are given one of the envelopes at random. Let X denote the amount you observe in
this envelope. Designate by Y the amount in the other envelope. Thus
    (X, Y) = { (A, 2A)  with probability 1/2,
             { (2A, A)  with probability 1/2.

You may keep the envelope you are given, or you can switch envelopes and receive the amount
in the other envelope.

(a) Find E(X) and E(Y ).


(b) Find E(X/Y) and E(Y/X).
(c) Suppose you switch. What is the expected amount you receive?

Solution:

(a) The expected amount in the first envelope is

    E(X) = Σ_{x∈X} x p_X(x) = (1/2)A + (1/2)(2A) = (3/2)A.

Since you are given one of the envelopes at random, the expectation is the same for
the envelope you are not given. Thus the expected amount in the second envelope is
E(Y) = (3/2)A.
(b) The expected factor by which the amount in the second envelope exceeds the amount in
the first is

    E(X/Y) = Σ_{(x,y)∈X×Y} (x/y) p_{XY}(x, y) = (1/2)(A/(2A)) + (1/2)((2A)/A) = 1/4 + 1 = 5/4.

Similarly,

    E(Y/X) = Σ_{(x,y)∈X×Y} (y/x) p_{XY}(x, y) = (1/2)(A/(2A)) + (1/2)((2A)/A) = 1/4 + 1 = 5/4.

(c) If you switch, the expected amount you will receive is E(Y) = (3/2)A.
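As a sanity check, parts (a) and (b) can be verified with a short Monte Carlo sketch (the seed, sample size, and the choice A = 1 are arbitrary):

```python
import random

# Simulate the shuffled envelopes: (X, Y) is (A, 2A) or (2A, A) equally often.
random.seed(0)
A, n = 1.0, 200_000
sum_x = sum_y = sum_ratio = 0.0
for _ in range(n):
    x, y = random.choice([(A, 2 * A), (2 * A, A)])
    sum_x += x
    sum_y += y
    sum_ratio += x / y
print(sum_x / n, sum_y / n, sum_ratio / n)  # ≈ 1.5 (= 3A/2), 1.5, and 1.25 (= 5/4)
```

Note that E(X/Y) = 5/4 > 1 even though switching gives no advantage in expectation, which is the point of the exercise.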

2. Tall trees. Suppose that the average height of trees on campus is 20 feet. Argue that no more
than half of the tree population is taller than 40 feet.
Solution: The average height of the trees in the population is 20 feet, so (1/n) Σ_{i=1}^{n} h_i = 20,
where n is the population size and h_i is the height of the i-th tree. If more than half of the
trees were taller than 40 feet, then since every height is nonnegative the average would exceed
(1/2) · 40 = 20 feet, a contradiction. Thus no more than half of the trees are taller than 40 feet.
Alternatively, we can apply the Markov inequality to the fraction of the population above 40
feet to obtain the same result.
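The counting argument can be illustrated numerically; the uniform heights below are an arbitrary test population, not data from the problem:

```python
import random

# For any list of nonnegative heights, the fraction strictly taller than
# twice the mean can never exceed 1/2 (otherwise the mean would be larger).
random.seed(1)
heights = [random.uniform(0, 60) for _ in range(1000)]
mean = sum(heights) / len(heights)
frac_tall = sum(h > 2 * mean for h in heights) / len(heights)
print(frac_tall, "<=", 0.5)
```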

3. Random phase signal. Let Y (t) = sin(ωt + Θ) be a sinusoidal signal with random phase
Θ ∼ Unif[−π, π]. Assume here that ω and t are constants. Find the mean and variance of
Y (t). Do they depend on t?

Solution: We have

    E(Y(t)) = ∫_{−π}^{π} (1/2π) sin(ωt + θ) dθ = 0,

    Var(Y(t)) = E(Y²(t)) − (E(Y(t)))² = E(Y²(t))
              = ∫_{−π}^{π} (1/2π) sin²(ωt + θ) dθ
              = (1/2π) ∫_{−π}^{π} [1/2 − (1/2) cos(2ωt + 2θ)] dθ
              = 1/2.

So neither the mean nor the variance depends on t.
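A quick simulation confirms that the answers do not change with t (ω = 3 and the two time points are arbitrary choices):

```python
import math
import random

# Sample Θ ~ Unif[−π, π] and estimate the mean and variance of
# Y(t) = sin(ωt + Θ) at two different times.
random.seed(0)
omega, n = 3.0, 200_000
for t in (0.0, 1.7):
    ys = [math.sin(omega * t + random.uniform(-math.pi, math.pi)) for _ in range(n)]
    mean = sum(ys) / n
    var = sum(y * y for y in ys) / n - mean ** 2
    print(t, round(mean, 3), round(var, 3))  # mean ≈ 0 and var ≈ 0.5 at every t
```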

4. Iterated expectation. Let Λ and X be two random variables with


 5 2
3λ , 0 ≤ λ ≤ 1
3
Λ ∼ fΛ (λ) =
0, otherwise,

and X|{Λ = λ} ∼ Exp(λ). Find E(X).

Solution: Since X | (Λ = λ) ∼ Exp(λ), we have

    E(X | Λ = λ) = 1/λ.

Thus,

    E(X) = E(E(X | Λ)) = E(1/Λ) = ∫_{−∞}^{∞} (1/λ) f_Λ(λ) dλ = ∫_0^1 (1/λ) · (5/3) λ^{2/3} dλ = 5/2.
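A simulation sketch of this iterated expectation. The CDF of Λ is F_Λ(λ) = λ^{5/3} on [0, 1], so inverse-transform sampling gives Λ = U^{3/5}. Note that X is heavy-tailed here (its variance is infinite), so the sample mean converges slowly:

```python
import random

# Draw Λ from f_Λ, then X | Λ = λ ~ Exp(λ), and average the X's.
random.seed(0)
n = 400_000
total = 0.0
for _ in range(n):
    lam = random.random() ** 0.6      # inverse CDF: Λ = U^{3/5}
    total += random.expovariate(lam)  # expovariate takes the rate λ
print(total / n)  # ≈ 5/2, with noticeable Monte Carlo error from the heavy tail
```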

5. Sum of packet arrivals. Consider a network router with two types of incoming packets,
wireline and wireless. Let the random variable N1 (t) denote the number of wireline packets
arriving during time (0, t] and let the random variable N2 (t) denote the number of wireless

packets arriving during time (0, t]. Suppose N1 (t) and N2 (t) are independent Poisson with
pmfs
    P{N1(t) = n} = ((λ1 t)^n / n!) e^{−λ1 t}  for n = 0, 1, 2, . . . ,
    P{N2(t) = k} = ((λ2 t)^k / k!) e^{−λ2 t}  for k = 0, 1, 2, . . . .
Let N (t) = N1 (t) + N2 (t) be the total number of packets arriving at the router during time
(0, t].

(a) Find the mean E(N (t)) and variance Var(N (t)) of the total number of packet arrivals.
(b) Find the pmf of N (t).
(c) Let the random variable Y be the time to receive the first packet of either type. Find
the pdf of Y .
(d) What is the probability that the first received packet is wireless?

Solution:

(a) Since N1 and N2 are independent,

    E(N) = E(N1) + E(N2) = λ1 t + λ2 t = (λ1 + λ2)t

and

    Var(N) = Var(N1) + Var(N2) = (λ1 + λ2)t.
(b) In fact, as discussed in class, N(t) is Poisson itself. To see this,

    P{N = n} = Σ_{k=0}^{n} P{N1 = k} P{N2 = n − k}
             = Σ_{k=0}^{n} ((λ1 t)^k / k!) e^{−λ1 t} · ((λ2 t)^{n−k} / (n − k)!) e^{−λ2 t}
             = Σ_{k=0}^{n} C(n, k) (λ1 t)^k (λ2 t)^{n−k} · e^{−(λ1+λ2)t} / n!
             = (((λ1 + λ2)t)^n / n!) e^{−(λ1+λ2)t}

for n = 0, 1, 2, . . . , where the last equality follows from the binomial theorem.
(c) Since N (t) is Poisson with parameter (λ1 +λ2 )t, the time to the first packet is exponential
with parameter λ1 + λ2 (refer to the note below to see why). Therefore,
    f_Y(y) = { (λ1 + λ2) e^{−(λ1+λ2)y}  for y ≥ 0,
             { 0                        otherwise.

Alternatively, we can let X1 and X2 denote the times until first packets of each type and
see that Y = min(X1 , X2 ).

Note: Let N (t) denote the number of packets arriving during time (0, t] and let X be
the time of the arrival of the first packet. Then

P{X > t} = P{N (t) = 0}

If N (t) is a Poisson r.v. with parameter λt, then for t ≥ 0 we have

P{X > t} = P{N (t) = 0} = e−λt

Therefore,

FX (t) = P{X ≤ t} = 1 − P{X > t} = 1 − e−λt

and
    f_X(t) = { λ e^{−λt}  for t ≥ 0,
             { 0          otherwise.

Hence, the time of the arrival of the first packet is exponential with parameter λ.
(d) Let X1 and X2 denote the times until the first packets of each type. According to the note
in the previous part, X1 and X2 are exponential random variables with parameters λ1
and λ2, respectively. Therefore, the probability that the first packet is wireless is simply
P{X2 < X1} = λ2/(λ1 + λ2) (refer to the First available teller problem).
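Parts (c) and (d) can be checked by simulating the two first-arrival times directly (λ1 = 2 and λ2 = 3 are arbitrary rates):

```python
import random

# Draw X1 ~ Exp(λ1) and X2 ~ Exp(λ2); then Y = min(X1, X2) should be
# Exp(λ1 + λ2), and the wireless packet arrives first when X2 < X1.
random.seed(0)
lam1, lam2, n = 2.0, 3.0, 200_000
first_times, wireless_first = [], 0
for _ in range(n):
    x1 = random.expovariate(lam1)
    x2 = random.expovariate(lam2)
    first_times.append(min(x1, x2))
    wireless_first += x2 < x1
print(n / sum(first_times))  # empirical rate of Y, ≈ λ1 + λ2 = 5
print(wireless_first / n)    # ≈ λ2 / (λ1 + λ2) = 0.6
```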

Solutions to Additional Exercises

1. Mean and variance. Let X and Y be random variables with joint pdf
    f_{X,Y}(x, y) = { 1  if |x| + |y| ≤ 1/√2,
                    { 0  otherwise.

Define the random variable Z = |X| + |Y|. Find the mean and variance of Z without first
finding the pdf of Z.

Solution: We have

E(Z) = E(|X|) + E(|Y |),


Var(Z) = E(Z 2 ) − (E(Z))2 = E(X 2 ) + 2E(|XY |) + E(Y 2 ) − (E(|X|) + E(|Y |))2 .

So to find the mean and variance of Z without first finding the pdf of Z, all we need to know
are E(|X|), E(|Y |), E(X 2 ), E(Y 2 ), and E(|XY |). We have
    E(|X|)  = ∫∫ |x| f_{X,Y}(x, y) dy dx  = ∫_{−1/√2}^{1/√2} |x| (√2 − 2|x|) dx = √2/6,

    E(X²)   = ∫∫ x² f_{X,Y}(x, y) dy dx  = ∫_{−1/√2}^{1/√2} x² (√2 − 2|x|) dx = 1/12,

    E(|XY|) = ∫∫ |xy| f_{X,Y}(x, y) dy dx = ∫_{−1/√2}^{1/√2} |x| (1/√2 − |x|)² dx = 1/24,

where in each case the inner integral over y runs from −(1/√2 − |x|) to 1/√2 − |x|.

By symmetry, E(|Y|) = E(|X|) = √2/6 and E(Y²) = E(X²) = 1/12. Hence, we have

    E(Z) = E(|X|) + E(|Y|) = √2/6 + √2/6 = √2/3,

    Var(Z) = E(Z²) − (E(Z))² = E(X²) + 2E(|XY|) + E(Y²) − (E(|X|) + E(|Y|))²
           = 1/12 + 2 · (1/24) + 1/12 − (√2/3)² = 1/4 − 2/9 = 1/36.
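A Monte Carlo sketch of this computation, rejection-sampling the uniform distribution on the diamond (sample size and seed are arbitrary):

```python
import math
import random

# Rejection-sample (X, Y) uniform on the diamond |x| + |y| ≤ 1/√2 from the
# enclosing square, then estimate the mean and variance of Z = |X| + |Y|.
random.seed(0)
a = 1 / math.sqrt(2)
zs = []
while len(zs) < 200_000:
    x = random.uniform(-a, a)
    y = random.uniform(-a, a)
    if abs(x) + abs(y) <= a:  # accept points inside the diamond
        zs.append(abs(x) + abs(y))
n = len(zs)
mean = sum(zs) / n
var = sum(z * z for z in zs) / n - mean ** 2
print(round(mean, 3), round(var, 4))  # ≈ √2/3 ≈ 0.471 and 1/36 ≈ 0.0278
```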

2. Inequalities. Label each of the following statements with =, ≤, or ≥. Justify each answer.
(a) 1/E(X²) vs. E(1/X²).
(b) (E(X))² vs. E(X²).
(c) Var(X) vs. Var(E(X|Y)).
(d) E(X²) vs. E((E(X|Y))²).

Solution:
(a) By the Cauchy–Schwarz inequality, we have

    1 = (E(X · (1/X)))² ≤ E(X²) E(1/X²).

Thus, 1/E(X²) ≤ E(1/X²).
(Or we can use Jensen’s inequality with convex function g(x) = 1/x for x > 0).
(b) We have
Var(X) = E(X 2 ) − (E(X))2 ≥ 0.
Thus, E(X 2 ) ≥ (E(X))2 .
(Or we can use Jensen’s inequality with the convex function g(x) = x2 ).
(c) We will, in fact, show that
Var(X) = Var(E(X|Y )) + E[Var(X|Y )]
so that Var(X) ≥ Var(E(X|Y )) (since E[Var(X|Y )] ≥ 0). To show this, consider that
the conditional variance, by definition, is equal to:
Var(X|Y ) = E[(X − E(X|Y ))2 |Y ]
By following the same steps as in the unconditional case, we may write:
Var(X|Y ) = E(X 2 |Y ) − (E(X|Y ))2
and by taking expectations:
E[Var(X|Y )] = E[E(X 2 |Y )] − E[(E(X|Y ))2 ] = E(X 2 ) − E[(E(X|Y ))2 ].
Also by the definition of the variance:
Var(E(X|Y )) = E[(E(X|Y ))2 ] − (E(X))2
since E(X) = E[E(X|Y )]. By adding the previous two expressions:
Var(X) = Var(E(X|Y )) + E[Var(X|Y )].

(d) From above, we know that Var(X) ≥ Var(E(X|Y )). But Var(X) = E(X 2 ) − (E(X))2
and Var(E(X|Y )) = E[(E(X|Y ))2 ] − (E(X))2 , so we have
E(X 2 ) − (E(X))2 ≥ E[(E(X|Y ))2 ] − (E(X))2 ,
which implies E(X 2 ) ≥ E[(E(X|Y ))2 ].
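The decomposition in part (c) can be checked numerically on a toy model; the choice Y ∼ Unif{0, 1} with X | Y ∼ N(Y, (1 + Y)²) below is arbitrary and not part of the problem:

```python
import random

# For this model: Var(E(X|Y)) = Var(Y) = 1/4 and E(Var(X|Y)) = (1² + 2²)/2 = 5/2,
# so the law of total variance predicts Var(X) = 1/4 + 5/2 = 2.75.
random.seed(0)
n = 400_000
xs = []
for _ in range(n):
    y = random.randint(0, 1)
    xs.append(random.gauss(y, 1 + y))  # X | Y = y ~ Normal(mean y, sd 1 + y)
var_x = sum(x * x for x in xs) / n - (sum(xs) / n) ** 2
print(round(var_x, 2), 0.25 + 2.5)  # both ≈ 2.75
```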

3. Cauchy–Schwarz inequality.

(a) Prove the following inequality: (E(XY ))2 ≤ E(X 2 )E(Y 2 ). (Hint: Use the fact that for
any real t, E((X + tY )2 ) ≥ 0.)
(b) Prove that equality holds if and only if X = cY for some constant c. Find c in terms of
the second moments of X and Y .
(c) Use the Cauchy–Schwarz inequality to show the correlation coefficient |ρX,Y | ≤ 1.
(d) Prove the triangle inequality: √(E((X + Y)²)) ≤ √(E(X²)) + √(E(Y²)).

Solution:

(a) Consider the following quadratic in the parameter t:

    E((X + tY)²) = E(X²) + 2t E(XY) + t² E(Y²).

Since it is the expected value of a nonnegative random variable, E((X + tY)²) ≥ 0 for
every real t. If the quadratic is strictly positive for all t it has no real roots, while if it
touches zero it has one (repeated) real root. In either case the discriminant must satisfy

    4(E(XY))² − 4 E(X²) E(Y²) ≤ 0,

which we can rewrite as

    (E(XY))² ≤ E(X²) E(Y²).
Here is another proof. Let

    t = ± √(E(X²)/E(Y²)).

Plugging these values into E((X + tY)²) ≥ 0, we obtain

    −E(XY) ≤ √(E(X²) E(Y²))   and   E(XY) ≤ √(E(X²) E(Y²)).

Combining the two inequalities for E(XY) yields

    |E(XY)| ≤ √(E(X²) E(Y²))  ⇒  (E(XY))² ≤ E(X²) E(Y²),

which is what we set out to prove.


(b) If X = cY, then

    (E(XY))² = (E(cY · Y))² = c²(E(Y²))² = E((cY)²) E(Y²) = E(X²) E(Y²),

and similarly for Y = cX. Conversely, if (E(XY))² = E(X²)E(Y²), then the discriminant
of the quadratic in part (a) is 0. Therefore E((X + tY)²) = 0 at the root t, which means
that X + tY = 0 with probability 1. Hence X = −tY with probability 1. Clearly,

    t = ± √(E(X²)/E(Y²)).

A similar quadratic can be formed from E((tX + Y)²) = 0 to show that Y = −tX is
another possible solution.

(c) The square of the correlation coefficient is by definition

    ρ²_{X,Y} = Cov²(X, Y) / (Var(X) Var(Y)).

If we define random variables U = X − E(X) and V = Y − E(Y), then

    ρ²_{X,Y} = (E(UV))² / (E(U²) E(V²)) ≤ 1,

where the inequality follows from the Cauchy–Schwarz inequality. Therefore |ρX,Y | ≤ 1 .
(d) By the Cauchy–Schwarz inequality,

    (E(XY))² ≤ E(X²) E(Y²)  ⟹  E(XY) ≤ √(E(X²) E(Y²)).

Therefore

    E((X + Y)²) = E(X²) + 2E(XY) + E(Y²)
                ≤ E(X²) + 2√(E(X²)) √(E(Y²)) + E(Y²)
                = (√(E(X²)) + √(E(Y²)))².

4. Let X and Y have correlation coefficient ρX,Y .

(a) What is the correlation coefficient between X and 3Y ?


(b) What is the correlation coefficient between 2X and −5Y ?

Solution:

(a) By the definition of the correlation coefficient, we have

    ρ_{X,3Y} = Cov(X, 3Y) / √(Var(X) Var(3Y))
             = (E(X · 3Y) − E(X)E(3Y)) / √(Var(X) · 9 Var(Y))
             = 3(E(XY) − E(X)E(Y)) / (3 √(Var(X) Var(Y))) = ρ_{X,Y}.

(b) Similarly, we have

    ρ_{2X,−5Y} = Cov(2X, −5Y) / √(Var(2X) Var(−5Y))
               = (E(−10XY) − E(2X)E(−5Y)) / √(4 Var(X) · 25 Var(Y))
               = −10(E(XY) − E(X)E(Y)) / (10 √(Var(X) Var(Y))) = −ρ_{X,Y}.
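The scaling behavior can be confirmed numerically on any correlated pair; the pair below (Y = X + noise) is an arbitrary test case:

```python
import math
import random

# Scaling X by a and Y by b multiplies ρ by sign(ab): ρ_{X,3Y} = ρ_{X,Y}
# and ρ_{2X,−5Y} = −ρ_{X,Y}.
random.seed(0)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]

def corr(u, v):
    # Sample correlation coefficient of two equal-length lists.
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
    vu = sum((a - mu) ** 2 for a in u) / n
    vv = sum((b - mv) ** 2 for b in v) / n
    return cov / math.sqrt(vu * vv)

rho = corr(xs, ys)
print(corr(xs, [3 * y for y in ys]) - rho)                    # ≈ 0
print(corr([2 * x for x in xs], [-5 * y for y in ys]) + rho)  # ≈ 0
```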

5. Conditioning on an event. Let X be a r.v. with pdf
    f_X(x) = { 2(1 − x)  for 0 ≤ x ≤ 1,
             { 0         otherwise,

and let the event A = {X ≥ 1/3}. Find f_{X|A}(x), E(X|A), and Var(X|A).

Solution: By definition,

    f_{X|A}(x) = { f_X(x)/P{X ∈ A}  for x ∈ A,
                 { 0                otherwise.

Thus for the given X and A,

    f_{X|A}(x) = { (9/2)(1 − x)  for x ∈ A,
                 { 0             otherwise.

Now, using this conditional pdf,

    E(X|A) = ∫_{1/3}^{1} x · (9/2)(1 − x) dx = 5/9,

and

    E(X²|A) = ∫_{1/3}^{1} x² · (9/2)(1 − x) dx = 1/3.

Thus, Var(X|A) = E(X²|A) − (E(X|A))² = 2/81.
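A simulation sketch: sample X by the inverse CDF (F_X(x) = 1 − (1 − x)², so X = 1 − √(1 − U)), keep only the samples in A, and compare the conditional moments with the answers above:

```python
import random

# Conditioning on A = {X ≥ 1/3} by discarding samples outside A.
random.seed(0)
kept = []
while len(kept) < 200_000:
    x = 1 - (1 - random.random()) ** 0.5  # inverse-CDF sample of f_X
    if x >= 1 / 3:                        # keep only samples in the event A
        kept.append(x)
n = len(kept)
mean = sum(kept) / n
var = sum(x * x for x in kept) / n - mean ** 2
print(round(mean, 3), round(var, 4))  # ≈ 5/9 ≈ 0.556 and 2/81 ≈ 0.0247
```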

6. Packet switching. Let N be the number of packets per unit time arriving at a network
switch. Each packet is routed to output port 1 with probability p and to output port 2 with
probability 1 − p, independent of N and of other packets. Let X be the number of packets
per unit time routed to output port 1. Thus
 (
0 N =0 1 packet i routed to Port 1
X = PN where Zi =
Z
i=1 i N > 0 0 packet i routed to Port 2,

and Z1 , Z2 , . . . , ZN are conditionally independent given N . Suppose that N ∼ Poisson(λ),


i.e., has Poisson pmf with parameter λ.

(a) Find the mean and variance of X.


(b) Find the pmf of X and the pmf of N − X .

Solution:

(a) By iterated expectation and the law of total variance, we have

    E(X) = E[E(X | N)],
    Var(X) = E[Var(X | N)] + Var(E(X | N)).

Given N = n, X | {N = n} ∼ Binom(n, p), so E(X | N = n) = np and
Var(X | N = n) = np(1 − p).
Using the fact that N ∼ Poisson(λ), with E(N) = λ and Var(N) = λ, we have

    E(X) = E[E(X | N)] = E(Np) = λp,

    Var(X) = E[Var(X | N)] + Var(E(X | N))
           = E(Np(1 − p)) + Var(Np) = λp(1 − p) + λp²
           = λp.

(b) Given N = n, X | {N = n} ∼ Binom(n, p), so

    P(X = k | N = n) = { C(n, k) p^k (1 − p)^{n−k}  for n ≥ k,
                       { 0                          otherwise.

By the law of total probability, we have

    P(X = k) = Σ_{n=0}^{∞} P(X = k | N = n) P(N = n)
             = Σ_{n=k}^{∞} C(n, k) p^k (1 − p)^{n−k} P(N = n)
             = Σ_{n=k}^{∞} (n!/(k!(n − k)!)) p^k (1 − p)^{n−k} (λ^n/n!) e^{−λ}
             = e^{−λ} p^k (1/k!) Σ_{n=k}^{∞} λ^n (1 − p)^{n−k}/(n − k)!
             = e^{−λ} p^k λ^k (1/k!) Σ_{n=k}^{∞} (λ(1 − p))^{n−k}/(n − k)!
             = e^{−λ} p^k λ^k (1/k!) Σ_{m=0}^{∞} (λ(1 − p))^m/m!
             = ((λp)^k/k!) e^{−λ} e^{λ(1−p)}
             = ((λp)^k/k!) e^{−λp}.

So X ∼ Poisson(λp).
So X ∼ Poisson(λp).
Similarly, by defining

    Y_i = { 1  if packet i is routed to Port 2,
          { 0  if packet i is routed to Port 1,

we have

    N − X = { 0                if N = 0,
            { Σ_{i=1}^{N} Y_i  if N > 0.

With a similar calculation, we get N − X ∼ Poisson(λ(1 − p)); its pmf is

    P(N − X = k) = ((λ(1 − p))^k/k!) e^{−λ(1−p)}.
In fact, we can show that X and N − X are independent.
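The thinning result in part (b) can be checked by simulation (λ = 4 and p = 0.3 are arbitrary; the Poisson sampler is Knuth's product-of-uniforms method):

```python
import math
import random

def poisson(rate):
    # Knuth's method: count uniforms until their running product drops below e^{-rate}.
    target = math.exp(-rate)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= target:
            return k
        k += 1

# N ~ Poisson(λ); each of the N packets goes to port 1 with probability p.
# If X ~ Poisson(λp), its mean and variance should both equal λp.
random.seed(0)
lam, p, n = 4.0, 0.3, 200_000
xs = []
for _ in range(n):
    big_n = poisson(lam)
    xs.append(sum(random.random() < p for _ in range(big_n)))
m = sum(xs) / n
print(m)                                  # ≈ λp = 1.2
print(sum(x * x for x in xs) / n - m * m)  # ≈ λp = 1.2
```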

