
Carnegie Mellon University Department of Statistics

36-705 Intermediate Statistics Homework #1


Solutions

September 1, 2016

Problem 1: Wasserman 1.3


Let $\Omega$ be a sample space and let $A_1, A_2, \ldots$ be events. Define $B_n = \bigcup_{i=n}^{\infty} A_i$ and $C_n = \bigcap_{i=n}^{\infty} A_i$.

(a) Show that $B_1 \supset B_2 \supset \cdots$ and that $C_1 \subset C_2 \subset \cdots$.

Proof. (By induction)

Base case:
$$B_1 = \bigcup_{i=1}^{\infty} A_i = A_1 \cup \bigcup_{i=2}^{\infty} A_i = A_1 \cup B_2 \supset B_2, \quad \text{so } B_1 \supset B_2.$$
Now for any $k \in \mathbb{N}$,
$$B_k = \bigcup_{i=k}^{\infty} A_i = A_k \cup \bigcup_{i=k+1}^{\infty} A_i = A_k \cup B_{k+1} \supset B_{k+1}.$$

Proof. (By induction)

Base case:
$$C_1 = \bigcap_{i=1}^{\infty} A_i = A_1 \cap \bigcap_{i=2}^{\infty} A_i = A_1 \cap C_2 \subset C_2, \quad \text{so } C_1 \subset C_2.$$
Now for any $k \in \mathbb{N}$,
$$C_k = \bigcap_{i=k}^{\infty} A_i = A_k \cap \bigcap_{i=k+1}^{\infty} A_i = A_k \cap C_{k+1} \subset C_{k+1}.$$

(b) Show that $\omega \in \bigcap_{n=1}^{\infty} B_n$ if and only if $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$.

Proof.
($\Leftarrow$) Suppose $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$ but $\omega \notin \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} A_i$. Then there exists a $j$ such that $\omega \notin \bigcup_{i=j}^{\infty} A_i$. But this implies $\omega$ can be in at most $j - 1$ of the $A_i$, a contradiction. Therefore, $\omega \in \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} A_i$.

($\Rightarrow$) Assume $\omega \in \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} A_i$. If $\omega$ does not belong to an infinite number of the events $A_1, A_2, \ldots$, then pick the last (according to the index) $A_i$ which contains $\omega$ and call it $A_T$. Then $\omega \notin A_k$ for all $k > T$. Hence $\omega \notin \bigcap_{n=1}^{\infty} B_n$, a contradiction. Therefore, $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$.

(c) Show that $\omega \in \bigcup_{n=1}^{\infty} C_n$ if and only if $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events.


Proof.
($\Leftarrow$) Suppose that $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events. Pick the last $A_i$ that $\omega$ is not in and call the next one $A_{n_0}$. Then $\omega \in A_n$ for all $n \geq n_0$, so clearly $\omega \in \bigcap_{i=n_0}^{\infty} A_i$. Since this set is part of the union $\bigcup_{n=1}^{\infty} C_n$, we've shown $\omega \in \bigcup_{n=1}^{\infty} C_n$.

($\Rightarrow$) Suppose $\omega \in \bigcup_{n=1}^{\infty} \bigcap_{i=n}^{\infty} A_i$, and suppose further that no matter how high we run our index, we can always find an $A_i$ such that $\omega \notin A_i$; more precisely, for any $k \in \mathbb{N}$ there exists $j > k$ such that $\omega \notin A_j$. This of course implies $\omega \notin \bigcap_{i=n}^{\infty} A_i$ for any $n$. That is, $\omega \notin \bigcup_{n=1}^{\infty} \bigcap_{i=n}^{\infty} A_i$, a contradiction. Therefore, $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events.
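The proof above is purely set-theoretic, but the $\limsup$/$\liminf$ behaviour in parts (b) and (c) can be illustrated numerically. Below is a small Python sketch (our addition, not part of the original solution; the periodic example sequence is hypothetical) that truncates an infinite sequence of events and computes $B_n$ and $C_n$:

```python
# Hypothetical illustration: events over the sample space {0, 1}, with
# A_i = {0, 1} for even i and A_i = {1} for odd i.  Then 0 is in infinitely
# many A_i and 1 is in all A_i, so limsup A_i = {0, 1} and liminf A_i = {1}.

N = 100  # truncation depth; the pattern is periodic, so B_n and C_n stabilize
A = [{0, 1} if i % 2 == 0 else {1} for i in range(N)]

def B(n):
    """B_n = union of A_i for i >= n (within the truncation)."""
    return set().union(*A[n:])

def C(n):
    """C_n = intersection of A_i for i >= n (within the truncation)."""
    out = set(A[n])
    for s in A[n + 1:]:
        out &= s
    return out

limsup = set.intersection(*(B(n) for n in range(50)))
liminf = set().union(*(C(n) for n in range(50)))
print(limsup, liminf)  # {0, 1} {1}
```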

Problem 2: Wasserman 1.8


Suppose that $\mathbb{P}(A_i) = 1$ for each $i$. Prove that
$$\mathbb{P}\Big(\bigcap_{i=1}^{\infty} A_i\Big) = 1.$$

Proof. We will prove the equivalent proposition
$$\mathbb{P}\Big(\bigcup_{i=1}^{\infty} A_i^c\Big) = 0.$$
To show this, note that by countable subadditivity,
$$\mathbb{P}\Big(\bigcup_{i=1}^{\infty} A_i^c\Big) \leq \sum_{i=1}^{\infty} \mathbb{P}(A_i^c) = \sum_{i=1}^{\infty} 0 = 0.$$

Problem 3: Wasserman 1.13


Suppose that a fair coin is tossed repeatedly until both a head and tail have appeared at least
once.

(a) Describe the sample space $\Omega$.


We can partition the sample space into two subsets:
$$A = \{\underbrace{H \cdots H}_{n \text{ times}}\,T : n \geq 1\} \qquad \text{and} \qquad B = \{\underbrace{T \cdots T}_{n \text{ times}}\,H : n \geq 1\}.$$
Then $\Omega = A \cup B$.


(b) What is the probability that three tosses will be required?


This can occur in two ways: HHT or TTH. Each has probability $(1/2)^3$, so
$$\mathbb{P}(\text{three tosses}) = 2 \cdot \frac{1}{2^3} = \frac{1}{4}.$$
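A quick Monte Carlo sketch (our addition, not part of the original solution) that estimates this probability by simulating tosses until both faces appear:

```python
import random

# Estimate the probability that exactly three tosses are needed
# before both a head and a tail have appeared.
def tosses_until_both() -> int:
    seen, n = set(), 0
    while len(seen) < 2:
        seen.add(random.choice("HT"))
        n += 1
    return n

trials = 200_000
print(sum(tosses_until_both() == 3 for _ in range(trials)) / trials)  # ~0.25
```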

Problem 4: Wasserman 2.1


Show that
$$\mathbb{P}(X = x) = F(x^+) - F(x^-).$$
By right-continuity of the CDF, $F(x^+) = F(x) = \mathbb{P}(X \leq x)$. Also, by continuity of probabilities,
$$F(x^-) = \lim_{n \to \infty} F(x - 1/n) = \lim_{n \to \infty} \mathbb{P}(X \leq x - 1/n) = \mathbb{P}(X < x).$$
These observations, together with the fact that $\mathbb{P}(X = x) = \mathbb{P}(X \leq x) - \mathbb{P}(X < x)$, imply the desired result.
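For a concrete picture of the jump $F(x^+) - F(x^-)$ at an atom, here is a Python sketch (our addition; the mixed distribution is a hypothetical example) using a distribution with a point mass of $0.3$ at $0$:

```python
import random

# Hypothetical mixed distribution: X = 0 with probability 0.3,
# otherwise X ~ Uniform(0, 1).  The jump F(0+) - F(0-) should be P(X = 0) = 0.3.
def sample_x() -> float:
    return 0.0 if random.random() < 0.3 else random.random()

n = 200_000
xs = [sample_x() for _ in range(n)]
eps = 1e-9
F_plus = sum(x <= eps for x in xs) / n    # empirical F(0+)
F_minus = sum(x <= -eps for x in xs) / n  # empirical F(0-), zero here
print(F_plus - F_minus)                   # ~0.3
```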

Problem 5: Wasserman 2.4


Let X have probability density function


$$f_X(x) = \begin{cases} 1/4 & 0 < x < 1 \\ 3/8 & 3 < x < 5 \\ 0 & \text{otherwise.} \end{cases}$$

(a) Find the cumulative distribution function of $X$.

$$F_X(x) = \begin{cases} 0 & x < 0 \\ x/4 & 0 \leq x < 1 \\ 1/4 & 1 \leq x \leq 3 \\ (3x - 7)/8 & 3 < x \leq 5 \\ 1 & x > 5. \end{cases}$$

(b) Let $Y = 1/X$. Find the probability density function $f_Y(y)$ for $Y$. Hint: Consider three cases: $\tfrac{1}{5} \leq y \leq \tfrac{1}{3}$, $\tfrac{1}{3} \leq y \leq 1$, and $y \geq 1$.


Following the hint, first consider the case $y = 1/x \geq 1$, which occurs exactly when $0 < x \leq 1$. (Keep in mind that $X$ takes values in $(-\infty, 0]$ with probability $0$.) On this interval we have
$$f_X(x) = \frac{1}{4}, \qquad 0 < x < 1,$$
and $Y = g(X) = 1/X$ is monotonic there, with inverse $h(y) = g^{-1}(y) = 1/y$, so we can apply the transformation formula for densities directly:
$$f_Y(y) = f_X(h(y)) \left|\frac{dh(y)}{dy}\right| = \frac{1}{4} \left|-\frac{1}{y^2}\right| = \frac{1}{4y^2}, \qquad y \geq 1.$$
Using the same method, we first note that $\tfrac{1}{5} \leq y \leq \tfrac{1}{3}$ corresponds exactly to $3 \leq x \leq 5$ (we can drop the endpoints since we are dealing with a continuous distribution). On this interval we have
$$f_X(x) = \frac{3}{8}, \qquad 3 < x < 5,$$
and therefore
$$f_Y(y) = f_X(h(y)) \left|\frac{dh(y)}{dy}\right| = \frac{3}{8} \left|-\frac{1}{y^2}\right| = \frac{3}{8y^2}, \qquad \frac{1}{5} < y < \frac{1}{3}.$$

For the only remaining interval, $\tfrac{1}{3} < y < 1$, we need only note that it corresponds exactly to the interval $1 < x < 3$ and that
$$\mathbb{P}(X \in (1, 3)) = 0.$$
Hence,
$$\mathbb{P}(Y \in (1/3, 1)) = 0.$$
So altogether,
$$f_Y(y) = \begin{cases} \dfrac{3}{8y^2} & \tfrac{1}{5} < y < \tfrac{1}{3} \\[4pt] \dfrac{1}{4y^2} & y > 1 \\[4pt] 0 & \text{otherwise.} \end{cases}$$
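As a sanity check (our addition, not part of the original solution), we can sample $X$ by inverting the CDF from part (a), transform to $Y = 1/X$, and compare interval probabilities against the derived density:

```python
import random

# Sample X via the inverse of F_X from part (a), set Y = 1/X,
# and compare interval probabilities with the derived f_Y.
def sample_x() -> float:
    u = 1.0 - random.random()  # uniform on (0, 1]; avoids x = 0 exactly
    # F_X(x) = x/4 on (0, 1) covers u < 1/4; F_X(x) = (3x - 7)/8 covers the rest
    return 4.0 * u if u < 0.25 else (8.0 * u + 7.0) / 3.0

n = 500_000
ys = [1.0 / sample_x() for _ in range(n)]

# P(1/5 < Y < 1/3) = integral of 3/(8 y^2) over (1/5, 1/3) = (3/8)(5 - 3) = 3/4
print(sum(0.2 < y < 1/3 for y in ys) / n)  # ~0.75
# P(Y > 1) = integral of 1/(4 y^2) over (1, inf) = 1/4
print(sum(y > 1.0 for y in ys) / n)        # ~0.25
```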

Problem 6: Wasserman 2.7


Let X and Y be independent and suppose that each has a Uniform(0, 1) distribution. Let
Z = min{X, Y }. Find the density fZ (z) for Z. Hint: It might be easier to first find P(Z > z).

$$F_Z(z) = \mathbb{P}(Z \leq z) = 1 - \mathbb{P}(Z > z) = 1 - \mathbb{P}(X > z)\,\mathbb{P}(Y > z) = 1 - (1 - z)^2 = 2z - z^2$$
for $0 < z < 1$, so
$$f_Z(z) = F_Z'(z) = \begin{cases} 2 - 2z & 0 < z < 1 \\ 0 & \text{otherwise.} \end{cases}$$
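A one-line simulation check (our addition) of the CDF at, say, $z = 0.5$, where $F_Z(0.5) = 2(0.5) - 0.5^2 = 0.75$:

```python
import random

# Z = min(X, Y) for independent Uniform(0, 1) draws; check P(Z <= 0.5) = 0.75.
n = 500_000
zs = [min(random.random(), random.random()) for _ in range(n)]
print(sum(z <= 0.5 for z in zs) / n)  # ~0.75
```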


Problem 7: Wasserman 2.20


Let $X$ and $Y$ be independent $\text{Uniform}(0,1)$ random variables and let $Z = X - Y$. Clearly $Z \in [-1, 1]$. To calculate $F_Z(z)$, first consider the case $-1 \leq z \leq 0$:
$$F_Z(z) = \int_0^{1+z} \int_{x-z}^{1} 1 \, dy \, dx = \frac{z^2}{2} + z + \frac{1}{2}.$$
Now for $0 < z \leq 1$, $F_Z(z)$ is given by
$$F_Z(z) = 1 - \int_z^1 \int_0^{x-z} 1 \, dy \, dx = -\frac{z^2}{2} + z + \frac{1}{2}.$$
So the density of $Z$ is given by
$$f_Z(z) = \begin{cases} 1 + z & -1 \leq z \leq 0 \\ 1 - z & 0 < z \leq 1 \\ 0 & \text{otherwise,} \end{cases}$$
which is known as the triangular distribution.
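A quick simulation sketch (our addition) checking the triangular CDF at two points, using $F_Z(0) = 1/2$ and $F_Z(0.5) = -0.125 + 0.5 + 0.5 = 0.875$:

```python
import random

# Z = X - Y should follow the triangular density derived above.
n = 500_000
zs = [random.random() - random.random() for _ in range(n)]
print(sum(z <= 0.0 for z in zs) / n)  # ~0.5
print(sum(z <= 0.5 for z in zs) / n)  # ~0.875
```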


Now let $W = X/Y$, so that $W \in (0, \infty)$. We have
$$\mathbb{P}(W \leq w) = \mathbb{P}(X \leq wY) = \int_0^1 \int_0^{\min(1, wy)} 1 \, dx \, dy = \int_0^1 \min(1, wy) \, dy.$$
Consider first the case $w \leq 1$; then the integral becomes
$$\int_0^1 wy \, dy = \frac{w}{2}.$$
Similarly, for $w > 1$ we get
$$\int_0^{1/w} wy \, dy + \int_{1/w}^1 1 \, dy = 1 - \frac{1}{2w}.$$
So the density of $W$ is given by
$$f_W(w) = \begin{cases} 1/2 & 0 \leq w \leq 1 \\ 1/(2w^2) & w > 1 \\ 0 & \text{otherwise.} \end{cases}$$
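And a matching sketch (our addition) for the ratio, using $F_W(1) = 1/2$ and $F_W(2) = 1 - \tfrac{1}{4} = 0.75$:

```python
import random

# W = X/Y; the denominator 1 - random() lies in (0, 1], avoiding division by zero.
n = 500_000
ws = [random.random() / (1.0 - random.random()) for _ in range(n)]
print(sum(w <= 1.0 for w in ws) / n)  # ~0.5
print(sum(w <= 2.0 for w in ws) / n)  # ~0.75
```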

Problem 8: Wasserman 3.8


Theorem. Let $X_1, \ldots, X_n$ be IID and let $\mu = \mathbb{E}(X_i)$, $\sigma^2 = \mathbb{V}(X_i)$. Then
$$\mathbb{E}(\bar{X}_n) = \mu, \qquad \mathbb{V}(\bar{X}_n) = \frac{\sigma^2}{n}, \qquad \text{and} \qquad \mathbb{E}(S_n^2) = \sigma^2.$$

Proof.
$$\mathbb{E}(\bar{X}_n) = \mathbb{E}\Big(\frac{1}{n} \sum_{i=1}^{n} X_i\Big) = \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}(X_i) = \mu.$$


$$\mathbb{V}(\bar{X}_n) = \mathbb{V}\Big(\frac{1}{n} \sum_{i=1}^{n} X_i\Big) = \frac{1}{n^2} \sum_{i=1}^{n} \mathbb{V}(X_i) = \frac{\sigma^2}{n}.$$

Using $\sum_{i=1}^{n} \bar{X}_n X_i = n \bar{X}_n^2$ and $\mathbb{E}(\bar{X}_n^2) = \mathbb{V}(\bar{X}_n) + \mu^2 = \sigma^2/n + \mu^2$,
$$\begin{aligned}
\mathbb{E}(S_n^2) &= \mathbb{E}\Big(\frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X}_n)^2\Big) \\
&= \frac{1}{n-1} \Big[ \sum_{i=1}^{n} \mathbb{E}(X_i^2) - 2 \sum_{i=1}^{n} \mathbb{E}(\bar{X}_n X_i) + \sum_{i=1}^{n} \mathbb{E}(\bar{X}_n^2) \Big] \\
&= \frac{1}{n-1} \Big[ n(\sigma^2 + \mu^2) - 2n\,\mathbb{E}(\bar{X}_n^2) + n\,\mathbb{E}(\bar{X}_n^2) \Big] \\
&= \frac{1}{n-1} \Big[ n(\sigma^2 + \mu^2) - n(\sigma^2/n + \mu^2) \Big] \\
&= \frac{n-1}{n-1}\,\sigma^2 = \sigma^2.
\end{aligned}$$
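A simulation sketch (our addition) of the unbiasedness of $S_n^2$, using Uniform(0,1) draws for which $\sigma^2 = 1/12 \approx 0.0833$:

```python
import random
import statistics

# statistics.variance uses the n - 1 denominator, i.e. it computes S_n^2,
# so its average over many Uniform(0, 1) samples should approach 1/12.
n, reps = 10, 20_000
avg_s2 = sum(
    statistics.variance([random.random() for _ in range(n)]) for _ in range(reps)
) / reps
print(avg_s2)  # ~0.0833
```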

Problem 9: Wasserman 3.22


Let $X \sim \text{Uniform}(0, 1)$. Let $0 < a < b < 1$. Let
$$Y = \begin{cases} 1 & 0 < X < b \\ 0 & \text{otherwise} \end{cases}$$
and let
$$Z = \begin{cases} 1 & a < X < 1 \\ 0 & \text{otherwise.} \end{cases}$$
(a) Are $Y$ and $Z$ independent? Why/why not? Notice that
$$\mathbb{P}(Y = 1, Z = 1) = \mathbb{P}(X \in (a, b)) = \frac{b - a}{1 - 0} = b - a$$
and
$$\mathbb{P}(Y = 1)\,\mathbb{P}(Z = 1) = \frac{b - 0}{1 - 0} \cdot \frac{1 - a}{1 - 0} = b(1 - a),$$
and $b(1 - a) = b - ab > b - a$ since $ab < a$, so $Y$ and $Z$ are not independent.


(b) Find $\mathbb{E}(Y \mid Z)$. Hint: What values $z$ can $Z$ take? Now find $\mathbb{E}(Y \mid Z = z)$.

$$\mathbb{E}(Y \mid Z = 1) = 1 \cdot \mathbb{P}(Y = 1 \mid Z = 1) + 0 \cdot \mathbb{P}(Y = 0 \mid Z = 1) = \frac{b - a}{1 - a}.$$

$$\mathbb{E}(Y \mid Z = 0) = 1 \cdot \mathbb{P}(Y = 1 \mid Z = 0) + 0 \cdot \mathbb{P}(Y = 0 \mid Z = 0) = 1,$$
since $Z = 0$ means $X \leq a < b$, which forces $Y = 1$.
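A simulation sketch (our addition) of both conditional expectations, with the illustrative (hypothetical) choice $a = 0.3$, $b = 0.6$:

```python
import random

# With a = 0.3, b = 0.6: E(Y | Z = 1) = (b - a)/(1 - a) = 0.3/0.7 ~ 0.4286,
# and E(Y | Z = 0) = 1.
a, b, n = 0.3, 0.6, 500_000
pairs = []
for _ in range(n):
    x = random.random()
    pairs.append((int(0 < x < b), int(a < x < 1)))  # (Y, Z)

for z in (1, 0):
    ys = [y for y, zz in pairs if zz == z]
    print(z, sum(ys) / len(ys))  # ~0.4286 for z = 1, 1.0 for z = 0
```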

Problem 10: Wasserman 3.23


Find the moment generating function for the Poisson, Normal, and Gamma distributions.

Poisson

Let $X \sim \text{Poisson}(\lambda)$. Then
$$M_X(t) = \mathbb{E}[e^{tX}] = \sum_{x=0}^{\infty} e^{tx}\, \frac{e^{-\lambda} \lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda + \lambda e^t} = e^{\lambda(e^t - 1)}.$$
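A numerical check (our addition) comparing the empirical mean of $e^{tX}$ against the closed form; the Poisson sampler uses Knuth's multiplication method, which is suitable for small $\lambda$:

```python
import math
import random

def sample_poisson(lam: float) -> int:
    """Knuth's method: multiply uniforms until the product drops below e^{-lam}."""
    limit, prod, k = math.exp(-lam), 1.0, -1
    while prod > limit:
        prod *= random.random()
        k += 1
    return k

lam, t, n = 2.0, 0.5, 200_000
emp = sum(math.exp(t * sample_poisson(lam)) for _ in range(n)) / n
print(emp, math.exp(lam * (math.exp(t) - 1)))  # the two values should be close
```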

Gamma
Let $X \sim \text{Gamma}(\alpha, \beta)$. Then
$$M_X(t) = \mathbb{E}[e^{tX}] = \int_0^{\infty} e^{tx}\, \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}\, dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha-1} e^{-x(\beta - t)}\, dx.$$
Now we use the fact that a p.d.f. integrates to $1$ to obtain the formula
$$\int_0^{\infty} \frac{b^{a}}{\Gamma(a)}\, x^{a-1} e^{-bx}\, dx = 1,$$
which rearranges to
$$\int_0^{\infty} x^{a-1} e^{-bx}\, dx = \frac{\Gamma(a)}{b^{a}}.$$
So our moment generating function is
$$M_X(t) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha-1} e^{-x(\beta - t)}\, dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha)}{(\beta - t)^{\alpha}} = \Big(\frac{\beta}{\beta - t}\Big)^{\alpha}, \qquad t < \beta.$$
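A numerical check (our addition); for integer $\alpha$, a $\text{Gamma}(\alpha, \beta)$ draw can be built as a sum of $\alpha$ independent $\text{Exponential}(\beta)$ draws (the Monte Carlo estimator has finite variance only for $t < \beta/2$, which holds here):

```python
import math
import random

def sample_gamma(alpha: int, beta: float) -> float:
    """Sum of alpha independent Exponential(beta) draws via inverse CDF."""
    return sum(-math.log(1.0 - random.random()) / beta for _ in range(alpha))

alpha, beta, t, n = 3, 2.0, 0.5, 200_000
emp = sum(math.exp(t * sample_gamma(alpha, beta)) for _ in range(n)) / n
print(emp, (beta / (beta - t)) ** alpha)  # the two values should be close
```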

Normal

First consider $X \sim N(0, 1)$.


$$M_X(t) = \mathbb{E}[e^{tX}] = \int_{-\infty}^{\infty} e^{tx}\, \frac{e^{-x^2/2}}{\sqrt{2\pi}}\, dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx - x^2/2}\, dx.$$
Completing the square, $tx - x^2/2 = t^2/2 - (x - t)^2/2$, so with the substitution $u = x - t$,
$$M_X(t) = \frac{e^{t^2/2}}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(x-t)^2/2}\, dx = \frac{e^{t^2/2}}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-u^2/2}\, du.$$
The remaining integral is a famous form of the Gaussian integral and equals $\sqrt{2\pi}$. Therefore,
$$M_X(t) = e^{t^2/2}.$$
Now consider a random variable $Y$ with distribution $N(\mu, \sigma^2)$. Since the normal family is closed under location-scale transformations, $Y$ has the same distribution as
$$Y = \mu + \sigma X.$$
We now compute the desired moment generating function.

$$M_Y(t) = \mathbb{E}[e^{Yt}] = \mathbb{E}[e^{(\mu + \sigma X)t}] = \mathbb{E}[e^{\mu t} e^{\sigma t X}] = e^{\mu t}\, \mathbb{E}[e^{\sigma t X}] = e^{\mu t} M_X(\sigma t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}.$$
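A final numerical check (our addition) of the normal MGF:

```python
import math
import random

# Compare the empirical mean of e^{tY}, Y ~ N(mu, sigma^2),
# with the closed form e^{mu t + sigma^2 t^2 / 2}.
mu, sigma, t, n = 1.0, 2.0, 0.3, 200_000
emp = sum(math.exp(t * random.gauss(mu, sigma)) for _ in range(n)) / n
print(emp, math.exp(mu * t + 0.5 * sigma**2 * t**2))  # the two values should be close
```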
