
Carnegie Mellon University Department of Statistics

36-705 Intermediate Statistics Homework #1


Solutions
September 1, 2016
Problem 1: Wasserman 1.3

Let $\Omega$ be a sample space and let $A_1, A_2, \ldots$ be events. Define $B_n = \bigcup_{i=n}^{\infty} A_i$ and $C_n = \bigcap_{i=n}^{\infty} A_i$.

(a) Show that $B_1 \supset B_2 \supset \cdots$ and that $C_1 \subset C_2 \subset \cdots$.


Proof. For any $k \in \mathbb{N}$,
$$B_k = \bigcup_{i=k}^{\infty} A_i = A_k \cup \bigcup_{i=k+1}^{\infty} A_i = A_k \cup B_{k+1} \supset B_{k+1},$$
and since this holds for every $k$, we have $B_1 \supset B_2 \supset \cdots$.
Proof. Similarly, for any $k \in \mathbb{N}$,
$$C_k = \bigcap_{i=k}^{\infty} A_i = A_k \cap \bigcap_{i=k+1}^{\infty} A_i = A_k \cap C_{k+1} \subset C_{k+1},$$
and since this holds for every $k$, we have $C_1 \subset C_2 \subset \cdots$.

(b) Show that $\omega \in \bigcap_{n=1}^{\infty} B_n$ if and only if $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$.
Proof.
"$\Leftarrow$" Suppose $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$ but $\omega \notin \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} A_i$. Then there exists a $j$ such that $\omega \notin \bigcup_{i=j}^{\infty} A_i$. But this implies $\omega$ can be in at most $j - 1$ of the $A_i$, a contradiction. Therefore, $\omega \in \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} A_i$.

"$\Rightarrow$" Assume $\omega \in \bigcap_{n=1}^{\infty} \bigcup_{i=n}^{\infty} A_i$. If $\omega$ does not belong to an infinite number of the events $A_1, A_2, \ldots$, then pick the last (according to the index) $A_i$ which contains $\omega$ and call it $A_T$. Then $\omega \notin A_k$ for all $k > T$, so $\omega \notin B_{T+1}$. Hence $\omega \notin \bigcap_{n=1}^{\infty} B_n$, a contradiction. Therefore, $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$.

(c) Show that $\omega \in \bigcup_{n=1}^{\infty} C_n$ if and only if $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events.



Proof.
"$\Leftarrow$" Suppose that $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events. Pick the last $A_i$ that $\omega$ is not in and call the next one $A_{n_0}$. Then $\omega \in A_n$ for all $n \ge n_0$, so clearly $\omega \in \bigcap_{i=n_0}^{\infty} A_i = C_{n_0}$. Since this set is part of the union $\bigcup_{n=1}^{\infty} C_n$, we've shown $\omega \in \bigcup_{n=1}^{\infty} C_n$.

"$\Rightarrow$" Suppose $\omega \in \bigcup_{n=1}^{\infty} \bigcap_{i=n}^{\infty} A_i$. Suppose further that, no matter how high we run our index, we can always find an $A_i$ such that $\omega \notin A_i$; more precisely, for any $k \in \mathbb{N}$ there exists $j > k$ such that $\omega \notin A_j$. This of course implies $\omega \notin \bigcap_{i=n}^{\infty} A_i$ for any $n$. That is, $\omega \notin \bigcup_{n=1}^{\infty} \bigcap_{i=n}^{\infty} A_i$, a contradiction. Therefore, $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events.
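The monotonicity in part (a) is easy to sanity-check numerically. Below is a minimal Python sketch (not part of the original solution), assuming a hypothetical toy sample space $\{0, \ldots, 9\}$, events $A_i$ = multiples of $i$, and a truncation point $N$ standing in for infinity:

```python
# Finite sanity check of part (a): B_n is decreasing, C_n is increasing.
N = 8  # truncation point standing in for infinity
A = {i: {x for x in range(10) if x % i == 0} for i in range(1, N + 1)}

def B(n):
    """B_n = union of A_i for i >= n (truncated at N)."""
    return set().union(*(A[i] for i in range(n, N + 1)))

def C(n):
    """C_n = intersection of A_i for i >= n (truncated at N)."""
    out = set(A[n])
    for i in range(n + 1, N + 1):
        out &= A[i]
    return out

for n in range(1, N):
    assert B(n) >= B(n + 1), "B_n should be decreasing"
    assert C(n) <= C(n + 1), "C_n should be increasing"
print("monotonicity checks passed")
```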

Problem 2: Wasserman 1.8


Suppose that $P(A_i) = 1$ for each $i$. Prove that
$$P\left(\bigcap_{i=1}^{\infty} A_i\right) = 1.$$
Proof. We will prove the equivalent proposition:
$$P\left(\bigcup_{i=1}^{\infty} A_i^c\right) = 0;$$
this is equivalent because $\left(\bigcap_{i=1}^{\infty} A_i\right)^c = \bigcup_{i=1}^{\infty} A_i^c$ by De Morgan's law. To show this, note that by countable subadditivity,
$$P\left(\bigcup_{i=1}^{\infty} A_i^c\right) \le \sum_{i=1}^{\infty} P(A_i^c) = \sum_{i=1}^{\infty} 0 = 0.$$

Problem 3: Wasserman 1.13


Suppose that a fair coin is tossed repeatedly until both a head and a tail have appeared at least once.

(a) Describe the sample space $\Omega$.

We can partition the sample space into two subsets:
$$A = \{\underbrace{H \cdots H}_{n \text{ times}}\, T : n \ge 1\}, \qquad B = \{\underbrace{T \cdots T}_{n \text{ times}}\, H : n \ge 1\}.$$
Then $\Omega = A \cup B$.


(b) What is the probability that three tosses will be required?

This can occur in two ways: HHT or TTH. Each has probability $(\tfrac{1}{2})^3$, so
$$P(\text{three tosses}) = 2 \cdot \frac{1}{2^3} = \frac{1}{4}.$$
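This is easy to corroborate by simulation. A minimal Monte Carlo sketch (not part of the original solution):

```python
import random

def tosses_until_both() -> int:
    """Toss a fair coin until both H and T have appeared; return the count."""
    seen, n = set(), 0
    while len(seen) < 2:
        seen.add(random.choice("HT"))
        n += 1
    return n

trials = 200_000
est = sum(tosses_until_both() == 3 for _ in range(trials)) / trials
print(f"estimated P(three tosses) = {est:.4f}  (analytic: 0.25)")
```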

Problem 4: Wasserman 2.1


Show that
$$P(X = x) = F(x^+) - F(x^-).$$
By right-continuity of the CDF, $F(x^+) = F(x) = P(X \le x)$. Also, by continuity of probabilities,
$$F(x^-) = \lim_{n \to \infty} F(x - 1/n) = \lim_{n \to \infty} P(X \le x - 1/n) = P(X < x).$$
These observations, together with the fact that $P(X = x) = P(X \le x) - P(X < x)$, imply the desired result.
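As a concrete numerical illustration (a sketch, not part of the original solution): for a Bernoulli random variable with hypothetical parameter $p = 0.3$, the jump of the CDF at $x = 1$ recovers the point mass $P(X = 1) = 0.3$:

```python
def F(x: float, p: float = 0.3) -> float:
    """CDF of Bernoulli(p): 0 below 0, 1 - p on [0, 1), 1 from 1 onward."""
    if x < 0:
        return 0.0
    return 1.0 - p if x < 1 else 1.0

eps = 1e-9
print(F(1 + eps) - F(1 - eps))  # ~0.3, the jump F(x+) - F(x-) at x = 1
```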

Problem 5: Wasserman 2.4


Let X have probability density function
$$f_X(x) = \begin{cases} 1/4 & 0 < x < 1 \\ 3/8 & 3 < x < 5 \\ 0 & \text{otherwise.} \end{cases}$$
(a) Find the cumulative distribution function of X.
$$F_X(x) = \begin{cases} 0 & x < 0 \\ x/4 & 0 \le x < 1 \\ 1/4 & 1 \le x \le 3 \\ (3x - 7)/8 & 3 < x \le 5 \\ 1 & x > 5. \end{cases}$$

(b) Let $Y = 1/X$. Find the probability density function $f_Y(y)$ for Y. Hint: consider three cases: $\frac{1}{5} \le y \le \frac{1}{3}$, $\frac{1}{3} \le y \le 1$, and $y \ge 1$.



Following the hint, let us first consider the case $y = 1/x \ge 1$, which occurs exactly when $0 < x \le 1$. Keep in mind that X takes values in the interval $(-\infty, 0]$ with probability 0. On this interval we have
$$f_X(x) = \frac{1}{4}, \qquad 0 < x < 1,$$
and $Y = g(X) = 1/X$ is monotonic here (with inverse $h(y) = g^{-1}(y) = 1/y$), so we can directly employ the transformation formula for p.d.f.s. That is,
$$f_Y(y) = f_X(h(y)) \left|\frac{dh(y)}{dy}\right| = \frac{1}{4} \left|-\frac{1}{y^2}\right| = \frac{1}{4y^2}, \qquad y \ge 1.$$

Using the same method, we first note that $\frac{1}{5} \le y \le \frac{1}{3}$ corresponds exactly to $3 \le x \le 5$ (we can drop the endpoints since we're dealing with continuous probability here). On this interval we have
$$f_X(x) = \frac{3}{8}, \qquad 3 < x < 5,$$
and therefore
$$f_Y(y) = f_X(h(y)) \left|\frac{dh(y)}{dy}\right| = \frac{3}{8} \left|-\frac{1}{y^2}\right| = \frac{3}{8y^2}, \qquad \frac{1}{5} < y < \frac{1}{3}.$$
Now for the only remaining interval, $\frac{1}{3} < y < 1$, we need only note that it corresponds exactly to the interval $1 < x < 3$ and that
$$P(X \in (1, 3)) = 0.$$
Hence,
$$P(Y \in (1/3, 1)) = 0.$$

So altogether,
$$f_Y(y) = \begin{cases} \dfrac{3}{8y^2} & \frac{1}{5} < y < \frac{1}{3} \\[4pt] \dfrac{1}{4y^2} & y > 1 \\[4pt] 0 & \text{otherwise.} \end{cases}$$
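A quick Monte Carlo check of this density (a sketch, not part of the original solution): integrating the two branches gives $P(Y > 1) = 1/4$ and $P(1/5 < Y < 1/3) = 3/4$, which simulation should reproduce:

```python
import random

def sample_x() -> float:
    """Draw X from f_X: mass 1/4 on (0, 1) and mass 3/4 on (3, 5)."""
    if random.random() < 0.25:
        return random.uniform(0.0, 1.0)
    return random.uniform(3.0, 5.0)

trials = 200_000
ys = [1.0 / sample_x() for _ in range(trials)]
print("P(Y > 1)         ~", sum(y > 1 for y in ys) / trials, "(analytic: 0.25)")
print("P(1/5 < Y < 1/3) ~", sum(0.2 < y < 1/3 for y in ys) / trials, "(analytic: 0.75)")
```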

Problem 6: Wasserman 2.7


Let X and Y be independent and suppose that each has a Uniform(0, 1) distribution. Let
Z = min{X, Y }. Find the density fZ (z) for Z. Hint: It might be easier to first find P(Z > z).
For $0 < z < 1$, by independence,
$$F_Z(z) = P(Z \le z) = 1 - P(Z > z) = 1 - P(X > z)P(Y > z) = 1 - (1 - z)^2 = 2z - z^2,$$
so
$$f_Z(z) = F_Z'(z) = \begin{cases} 2 - 2z & 0 < z < 1 \\ 0 & \text{otherwise.} \end{cases}$$
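As a quick numerical corroboration (a sketch, not part of the original solution), the density $2 - 2z$ gives $E[Z] = \int_0^1 z(2 - 2z)\,dz = 1/3$:

```python
import random

trials = 200_000
# Z = min(X, Y) for independent Uniform(0, 1) draws
zs = [min(random.random(), random.random()) for _ in range(trials)]
print("sample mean of Z:", sum(zs) / trials, "(analytic: 1/3)")
```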



Problem 7: Wasserman 2.20
Let X and Y be independent Uniform(0,1) random variables (so their joint density is 1 on the unit square), and let $Z = X - Y$. We can easily see that $Z \in [-1, 1]$. To calculate $F_Z(z)$ we first consider the case $-1 \le z \le 0$:
$$F_Z(z) = \int_0^{1+z} \int_{x-z}^{1} 1 \, dy \, dx = \frac{z^2}{2} + z + \frac{1}{2}.$$
Now for $0 < z \le 1$, $F_Z(z)$ is given by:
$$F_Z(z) = 1 - \int_z^1 \int_0^{x-z} 1 \, dy \, dx = -\frac{z^2}{2} + z + \frac{1}{2}.$$
So the density of Z is given by:
$$f_Z(z) = \begin{cases} 1 + z & \text{if } -1 \le z \le 0 \\ 1 - z & \text{if } 0 < z \le 1 \\ 0 & \text{otherwise,} \end{cases}$$
which is known as the triangular distribution.
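A simulation sketch (not part of the original solution) confirming the triangular shape: under $f_Z$, $P(|Z| \le 1/2) = 1 - 2\int_{1/2}^1 (1 - z)\,dz = 3/4$:

```python
import random

trials = 200_000
zs = [random.random() - random.random() for _ in range(trials)]
print("P(|Z| <= 0.5) ~", sum(abs(z) <= 0.5 for z in zs) / trials, "(analytic: 0.75)")
```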


Let $W = X/Y$, so that $W \in (0, \infty)$. We have:
$$P(W \le w) = P(X \le wY) = \int_0^1 \int_0^{\min(1, wy)} 1 \, dx \, dy = \int_0^1 \min(1, wy) \, dy.$$
Consider first the case $w \le 1$; then the integral becomes:
$$\int_0^1 wy \, dy = \frac{w}{2}.$$
Similarly, for $w > 1$ we get:
$$\int_0^{1/w} wy \, dy + \int_{1/w}^1 1 \, dy = \frac{1}{2w} + 1 - \frac{1}{w} = 1 - \frac{1}{2w}.$$
So the density of W is given by:
$$f_W(w) = \begin{cases} 1/2 & \text{if } 0 \le w \le 1 \\ 1/(2w^2) & \text{if } w > 1 \\ 0 & \text{otherwise.} \end{cases}$$
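Again, a short Monte Carlo sketch (not part of the original solution): the derived CDF gives $P(W \le 1) = 1/2$ and $P(W > 2) = 1/4$:

```python
import random

trials = 200_000
# Denominator drawn as 1 - U so it lies in (0, 1], avoiding division by zero.
ws = [random.random() / (1.0 - random.random()) for _ in range(trials)]
print("P(W <= 1) ~", sum(w <= 1 for w in ws) / trials, "(analytic: 0.5)")
print("P(W > 2)  ~", sum(w > 2 for w in ws) / trials, "(analytic: 0.25)")
```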

Problem 8: Wasserman 3.8


Theorem. Let $X_1, \ldots, X_n$ be IID and let $\mu = E(X_i)$, $\sigma^2 = V(X_i)$. Then
$$E(\overline{X}_n) = \mu, \qquad V(\overline{X}_n) = \frac{\sigma^2}{n}, \qquad \text{and} \qquad E(S_n^2) = \sigma^2.$$

Proof.
$$E(\overline{X}_n) = E\left(\frac{1}{n} \sum_{i=1}^n X_i\right) = \frac{1}{n} \sum_{i=1}^n E(X_i) = \mu.$$


By independence,
$$V(\overline{X}_n) = V\left(\frac{1}{n} \sum_{i=1}^n X_i\right) = \frac{1}{n^2} \sum_{i=1}^n V(X_i) = \frac{\sigma^2}{n}.$$
For the sample variance, using $\sum_{i=1}^n X_i = n\overline{X}_n$ and $E(\overline{X}_n^2) = V(\overline{X}_n) + \mu^2 = \sigma^2/n + \mu^2$:
$$
\begin{aligned}
E(S_n^2) &= E\left(\frac{1}{n-1} \sum_{i=1}^n (X_i - \overline{X}_n)^2\right) \\
&= \frac{1}{n-1} \left[ \sum_{i=1}^n E(X_i^2) - 2E\left(\overline{X}_n \sum_{i=1}^n X_i\right) + nE(\overline{X}_n^2) \right] \\
&= \frac{1}{n-1} \left[ n(\sigma^2 + \mu^2) - 2nE(\overline{X}_n^2) + nE(\overline{X}_n^2) \right] \\
&= \frac{1}{n-1} \left[ n(\sigma^2 + \mu^2) - n(\sigma^2/n + \mu^2) \right] \\
&= \frac{n-1}{n-1}\,\sigma^2 = \sigma^2.
\end{aligned}
$$
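A numerical illustration of the unbiasedness of $S_n^2$ (a sketch, not part of the original solution), with hypothetical parameters $n = 5$, $\mu = 2$, $\sigma = 3$:

```python
import random
import statistics

random.seed(0)
n, reps = 5, 50_000
mu, sigma = 2.0, 3.0
# statistics.variance uses the n - 1 denominator, i.e., it computes S_n^2.
s2_values = [
    statistics.variance([random.gauss(mu, sigma) for _ in range(n)])
    for _ in range(reps)
]
print("mean of S_n^2:", sum(s2_values) / reps, "(analytic:", sigma ** 2, ")")
```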

Problem 9: Wasserman 3.22


Let $X \sim \text{Uniform}(0, 1)$. Let $0 < a < b < 1$, and let
$$Y = \begin{cases} 1 & 0 < X < b \\ 0 & \text{otherwise} \end{cases} \qquad \text{and} \qquad Z = \begin{cases} 1 & a < X < 1 \\ 0 & \text{otherwise.} \end{cases}$$
(a) Are Y and Z independent? Why/why not? Notice
$$P(Y = 1, Z = 1) = P(X \in (a, b)) = \frac{b - a}{1 - 0} = b - a,$$
while
$$P(Y = 1)\,P(Z = 1) = \frac{b - 0}{1 - 0} \cdot \frac{1 - a}{1 - 0} = b(1 - a).$$
Since $b(1 - a) - (b - a) = a(1 - b) > 0$, these two probabilities differ, so Y and Z are not independent.



(b) Find $E(Y \mid Z)$. Hint: what values z can Z take? Now find $E(Y \mid Z = z)$.
$$E(Y \mid Z = 1) = 1 \cdot P(Y = 1 \mid Z = 1) + 0 \cdot P(Y = 0 \mid Z = 1) = \frac{b - a}{1 - a}.$$
$$E(Y \mid Z = 0) = 1 \cdot P(Y = 1 \mid Z = 0) + 0 \cdot P(Y = 0 \mid Z = 0) = 1,$$
since $Z = 0$ means $X \le a < b$, in which case $Y = 1$ with probability 1.
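A simulation sketch (not part of the original solution) with hypothetical choices $a = 0.3$, $b = 0.6$, for which $E(Y \mid Z = 1) = (b - a)/(1 - a) = 3/7 \approx 0.4286$:

```python
import random

a, b = 0.3, 0.6
trials = 200_000
y_z1, y_z0 = [], []
for _ in range(trials):
    x = random.random()
    y, z = int(0 < x < b), int(a < x < 1)
    (y_z1 if z == 1 else y_z0).append(y)
print("E(Y|Z=1) ~", sum(y_z1) / len(y_z1), "(analytic:", (b - a) / (1 - a), ")")
print("E(Y|Z=0) ~", sum(y_z0) / len(y_z0), "(analytic: 1.0)")
```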

Problem 10: Wasserman 3.23


Find the moment generating function for the Poisson, Normal, and Gamma distributions.

Poisson($\lambda$)
$$M_X(t) = E[e^{tX}] = \sum_{x=0}^{\infty} e^{tx}\,\frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}.$$

Gamma($\alpha, \beta$)
$$M_X(t) = E[e^{tX}] = \int_0^{\infty} e^{tx}\,\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,x^{\alpha-1} e^{-\beta x}\,dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha-1} e^{-x(\beta - t)}\,dx.$$
Now we use the property that a p.d.f. integrates to 1 to obtain the formula
$$\int_0^{\infty} \frac{b^a}{\Gamma(a)}\,x^{a-1} e^{-bx}\,dx = 1,$$
which rearranges to
$$\int_0^{\infty} x^{a-1} e^{-bx}\,dx = \frac{\Gamma(a)}{b^a}.$$
So our moment generating function is
$$M_X(t) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha-1} e^{-x(\beta - t)}\,dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha)}{(\beta - t)^{\alpha}} = \left(\frac{\beta}{\beta - t}\right)^{\alpha}, \qquad t < \beta.$$

Normal
First consider $X \sim N(0, 1)$.
$$M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx - x^2/2}\,dx.$$
Completing the square, $tx - x^2/2 = t^2/2 - (x - t)^2/2$, so with the substitution $u = x - t$,
$$M_X(t) = \frac{e^{t^2/2}}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(x-t)^2/2}\,dx = \frac{e^{t^2/2}}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-u^2/2}\,du.$$
The integral we are left with is the famous Gaussian integral and equals $\sqrt{2\pi}$. Therefore,
$$M_X(t) = e^{t^2/2}.$$
Now consider a random variable Y with distribution $N(\mu, \sigma^2)$. Because the normal family is closed under location-scale transformations, we may write
$$Y = \mu + \sigma X.$$
We now compute the desired moment generating function:
$$M_Y(t) = E[e^{Yt}] = E[e^{(\mu + \sigma X)t}] = E[e^{\mu t} e^{\sigma t X}] = e^{\mu t} E[e^{\sigma t X}] = e^{\mu t} M_X(\sigma t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}.$$
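All three closed forms can be checked against the empirical MGF $\frac{1}{m}\sum_j e^{tX_j}$. A minimal sketch (not part of the original solution), with $t = 0.5$ and hypothetical parameters $\lambda = 2$, $(\alpha, \beta) = (3, 2)$, $(\mu, \sigma) = (1, 2)$:

```python
import math
import random

random.seed(1)
t, trials = 0.5, 200_000

def poisson(lam: float) -> int:
    """Knuth's Poisson sampler: multiply uniforms until falling below exp(-lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

lam, (alpha, beta), (mu, sigma) = 2.0, (3.0, 2.0), (1.0, 2.0)
emp_pois = sum(math.exp(t * poisson(lam)) for _ in range(trials)) / trials
# random.gammavariate takes a *scale* parameter, hence 1/beta for rate beta.
emp_gamma = sum(math.exp(t * random.gammavariate(alpha, 1 / beta)) for _ in range(trials)) / trials
emp_norm = sum(math.exp(t * random.gauss(mu, sigma)) for _ in range(trials)) / trials

print("Poisson:", emp_pois, "vs", math.exp(lam * (math.exp(t) - 1)))
print("Gamma:  ", emp_gamma, "vs", (beta / (beta - t)) ** alpha)
print("Normal: ", emp_norm, "vs", math.exp(mu * t + 0.5 * sigma**2 * t**2))
```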
