September 1, 2016
Proof.
($\Rightarrow$) Suppose $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$ but $\omega \notin \bigcap_{n=1}^{\infty}\bigcup_{i=n}^{\infty} A_i$. Then there exists a $j$ such that $\omega \notin \bigcup_{i=j}^{\infty} A_i$. But this implies $\omega$ can be in at most $j - 1$ of the $A_i$, a contradiction. Therefore, $\omega \in \bigcap_{n=1}^{\infty}\bigcup_{i=n}^{\infty} A_i$.
($\Leftarrow$) Assume $\omega \in \bigcap_{n=1}^{\infty}\bigcup_{i=n}^{\infty} A_i$. If $\omega$ does not belong to an infinite number of the events $A_1, A_2, \ldots$, then pick the last (according to the index) $A_i$ which contains $\omega$ and call it $A_T$. Then $\omega \notin A_k$ for all $k > T$. Hence $\omega \notin \bigcap_{n=1}^{\infty} B_n$, a contradiction. Therefore, $\omega$ belongs to an infinite number of the events $A_1, A_2, \ldots$.
(c) Show that $\omega \in \bigcup_{n=1}^{\infty} C_n$ if and only if $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events.
36-705 Intermediate Statistics: Homework 1
Proof.
($\Leftarrow$) Suppose that $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events. Pick the last $A_i$ that $\omega$ is not in and call the next one $A_{n_0}$. Then $\omega \in A_n$ for all $n \ge n_0$, so clearly $\omega \in \bigcap_{i=n_0}^{\infty} A_i$. Since this set is part of the union $\bigcup_{n=1}^{\infty} C_n$, we've shown $\omega \in \bigcup_{n=1}^{\infty} C_n$.
($\Rightarrow$) Suppose $\omega \in \bigcup_{n=1}^{\infty}\bigcap_{i=n}^{\infty} A_i$. Suppose further that no matter how high we run our index, we can always find an $A_i$ such that $\omega \notin A_i$. More precisely, for any $k \in \mathbb{N}$ there exists $j > k$ such that $\omega \notin A_j$. This of course implies $\omega \notin \bigcap_{i=n}^{\infty} A_i$ for any $n$. That is, $\omega \notin \bigcup_{n=1}^{\infty}\bigcap_{i=n}^{\infty} A_i$, a contradiction. Therefore, $\omega$ belongs to all the events $A_1, A_2, \ldots$ except possibly a finite number of those events.
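A concrete example, added purely for illustration (the events $E$ and $F$ are my own choice, not part of the original problem), shows how the two limits can differ:

```latex
% Added illustration: alternating events.
% Let $E$ and $F$ be fixed events; put $A_i = E$ for odd $i$ and $A_i = F$ for even $i$.
\[
  \bigcap_{n=1}^{\infty}\bigcup_{i=n}^{\infty} A_i = E \cup F,
  \qquad
  \bigcup_{n=1}^{\infty}\bigcap_{i=n}^{\infty} A_i = E \cap F .
\]
% An outcome lies in infinitely many $A_i$ iff it is in $E$ or $F$,
% but lies in all except finitely many $A_i$ iff it is in both.
```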
We want to show that
$$P\Big(\bigcap_{i=1}^{\infty} A_i\Big) = 1.$$
By De Morgan's laws, this is equivalent to
$$P\Big(\bigcup_{i=1}^{\infty} A_i^c\Big) = 0.$$
By countable subadditivity,
$$P\Big(\bigcup_{i=1}^{\infty} A_i^c\Big) \le \sum_{i=1}^{\infty} P(A_i^c) = \sum_{i=1}^{\infty} 0 = 0.$$
$$A = \{\underbrace{H \cdots H}_{n \text{ times}}\,T : n \ge 1\}, \qquad B = \{\underbrace{T \cdots T}_{n \text{ times}}\,H : n \ge 1\}.$$
Then $\Omega = A \cup B$.
$$P(\text{three tosses}) = 2 \cdot \frac{1}{2^3} = \frac{1}{4}.$$
And these observations, together with the fact that $P(X = x) = P(X \le x) - P(X < x)$, imply the desired result.
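As a sanity check on this arithmetic (a sketch of my own, not part of the original solution), the probability that the sequence ends at toss three can be estimated by simulation: the sequence ends at toss three exactly when the first two tosses agree and the third differs.

```python
import random

def tosses_until_switch(rng: random.Random) -> int:
    """Flip a fair coin until a flip differs from the first; return the total
    number of flips, so both HHT and TTH give 3."""
    first = rng.random() < 0.5
    count = 1
    while True:
        count += 1
        if (rng.random() < 0.5) != first:
            return count

rng = random.Random(0)
trials = 100_000
p_hat = sum(tosses_until_switch(rng) == 3 for _ in range(trials)) / trials
# p_hat should be close to 2 * (1/2)^3 = 1/4
```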
$$f_X(x) = \begin{cases} 1/4 & 0 < x < 1 \\ 3/8 & 3 < x < 5 \\ 0 & \text{otherwise.} \end{cases}$$
$$F_X(x) = \begin{cases} 0 & x < 0 \\ x/4 & 0 \le x < 1 \\ 1/4 & 1 \le x \le 3 \\ (3x - 7)/8 & 3 < x \le 5 \\ 1 & x > 5. \end{cases}$$
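The piecewise CDF can be coded directly and spot-checked at its breakpoints; a minimal sketch (the function name is mine):

```python
def F_X(x: float) -> float:
    """CDF obtained by integrating the piecewise density f_X above."""
    if x < 0:
        return 0.0
    if x < 1:
        return x / 4
    if x <= 3:
        return 0.25
    if x <= 5:
        return (3 * x - 7) / 8
    return 1.0
```

The pieces agree at the breakpoints ($F_X(1) = F_X(3) = 1/4$ and $F_X(5) = 1$), and the function is nondecreasing, as a CDF must be.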
(b) Let $Y = 1/X$. Find the probability density function $f_Y(y)$ for $Y$. Hint: Consider three cases: $\frac{1}{5} \le y \le \frac{1}{3}$, $\frac{1}{3} \le y \le 1$, and $y \ge 1$.
Following the hint, let us first consider the case when $y = 1/x \ge 1$, which occurs exactly when $0 < x \le 1$. Keep in mind $X$ takes values in the interval $(-\infty, 0]$ with probability 0. Now in this interval we have
$$f_X(x) = \frac{1}{4}, \qquad 0 < x < 1,$$
and $Y = g(X) = 1/X$ is indeed monotonic (therefore $h(y) = g^{-1}(y) = 1/y$), so we can directly employ the transformation formula for p.d.f.s. That is,
$$f_Y(y) = f_X(h(y)) \left|\frac{dh(y)}{dy}\right| = \frac{1}{4} \left|-\frac{1}{y^2}\right| = \frac{1}{4y^2}, \qquad y \ge 1.$$
Using the same method, we first note that $\frac{1}{5} \le y \le \frac{1}{3}$ corresponds exactly with $3 \le x \le 5$ (we can drop the endpoints since we're dealing with continuous probability here). For this interval we have
$$f_X(x) = \frac{3}{8}, \qquad 3 < x < 5,$$
and therefore
$$f_Y(y) = f_X(h(y)) \left|\frac{dh(y)}{dy}\right| = \frac{3}{8} \left|-\frac{1}{y^2}\right| = \frac{3}{8y^2}, \qquad \frac{1}{5} < y < \frac{1}{3}.$$
Finally, the remaining case $\frac{1}{3} < y < 1$ corresponds to $1 < x < 3$, where $f_X(x) = 0$, so $f_Y(y) = 0$ there.
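As a numeric check on the transformation (a sketch of my own, assuming nothing beyond the densities derived above), the two nonzero pieces of $f_Y$ should carry the same probability mass as the corresponding pieces of $f_X$: $1/4$ on $y \ge 1$ and $3/4$ on $(\frac{1}{5}, \frac{1}{3})$.

```python
def f_Y(y: float) -> float:
    """Density of Y = 1/X pieced together from the cases above."""
    if y >= 1:
        return 1 / (4 * y * y)
    if 1 / 5 < y < 1 / 3:
        return 3 / (8 * y * y)
    return 0.0

def midpoint(f, a, b, n=100_000):
    """Simple midpoint-rule quadrature."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

mass_tail = midpoint(f_Y, 1, 1_000)       # tail truncated at y = 1000
mass_mid = midpoint(f_Y, 1 / 5, 1 / 3)
# mass_tail ~ 1/4 and mass_mid ~ 3/4, matching P(0 < X < 1) and P(3 < X < 5)
```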
$$f_Z(z) = F_Z'(z) = \begin{cases} 2z & 0 < z < 1 \\ 0 & \text{otherwise.} \end{cases}$$
$$f_Z(z) = \begin{cases} 1 + z & \text{if } -1 \le z \le 0 \\ 1 - z & \text{if } 0 < z \le 1 \\ 0 & \text{otherwise.} \end{cases}$$
$$f_W(w) = \begin{cases} 1/2 & \text{if } 0 \le w \le 1 \\ 1/(2w^2) & \text{if } w > 1 \\ 0 & \text{otherwise.} \end{cases}$$
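As with the previous densities, $f_W$ should integrate to one: the flat piece contributes $1/2$ and the tail contributes $\int_1^{\infty} \frac{dw}{2w^2} = 1/2$. A quick numeric sketch (my own check, not part of the original solution):

```python
def f_W(w: float) -> float:
    """Piecewise density of W from above."""
    if 0 <= w <= 1:
        return 0.5
    if w > 1:
        return 1 / (2 * w * w)
    return 0.0

def midpoint(f, a, b, n=100_000):
    """Simple midpoint-rule quadrature."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Truncate the infinite tail at w = 10000; the neglected mass is 5e-5.
total_mass = midpoint(f_W, 0, 1) + midpoint(f_W, 1, 10_000)
```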
$$E(\bar{X}_n) = \mu, \qquad V(\bar{X}_n) = \frac{\sigma^2}{n}, \qquad E(S_n^2) = \sigma^2.$$
Proof.
$$E(\bar{X}_n) = E\Big(\frac{1}{n}\sum_{i=1}^{n} X_i\Big) = \frac{1}{n}\sum_{i=1}^{n} E(X_i) = \mu.$$
$$V(\bar{X}_n) = V\Big(\frac{1}{n}\sum_{i=1}^{n} X_i\Big) = \frac{1}{n^2}\sum_{i=1}^{n} V(X_i) = \frac{\sigma^2}{n}.$$
\begin{align*}
E(S_n^2) &= E\Big(\frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X}_n)^2\Big) \\
&= \frac{1}{n-1}\Big[\sum_{i=1}^{n} E(X_i^2) - 2\sum_{i=1}^{n} E(\bar{X}_n X_i) + \sum_{i=1}^{n} E(\bar{X}_n^2)\Big] \\
&= \frac{1}{n-1}\Big[n(\sigma^2 + \mu^2) - 2nE(\bar{X}_n^2) + nE(\bar{X}_n^2)\Big] \\
&= \frac{1}{n-1}\Big[n(\sigma^2 + \mu^2) - n(\sigma^2/n + \mu^2)\Big] \\
&= \frac{(n-1)\sigma^2}{n-1} \\
&= \sigma^2,
\end{align*}
where the third line uses $\sum_{i=1}^{n} E(\bar{X}_n X_i) = E\big(\bar{X}_n \sum_{i=1}^{n} X_i\big) = nE(\bar{X}_n^2)$.
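The identity $E(S_n^2) = \sigma^2$ is exactly why the sample variance divides by $n - 1$. A quick Monte Carlo check (a sketch; the normal distribution, sample size, and repetition count are arbitrary choices of mine):

```python
import random

def sample_variance(xs):
    """Unbiased sample variance: divide by n - 1, not n."""
    n = len(xs)
    xbar = sum(xs) / n
    return sum((x - xbar) ** 2 for x in xs) / (n - 1)

rng = random.Random(1)
sigma2 = 4.0               # true variance of N(0, 2^2)
n, reps = 5, 50_000
# Average S_n^2 over many small samples; it should hover near sigma^2
# even though each sample has only n = 5 points.
avg_s2 = sum(sample_variance([rng.gauss(0, 2) for _ in range(n)])
             for _ in range(reps)) / reps
```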
and let
$$Z = \begin{cases} 1 & a < X < 1 \\ 0 & \text{otherwise.} \end{cases}$$
(a) Are Y and Z independent? Why/Why not? Notice
$$P(Y = 1, Z = 1) = P(X \in (a, b)) = \frac{b - a}{1 - 0} = b - a,$$
and
$$P(Y = 1)P(Z = 1) = \frac{b - 0}{1 - 0} \cdot \frac{1 - a}{1 - 0} = b(1 - a),$$
so in general $P(Y = 1, Z = 1) \ne P(Y = 1)P(Z = 1)$, and $Y$ and $Z$ are not independent.
(b) Find $E(Y \mid Z)$. Hint: What values $z$ can $Z$ take? Now find $E(Y \mid Z = z)$.
$$E(Y \mid Z = 1) = 1 \cdot P(Y = 1 \mid Z = 1) + 0 \cdot P(Y = 0 \mid Z = 1) = \frac{b - a}{1 - a}.$$
$$E(Y \mid Z = 0) = 1 \cdot P(Y = 1 \mid Z = 0) + 0 \cdot P(Y = 0 \mid Z = 0) = 1.$$
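These conditional expectations can be checked by simulation for concrete, hypothetical values, say $a = 0.3$ and $b = 0.7$: with $X \sim \mathrm{Uniform}(0, 1)$, $Y$ indicating $0 < X < b$ and $Z$ indicating $a < X < 1$, matching the probabilities computed above.

```python
import random

a, b = 0.3, 0.7                  # hypothetical choices with 0 < a < b < 1
rng = random.Random(2)
n_z1 = n_z0 = y_in_z1 = y_in_z0 = 0
for _ in range(200_000):
    x = rng.random()             # X ~ Uniform(0, 1)
    y = 1 if 0 < x < b else 0
    z = 1 if a < x < 1 else 0
    if z:
        n_z1 += 1
        y_in_z1 += y
    else:
        n_z0 += 1
        y_in_z0 += y

e_y_given_z1 = y_in_z1 / n_z1    # should approach (b - a) / (1 - a)
e_y_given_z0 = y_in_z0 / n_z0    # should approach 1: Z = 0 forces X <= a < b
```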
Poisson
$$M_X(t) = E[e^{tX}] = \sum_{x=0}^{\infty} e^{tx} e^{-\lambda} \frac{\lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t},$$
so
$$M_X(t) = e^{\lambda(e^t - 1)}.$$
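The closed form can be verified against the defining sum for hypothetical values of $\lambda$ and $t$ (a sketch; the series is truncated where the terms become negligible, and each term is built from the previous one to avoid huge factorials):

```python
import math

lam, t = 2.5, 0.3                 # hypothetical parameter values
# Partial sum of E[e^{tX}] = sum_x e^{tx} * e^{-lam} * lam^x / x!.
# Term ratio: term_x / term_{x-1} = lam * e^t / x.
term = math.exp(-lam)             # x = 0 term
direct = term
for x in range(1, 100):
    term *= lam * math.exp(t) / x
    direct += term

closed_form = math.exp(lam * (math.exp(t) - 1))
```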
Gamma
$$M_X(t) = E[e^{tX}] = \int_0^{\infty} e^{tx} \frac{\beta^{\alpha}}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\beta x}\,dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha - 1} e^{-x(\beta - t)}\,dx.$$
Now we use the property that a p.d.f. integrates to 1 to obtain the formula
$$\int_0^{\infty} \frac{b^a}{\Gamma(a)} x^{a - 1} e^{-bx}\,dx = 1,$$
which rearranges to
$$\int_0^{\infty} x^{a - 1} e^{-bx}\,dx = \frac{\Gamma(a)}{b^a}.$$
So our moment generating function is
$$M_X(t) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha - 1} e^{-x(\beta - t)}\,dx = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha)}{(\beta - t)^{\alpha}} = \Big(\frac{\beta}{\beta - t}\Big)^{\alpha}, \qquad t < \beta.$$
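A Monte Carlo check of the Gamma MGF for hypothetical $\alpha$, $\beta$, and $t < \beta$ (a sketch; note that Python's `random.gammavariate(alpha, beta)` takes a *scale* parameter, so the rate $\beta$ here enters as scale $1/\beta$):

```python
import math
import random

alpha, beta = 2.0, 3.0               # hypothetical shape and rate
t = 1.0                              # any t < beta works
rng = random.Random(3)
n = 200_000
# Empirical E[e^{tX}] from Gamma(shape=alpha, rate=beta) draws.
mc = sum(math.exp(t * rng.gammavariate(alpha, 1 / beta)) for _ in range(n)) / n
closed_form = (beta / (beta - t)) ** alpha   # = (3/2)^2 = 2.25
```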
Normal
For $X \sim N(0, 1)$,
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} \frac{1}{\sqrt{2\pi}} e^{-x^2/2}\,dx = \frac{e^{t^2/2}}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(x - t)^2/2}\,dx,$$
where we completed the square: $tx - x^2/2 = t^2/2 - (x - t)^2/2$.
And now the integral we are left with is a famous form, the Gaussian integral, and is equal to $\sqrt{2\pi}$. Therefore,
$$M_X(t) = e^{t^2/2}.$$
Now consider a random variable $Y$ with distribution $N(\mu, \sigma^2)$. Using the fact that a normal variable is a location-scale transform of a standard normal, we can write
$$Y = \mu + \sigma X, \qquad X \sim N(0, 1).$$
We now compute the desired moment generating function:
$$M_Y(t) = E[e^{t(\mu + \sigma X)}] = e^{\mu t} M_X(\sigma t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}.$$
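A final simulation check of the $N(\mu, \sigma^2)$ MGF, with hypothetical values of $\mu$, $\sigma$, and $t$ (a sketch of my own, same pattern as the Gamma check):

```python
import math
import random

mu, sigma, t = 1.0, 2.0, 0.5         # hypothetical values
rng = random.Random(4)
n = 400_000
# Empirical E[e^{tY}] from N(mu, sigma^2) draws.
mc = sum(math.exp(t * rng.gauss(mu, sigma)) for _ in range(n)) / n
closed_form = math.exp(mu * t + 0.5 * sigma**2 * t**2)
```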