P(Y = y_j) = Σ_i P(X = x_i, Y = y_j).

Note that P(X = x_i) ≥ 0 and Σ_i P(X = x_i) = 1. The mean and variance of
X can be defined in the usual way.
The conditional distribution of X given Y = y_j is defined by the probability
function

P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j).
The conditional mean of X given Y = y_j is defined by

E[X | Y = y_j] = Σ_i x_i P(X = x_i | Y = y_j).
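The definitions above can be illustrated numerically. The following is a minimal Python sketch that computes the marginal, the conditional distribution, and the conditional mean from a small joint probability table; the table values are illustrative assumptions, not taken from the text.

```python
# Marginal, conditional distribution, and conditional mean computed
# from a small (assumed) joint probability table, using exact fractions.
from fractions import Fraction as F

# Joint pmf P(X = x, Y = y) stored as {(x, y): probability}
joint = {
    (0, 0): F(1, 8), (0, 1): F(1, 4),
    (1, 0): F(1, 4), (1, 1): F(3, 8),
}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginal: P(Y = y_j) = sum_i P(X = x_i, Y = y_j)
p_y = {y: sum(joint.get((x, y), F(0)) for x in xs) for y in ys}

# Conditional: P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j)
cond = {(x, y): joint.get((x, y), F(0)) / p_y[y] for x in xs for y in ys}

# Conditional mean: E[X | Y = y_j] = sum_i x_i P(X = x_i | Y = y_j)
cond_mean = {y: sum(x * cond[(x, y)] for x in xs) for y in ys}

print(p_y)
print(cond_mean)
```

For this table, P(Y = 0) = 3/8 and E[X | Y = 0] = 2/3, which the reader can confirm by hand from the definitions.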
CHAPTER 6. MORE THAN ONE VARIABLE
where

E[XY] = Σ_i Σ_j x_i y_j P(X = x_i, Y = y_j).
The result obtained next provides the range of values of the covariance of two
r.v.s; it is also referred to as a version of the Cauchy-Schwarz inequality.

1. Consider the r.v.s X and Y with E[X] = E[Y] = 0 and Var[X] = Var[Y] = 1.
Then always -1 ≤ E[XY] ≤ 1; moreover, E[XY] = 1 if and only if
P(X = Y) = 1, and E[XY] = -1 if and only if P(X = -Y) = 1.
2. For any r.v.s X and Y with finite expectations and positive variances σ_X²
and σ_Y², it always holds that -σ_X σ_Y ≤ Cov(X, Y) ≤ σ_X σ_Y;
moreover, Cov(X, Y) = σ_X σ_Y if and only if
P[Y = E[Y] + (σ_Y/σ_X)(X - E[X])] = 1, and Cov(X, Y) = -σ_X σ_Y if and
only if P[Y = E[Y] - (σ_Y/σ_X)(X - E[X])] = 1.
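The bound in part 2 can be checked numerically in the equivalent squared form Cov(X, Y)² ≤ Var[X] · Var[Y], which avoids square roots and so allows exact fractions. A short sketch, on an assumed joint pmf:

```python
# Checking Cov(X, Y)^2 <= Var[X] * Var[Y] (the Cauchy-Schwarz bound,
# squared) on an illustrative joint pmf, with exact fractions.
from fractions import Fraction as F

joint = {(0, 0): F(1, 4), (0, 1): F(1, 4),
         (1, 0): F(1, 8), (1, 1): F(3, 8)}

def expect(f):
    """E[f(X, Y)] under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in joint.items())

ex = expect(lambda x, y: x)
ey = expect(lambda x, y: y)
var_x = expect(lambda x, y: (x - ex) ** 2)
var_y = expect(lambda x, y: (y - ey) ** 2)
cov = expect(lambda x, y: (x - ex) * (y - ey))

print(cov, var_x, var_y)
assert cov ** 2 <= var_x * var_y
```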
Example 6.1
Let X and Y be two r.v.s with finite expectations and equal (finite) variances,
and set U = X + Y and V = X - Y. Determine whether the r.v.s U and V are correlated.
Solution
For two r.v.s X and Y with finite expectations, and (positive) standard devia-
tions σ_X and σ_Y, it holds:

Var(X + Y) = σ_X² + σ_Y² + 2 Cov(X, Y)

and

Var(X - Y) = σ_X² + σ_Y² - 2 Cov(X, Y).

By the bilinearity of covariance,

Cov(U, V) = Cov(X + Y, X - Y) = Var[X] - Var[Y] = 0,

since the two variances are equal; hence U and V are uncorrelated.

Recall that X and Y are independent if and only if

P(X = x_i, Y = y_j) = P(X = x_i) P(Y = y_j)

for all i and j.
If X and Y are independent then Cov[X, Y ] = 0. The converse is NOT true.
There exist many pairs of random variables with Cov[X, Y ] = 0 which are not
independent.
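The standard counterexample can be verified in a few lines: take X uniform on {-1, 0, 1} and Y = X². A minimal sketch:

```python
# X uniform on {-1, 0, 1} and Y = X^2 are uncorrelated (Cov = 0)
# yet clearly dependent.
from fractions import Fraction as F

xs = [-1, 0, 1]
p = F(1, 3)                              # P(X = x) for each x

ex  = sum(p * x for x in xs)             # E[X]  = 0
ey  = sum(p * x ** 2 for x in xs)        # E[Y]  = E[X^2] = 2/3
exy = sum(p * x ** 3 for x in xs)        # E[XY] = E[X^3] = 0
cov = exy - ex * ey
print(cov)                               # 0, so X and Y are uncorrelated

# But they are not independent:
# P(X = 0, Y = 0) = 1/3 while P(X = 0) * P(Y = 0) = (1/3)(1/3) = 1/9.
assert cov == 0
```

Dependence is immediate: knowing X determines Y exactly, yet the covariance vanishes because the distribution of X is symmetric about zero.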
Example 6.2
A fair die is thrown three times. The result of the first throw is scored as X1 = 1
if the die shows 5 or 6 and X1 = 0 otherwise; X2 and X3 are scored likewise for
the second and third throws.
Let Y1 = X1 + X2 and Y2 = X1 - X3.
Show that P(Y1 = 0, Y2 = -1) = 4/27. Calculate the remaining probabilities in
the bivariate distribution of the pair (Y1, Y2) and display the joint probabilities in
an appropriate table.
Solution

              Y1
           0      1      2       P
    -1    4/27   2/27    0      6/27
Y2   0    8/27   6/27   1/27   15/27
     1     0     4/27   2/27    6/27
  P      12/27  12/27   3/27     1

1. Marginal probability distribution of Y1:

y1             0      1      2
P(Y1 = y1)   12/27  12/27   3/27
Marginal probability distribution of Y2:

y2             -1     0      1
P(Y2 = y2)    6/27  15/27   6/27
2.

E[Y1] = 0 · 12/27 + 1 · 12/27 + 2 · 3/27 = 2/3

E[Y1²] = 0² · 12/27 + 1² · 12/27 + 2² · 3/27 = 8/9

Var[Y1] = E[Y1²] - (E[Y1])² = 8/9 - (2/3)² = 4/9
E[Y2] = (-1) · 6/27 + 0 · 15/27 + 1 · 6/27 = 0

E[Y2²] = (-1)² · 6/27 + 0² · 15/27 + 1² · 6/27 = 4/9

Var[Y2] = E[Y2²] - (E[Y2])² = 4/9
3. Since E[Y2] = 0,

Cov[Y1, Y2] = E[Y1 Y2] - E[Y1] E[Y2] = E[Y1 Y2]
            = 1 · (-1) · 2/27 + 1 · 1 · 4/27 + 2 · 1 · 2/27 = 2/9
4.

P(Y1 = 0 | Y2 = 0) = P(Y1 = 0, Y2 = 0) / P(Y2 = 0) = (8/27) / (15/27) = 8/15,

P(Y1 = 1 | Y2 = 0) = P(Y1 = 1, Y2 = 0) / P(Y2 = 0) = (6/27) / (15/27) = 6/15,

P(Y1 = 2 | Y2 = 0) = P(Y1 = 2, Y2 = 0) / P(Y2 = 0) = (1/27) / (15/27) = 1/15.
5.

E[Y1 | Y2 = 0] = 0 · 8/15 + 1 · 6/15 + 2 · 1/15 = 8/15
Exercises
Exercise 6.1
Solution
               X
         -2    -1    0    1     2      P
     1    2c    0    0   2c    6c    10c
Y    2    6c    c    0   3c   10c    20c
     3   10c   2c    0   4c   14c    30c
  P      18c   3c    0   9c   30c    60c

Since the probabilities must sum to one, 60c = 1, so c = 1/60.

P(X > 0) = 9/60 + 30/60 = 39/60

P(X + Y = 0) = P(X = -2, Y = 2) + P(X = -1, Y = 1) = 6/60 + 0 = 1/10
Marginal distribution for X:

x           -2     -1    0    1     2
P(X = x)   18/60   3/60  0  9/60  30/60

Marginal distribution for Y:

y            1      2      3
P(Y = y)   10/60  20/60  30/60
E[X] = -2 · 18/60 - 1 · 3/60 + 0 · 0 + 1 · 9/60 + 2 · 30/60 = 30/60 = 1/2

E[X²] = (-2)² · 18/60 + (-1)² · 3/60 + 0² · 0 + 1² · 9/60 + 2² · 30/60 = 3.4

Var[X] = E[X²] - (E[X])² = 3.4 - 0.5² = 3.15

E[Y] = 1 · 1/6 + 2 · 1/3 + 3 · 1/2 = 14/6 = 7/3

E[Y²] = 1² · 1/6 + 2² · 1/3 + 3² · 1/2 = 36/6 = 6.0

Var[Y] = E[Y²] - (E[Y])² = 6.0 - (7/3)² = 5/9
Z = X + Y:

z           -1    0     1     2     3      4      5
P(Z = z)   2/60  6/60  11/60  4/60  9/60  14/60  14/60

E[Z] = (1/60)(-1 · 2 + 0 · 6 + 1 · 11 + 2 · 4 + 3 · 9 + 4 · 14 + 5 · 14)
     = 170/60 = 17/6 = 1/2 + 7/3 = E[X] + E[Y]
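The arithmetic above can be verified with exact fractions; a short sketch using the table entries from the solution (in units of c):

```python
# Verifying the Exercise 6.1 arithmetic with exact fractions, using
# the joint table from the solution above (entries in units of c).
from fractions import Fraction as F

c = F(1, 60)
xs = [-2, -1, 0, 1, 2]                  # columns
ys = [1, 2, 3]                          # rows
table = [[2, 0, 0, 2, 6],
         [6, 1, 0, 3, 10],
         [10, 2, 0, 4, 14]]
joint = {(x, y): c * table[j][i]
         for j, y in enumerate(ys) for i, x in enumerate(xs)}

assert sum(joint.values()) == 1         # confirms c = 1/60

p_x_pos = sum(p for (x, _), p in joint.items() if x > 0)
e_x = sum(p * x for (x, _), p in joint.items())
e_y = sum(p * y for (_, y), p in joint.items())
e_z = sum(p * (x + y) for (x, y), p in joint.items())
print(p_x_pos, e_x, e_y, e_z)           # 13/20, 1/2, 7/3, 17/6
assert e_z == e_x + e_y                 # E[X + Y] = E[X] + E[Y]
```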
Exercise 6.2
The following experiment is carried out. Three fair coins are tossed. Any coins
showing heads are removed and the remaining coins are tossed. Let X be the
number of heads on the first toss and Y the number of heads on the second toss.
Note that if X = 3 then Y = 0. Find the joint probability function and marginal
distributions of X and Y .
Solution

                  X
           0      1      2      3       P
     0    1/64   6/64  12/64   8/64   27/64
Y    1    3/64  12/64  12/64    0     27/64
     2    3/64   6/64    0      0      9/64
     3    1/64    0      0      0      1/64
  P       1/8    3/8    3/8    1/8      1
Marginal distribution for X:
x 0 1 2 3
P (X = x) 1/8 3/8 3/8 1/8
Marginal distribution for Y :
y 0 1 2 3
P (Y = y) 27/64 27/64 9/64 1/64
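The table is easy to rebuild programmatically: given X = x heads on the first toss, 3 - x coins remain, so Y given X = x is Binomial(3 - x, 1/2). A minimal sketch:

```python
# Rebuilding the Exercise 6.2 joint table: X ~ Binomial(3, 1/2) and,
# given X = x, the 3 - x remaining coins give Y ~ Binomial(3 - x, 1/2).
from fractions import Fraction as F
from math import comb

joint = {}
for x in range(4):
    p_x = F(comb(3, x), 2 ** 3)                       # P(X = x)
    for y in range(3 - x + 1):
        joint[(x, y)] = p_x * F(comb(3 - x, y), 2 ** (3 - x))

assert sum(joint.values()) == 1
assert joint[(3, 0)] == F(8, 64)                      # X = 3 forces Y = 0

# Marginal of Y: P(Y = y) = sum_x P(X = x, Y = y)
p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in range(4)}
print(p_y)   # values 27/64, 27/64, 9/64, 1/64, matching the table
```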