
JOINT PROBABILITY DISTRIBUTIONS
MODULE 4
Learning Objectives
After careful study of this chapter you should be able to do the following:

1. Use joint probability mass functions and joint probability density functions to calculate probabilities.

2. Calculate marginal and conditional probabilities from joint probability distributions.

3. Interpret and calculate covariances and correlations between random variables.

4. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables.

5. Determine the distribution of a general function of a random variable.


Joint Probability Mass Function for
Discrete Random Variables

• The joint probability mass function of the discrete random variables X and Y, denoted fXY(x, y), satisfies

1) fXY(x, y) ≥ 0

2) Σx Σy fXY(x, y) = 1

3) fXY(x, y) = P(X = x, Y = y)
EXAMPLE 1

• Two refills for a ballpoint pen are selected at random from a box that contains three blue refills, two red refills, and three green refills. If X is the number of blue refills and Y is the number of red refills selected, find (1) the joint probability function f(x, y), and (2) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.
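Since the two refills are drawn without replacement, the joint probabilities follow hypergeometric-style counting. A minimal Python sketch (the formula f(x, y) = C(3, x)·C(2, y)·C(3, 2 − x − y)/C(8, 2) is the standard counting argument, not spelled out on the slide):

```python
from fractions import Fraction
from math import comb

def f(x, y):
    """Joint pmf of Example 1: x blue and y red refills among the 2 drawn
    from 3 blue, 2 red, and 3 green (sampling without replacement)."""
    if x < 0 or y < 0 or x + y > 2:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

# Property 2: the joint pmf sums to 1 over its support.
total = sum(f(x, y) for x in range(3) for y in range(3))

# P[(X, Y) in A] for A = {(x, y) : x + y <= 1}.
p_A = sum(f(x, y) for x in range(3) for y in range(3) if x + y <= 1)
```

Exact arithmetic with Fraction gives P[(X, Y) ∈ A] = 3/28 + 9/28 + 3/14 = 9/14.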
Joint Probability Density Function for
Continuous Random Variables

• A joint probability density function for the continuous random variables X and Y, denoted fXY(x, y), satisfies the following properties:

(1) fXY(x, y) ≥ 0 for all x, y

(2) ∫−∞^∞ ∫−∞^∞ fXY(x, y) dx dy = 1

(3) For any region R of two-dimensional space, P((X, Y) ∈ R) = ∫∫R fXY(x, y) dx dy
EXAMPLE 2

• Consider the joint density function

f(x, y) = x(1 + 3y²)/4, 0 < x < 2, 0 < y < 1
        = 0, elsewhere.

1. Verify condition (2) of the properties of continuous joint probability distributions.

2. Find P[(X, Y) ∈ A], where A is the region {(x, y) | 0 < x < 1, 1/4 < y < 1/2}.
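Both parts of Example 2 can be checked symbolically; a sketch using sympy (an assumed dependency):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x * (1 + 3 * y**2) / 4  # joint density on 0 < x < 2, 0 < y < 1

# Condition (2): the density integrates to 1 over its support.
total = sp.integrate(f, (x, 0, 2), (y, 0, 1))

# P[(X, Y) in A] over the rectangle 0 < x < 1, 1/4 < y < 1/2.
p_A = sp.integrate(f, (x, 0, 1), (y, sp.Rational(1, 4), sp.Rational(1, 2)))
```

Integrating exactly gives P[(X, Y) ∈ A] = 23/512.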
Marginal Probability Density Function
If the joint probability density function of random variables X and Y is fXY(x, y), the marginal probability density functions of X and Y are

fX(x) = ∫y fXY(x, y) dy  and  fY(y) = ∫x fXY(x, y) dx

where the first integral is over all points in the range of (X, Y) for which X = x, and the second integral is over all points in the range of (X, Y) for which Y = y.
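As an illustration of these definitions, integrating out one variable of the Example 2 density recovers each marginal; a sympy sketch (direct integration gives fX(x) = x/2 and fY(y) = (1 + 3y²)/2):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x * (1 + 3 * y**2) / 4  # Example 2 density on 0 < x < 2, 0 < y < 1

f_X = sp.integrate(f, (y, 0, 1))  # integrate y out -> marginal of X
f_Y = sp.integrate(f, (x, 0, 2))  # integrate x out -> marginal of Y
```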
Conditional Probability Distribution of
Discrete Random Variable

f(y | x) = f(x, y) / g(x), g(x) > 0

f(x | y) = f(x, y) / h(y), h(y) > 0

where

g(x) = Σy f(x, y)

h(y) = Σx f(x, y)
Conditional Probability Distribution
of Continuous Random Variable
f(y | x) = f(x, y) / g(x), g(x) > 0

f(x | y) = f(x, y) / h(y), h(y) > 0

where

g(x) = ∫−∞^∞ f(x, y) dy

h(y) = ∫−∞^∞ f(x, y) dx
Test for Independence

• Let X and Y be two random variables, discrete or continuous, with joint probability distribution f(x, y) and marginal distributions g(x) and h(y), respectively. The random variables X and Y are said to be independent if and only if

f(x, y) = g(x)h(y) for all (x, y)


Mutually Statistically Independent

• Let X1, X2, …, Xn be n random variables, discrete or continuous, with joint probability distribution f(x1, x2, …, xn) and marginal distributions f1(x1), f2(x2), …, fn(xn), respectively. The random variables X1, X2, …, Xn are said to be mutually statistically independent if and only if

f(x1, x2, …, xn) = f1(x1) f2(x2) ⋯ fn(xn)
Example 3
• Suppose that X and Y have the following joint probability distribution:

[joint probability table of X and Y not recovered in extraction]

• Evaluate the marginal and conditional probability distributions and test for independence.
Example 4

• The joint density function of the random variables X and Y is given by

f(x, y) = 8xy, 0 < x < 1, 0 < y < x
        = 0, elsewhere.

• Find g(x), h(y), f(y | x), and P(Y < 1/8 | X = 1/2).
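A sympy sketch of Example 4, following the conditional-density formulas above (sympy is an assumed dependency):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 8 * x * y  # joint density on 0 < y < x < 1

g = sp.integrate(f, (y, 0, x))         # marginal of X
h = sp.integrate(f, (x, y, 1))         # marginal of Y
f_y_given_x = sp.simplify(f / g)       # f(y | x) = f(x, y) / g(x)

# P(Y < 1/8 | X = 1/2): integrate the conditional density at x = 1/2.
p = sp.integrate(f_y_given_x.subs(x, sp.Rational(1, 2)),
                 (y, 0, sp.Rational(1, 8)))
```

This yields g(x) = 4x³, h(y) = 4y(1 − y²), f(y | x) = 2y/x², and P(Y < 1/8 | X = 1/2) = 1/16.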


Example 5

• Let X1, X2, and X3 be three mutually statistically independent random variables, and let each have probability density function

[density function and question not recovered in extraction]
Conditional Mean and Variance

• The conditional mean of Y given X = x, denoted E(Y | x) or μY|x, is

E(Y | x) = ∫y y fY|x(y) dy

• and the conditional variance of Y given X = x, denoted V(Y | x) or σ²Y|x, is

V(Y | x) = ∫y (y − μY|x)² fY|x(y) dy = ∫y y² fY|x(y) dy − μ²Y|x
Example 6

• Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). Each random variable measures the wait from a common starting time, and X < Y. Assume that the joint probability density function for X and Y is

fXY(x, y) = 6 × 10⁻⁶ exp(−0.001x − 0.002y) for x < y

• Determine the conditional mean for Y given that x = 5000.
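Example 6 can be worked symbolically: compute the marginal of X, divide to get the conditional density, then take the conditional expectation. A sympy sketch (exact rationals replace the decimal rates, since 0.001 = 1/1000 and 0.002 = 1/500):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.Rational(6, 10**6) * sp.exp(-x / 1000 - y / 500)  # for 0 < x < y

f_X = sp.integrate(f, (y, x, sp.oo))   # marginal of X
f_cond = sp.simplify(f / f_X)          # conditional density f(y | x)

# Conditional mean of Y at x = 5000.
E_Y = sp.integrate(y * f_cond.subs(x, 5000), (y, 5000, sp.oo))
```

The conditional density is exponential in y − x with mean 500, so E(Y | x = 5000) = 5500.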


Expected Value of a Function of Two
Random Variables

E[h(X, Y)] = Σx Σy h(x, y) fXY(x, y)       X, Y discrete

E[h(X, Y)] = ∫∫ h(x, y) fXY(x, y) dx dy    X, Y continuous
Covariance

• The covariance between random variables X and Y, denoted cov(X, Y) or σXY, is

σXY = E[(X − μX)(Y − μY)] = E(XY) − μX μY
Correlation

• The correlation between random variables X and Y, denoted ρXY, is

ρXY = cov(X, Y) / √(V(X) V(Y)) = σXY / (σX σY)

• For any two random variables X and Y, −1 ≤ ρXY ≤ +1.

• If X and Y are independent random variables, σXY = ρXY = 0.
Example 7

• Given the following joint probability distribution of the number of refills, where X is the number of blue refills and Y is the number of red refills:

            x
        0     1     2
y  0  3/28  9/28  3/28
   1  3/14  3/14
   2  1/28

• Calculate the expected values, covariance, and correlation of X and Y.
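A sketch of Example 7 in exact arithmetic, reading the joint pmf from the table (empty cells are zero):

```python
from fractions import Fraction as F

# Joint pmf of Example 7, keyed by (x, y); missing cells are zero.
f = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
     (0, 1): F(3, 14), (1, 1): F(3, 14),
     (0, 2): F(1, 28)}

EX  = sum(x * p for (x, y), p in f.items())
EY  = sum(y * p for (x, y), p in f.items())
EXY = sum(x * y * p for (x, y), p in f.items())
cov = EXY - EX * EY                     # sigma_XY = E(XY) - E(X)E(Y)

VX = sum(x**2 * p for (x, y), p in f.items()) - EX**2
VY = sum(y**2 * p for (x, y), p in f.items()) - EY**2
rho = float(cov) / (float(VX) ** 0.5 * float(VY) ** 0.5)
```

This gives E(X) = 3/4, E(Y) = 1/2, σXY = −9/56, and ρXY = −1/√5 ≈ −0.447.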
Example 8

• Let the random variables X and Y have the joint probability distribution

f(x, y) = 2, 0 < x < y, 0 < y < 1
        = 0, elsewhere.

• Find σXY.
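A sympy check of Example 8, integrating x first over 0 < x < y:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 2  # joint density on the triangle 0 < x < y < 1

EX  = sp.integrate(x * f, (x, 0, y), (y, 0, 1))
EY  = sp.integrate(y * f, (x, 0, y), (y, 0, 1))
EXY = sp.integrate(x * y * f, (x, 0, y), (y, 0, 1))
cov = EXY - EX * EY
```

E(X) = 1/3, E(Y) = 2/3, and E(XY) = 1/4, so σXY = 1/4 − 2/9 = 1/36.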
Linear Functions of Random Variables

• Given random variables X1, X2, …, Xp and constants c1, c2, …, cp,

Y = c1X1 + c2X2 + … + cpXp

• is a linear combination of X1, X2, …, Xp.
Mean of a Linear Function

• If Y = c1X1 + c2X2 + … + cpXp, then

E(Y) = c1E(X1) + c2E(X2) + … + cpE(Xp)

Variance of a Linear Function

• If X1, X2, …, Xp are random variables and Y = c1X1 + c2X2 + … + cpXp, then in general,

V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp) + 2 Σi<j ci cj cov(Xi, Xj)

• If X1, X2, …, Xp are independent,

V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp)
Example 9

• A semiconductor product consists of three layers. If the variances in thickness of the first, second, and third layers are 15, 40, and 30 nanometers squared, what is the variance of the thickness of the final product?
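Assuming the layer thicknesses are independent (the usual reading of this example), the final thickness is a sum with c1 = c2 = c3 = 1, so the variances simply add:

```python
# Final thickness T = X1 + X2 + X3 with independent layers:
# V(T) = V(X1) + V(X2) + V(X3).
layer_vars = [15, 40, 30]          # nanometers squared
var_total = sum(layer_vars)        # 85 nm^2
sd_total = var_total ** 0.5        # standard deviation in nm
```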
Mean and Variance of an Average

If X̄ = (X1 + X2 + … + Xp)/p with E(Xi) = μ for i = 1, 2, …, p, then

E(X̄) = μ

If X1, X2, …, Xp are independent with V(Xi) = σ² for i = 1, 2, …, p, then

V(X̄) = σ²/p
Reproductive Property of the Normal
Distribution

• If X1, X2, …, Xp are independent normal random variables with E(Xi) = μi and V(Xi) = σi² for i = 1, 2, …, p, then

Y = c1X1 + c2X2 + … + cpXp

is a normal random variable with

E(Y) = c1μ1 + c2μ2 + … + cpμp

and

V(Y) = c1²σ1² + c2²σ2² + … + cp²σp²


Example 10

• Let the random variables X1 and X2 denote the length and width, respectively, of a manufactured part. Assume that X1 is normal with E(X1) = 2 centimeters and standard deviation 0.1 centimeter, and that X2 is normal with E(X2) = 5 centimeters and standard deviation 0.2 centimeter. Also assume that X1 and X2 are independent. Determine the probability that the perimeter exceeds 14.5 centimeters.
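By the reproductive property, the perimeter Y = 2X1 + 2X2 is normal with mean 2·2 + 2·5 = 14 and variance 4(0.1²) + 4(0.2²) = 0.2. A sketch using only the standard library (standard normal cdf via math.erf):

```python
from math import erf, sqrt

mu = 2 * 2.0 + 2 * 5.0                 # E(Y) = 14 cm
var = 4 * 0.1**2 + 4 * 0.2**2          # V(Y) = 0.2 cm^2
sd = sqrt(var)

# P(Y > 14.5) = 1 - Phi(z), with Phi the standard normal cdf.
z = (14.5 - mu) / sd
p = 0.5 * (1 - erf(z / sqrt(2)))
```

Here z ≈ 1.118, so P(Y > 14.5) ≈ 0.13.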
General Function of a Discrete Random
Variable

• Suppose that X is a discrete random variable with probability distribution fX(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y). Then the probability mass function of the random variable Y is

fY(y) = fX[u(y)]
Example 11

• Let X be a geometric random variable with probability distribution

fX(x) = p(1 − p)^(x−1), x = 1, 2, …

• Find the probability distribution of Y = X².
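Because x takes only the values 1, 2, …, y = x² is one-to-one with inverse x = √y, so fY(y) = p(1 − p)^(√y − 1) on y = 1, 4, 9, …. A numeric sketch (p = 0.25 is an arbitrary illustrative value):

```python
from math import isqrt

P = 0.25  # illustrative value of the geometric parameter

def f_X(x):
    return P * (1 - P) ** (x - 1)      # geometric pmf, x = 1, 2, ...

def f_Y(y):
    # One-to-one transform Y = X**2: inverse u(y) = sqrt(y),
    # so f_Y(y) = f_X(sqrt(y)) on the perfect squares y = 1, 4, 9, ...
    r = isqrt(y)
    if r < 1 or r * r != y:
        return 0.0
    return f_X(r)

# The transformed pmf still sums to 1 (partial sum over y = 1..199**2).
total = sum(f_Y(k * k) for k in range(1, 200))
```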


General Function of a Continuous
Random Variable

• Suppose that X is a continuous random variable with probability distribution fX(x). The function Y = h(X) is a one-to-one transformation between the values of Y and X, so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y). The probability distribution of Y is

fY(y) = fX[u(y)] |J|

• where J = u′(y) is called the Jacobian of the transformation, and the absolute value of J is used.
Example 12

• Let X be a continuous random variable with probability distribution

fX(x) = x/8, 0 ≤ x ≤ 4

• Find the probability distribution of Y = h(X) = 2X + 4.
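Applying the Jacobian formula above to Example 12: the inverse of y = 2x + 4 is u(y) = (y − 4)/2, so |J| = 1/2 and fY(y) = (y − 4)/32 on 4 ≤ y ≤ 12. A sympy check:

```python
import sympy as sp

x, y = sp.symbols('x y')
f_X = x / 8                        # density of X on 0 <= x <= 4
u = (y - 4) / 2                    # inverse transformation x = u(y)
J = sp.diff(u, y)                  # Jacobian u'(y) = 1/2
f_Y = f_X.subs(x, u) * sp.Abs(J)   # (y - 4)/32 on 4 <= y <= 12

total = sp.integrate(f_Y, (y, 4, 12))  # should integrate to 1
```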
