
ESE 326 Probability and Statistics for Engineering

Lecture 12
Introduction to random vectors. Joint probability distribution function

Outcomes of the lecture:


- joint discrete probability density function and its properties;
- joint continuous probability density function and its properties;
- marginal density functions and joint pdf;
- independent random variables;

So far we have considered a single random variable X: discrete or continuous. Now our focus will be on a random vector (X1, X2, ..., Xn) as a
collection of n single random variables, where n ≥ 1 is an integer. For
simplicity we consider the case n = 2, though everything said will apply
to the general case. We'll also use the notation (X, Y) instead of (X1, X2).
We start first with the discrete case, when X and Y are two discrete random variables (drvs).

Definition 1. The function

fXY(x, y) := P(X = x, Y = y),

where x and y run through all possible values of X and Y, respectively,
is called the joint probability density function of (X, Y), in short a joint
pdf.
Properties of a discrete joint pdf fXY(x, y):
1) 0 ≤ fXY(x, y) ≤ 1 for all values (x, y);
2) Σ_{x,y} fXY(x, y) = 1;
3) fX(x) = Σ_y fXY(x, y);
4) fY(y) = Σ_x fXY(x, y);

Properties 1) and 2) are characteristic properties of a joint pdf.


Properties 3) and 4) show how pdfs of single random variables (also called
marginal probability density functions) can be obtained from their joint
pdf.
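As a quick numerical sketch (not part of the lecture), the two characteristic properties and the marginal sums can be checked for any concrete joint pmf; the small table below is a made-up example, stored with exact fractions to avoid floating-point noise:

```python
from fractions import Fraction as F

# A made-up discrete joint pmf, stored as {(x, y): probability}.
joint = {
    (0, 0): F(1, 10), (0, 1): F(2, 10),
    (1, 0): F(3, 10), (1, 1): F(4, 10),
}

# Property 1): every value lies in [0, 1].
assert all(0 <= p <= 1 for p in joint.values())

# Property 2): the probabilities sum to 1.
assert sum(joint.values()) == 1

# Properties 3) and 4): marginals are obtained by summing out the other variable.
xs = {x for x, _ in joint}
ys = {y for _, y in joint}
f_X = {x: sum(p for (xx, yy), p in joint.items() if xx == x) for x in xs}
f_Y = {y: sum(p for (xx, yy), p in joint.items() if yy == y) for y in ys}

print(f_X[0], f_X[1])  # 3/10 7/10
print(f_Y[0], f_Y[1])  # 2/5 3/5
```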
Indeed, since S = ∪_y {Y = y}, where y runs through all possible values of
the random variable Y, one can write that:

fX(x) := P(X = x) = P({X = x} ∩ S) = P(X = x, ∪_y {Y = y}) = Σ_y P(X = x, Y = y) = Σ_y fXY(x, y).

Similarly for the other variable.


Definition 2. Let X and Y be two continuous random variables. Then,
a function fXY(x, y) is said to be the joint pdf for (X, Y) if for any real
numbers a < b and c < d it holds:

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d fXY(x, y) dy dx.

Properties of a continuous joint pdf fXY(x, y):

1) fXY(x, y) ≥ 0 for all values (x, y);
2) ∫∫ fXY(x, y) dx dy = 1;
3) fX(x) = ∫ fXY(x, y) dy;
4) fY(y) = ∫ fXY(x, y) dx;
The next definition logically generalizes the concept of independent events
to the case of random variables.
Definition 3. Two random variables X and Y are said to be independent
if and only if it holds
fXY (x, y) = fX (x)fY (y)
for all possible values of x and y.
If (X, Y) is a discrete vector, then the above definition is equivalent to

P(X = x, Y = y) = P(X = x)P(Y = y),

indicating that independence of X and Y means that X and
Y take their values independently.
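For instance (a standard illustration, not from the notes), two fair coin flips have joint pmf 1/4 at each of the four outcomes, which factors into the product of the marginals, so the flips are independent:

```python
from fractions import Fraction as F
from itertools import product

# Joint pmf of two fair coin flips (0 = tails, 1 = heads): 1/4 per outcome.
joint = {(x, y): F(1, 4) for x, y in product((0, 1), repeat=2)}

# Marginals: each coin on its own is fair, so fX(x) = fY(y) = 1/2.
f_X = {x: sum(joint[(x, y)] for y in (0, 1)) for x in (0, 1)}
f_Y = {y: sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)}

# Independence: fXY(x, y) = fX(x) fY(y) for every pair of values.
independent = all(joint[(x, y)] == f_X[x] * f_Y[y] for (x, y) in joint)
print(independent)  # True
```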
Example 1. The joint pdf for a vector (X, Y) is given by

fXY(x, y) = 2 / (n(n + 1)),  1 ≤ y ≤ x ≤ n,

where n is a positive integer.


a) Verify that fXY (x, y) satisfies the conditions necessary to be a density;
b) Find the marginal densities for X and Y ;
Solution: a) Since (X, Y) is a discrete random vector, we have to verify
two properties:
i) 0 ≤ fXY(x, y) ≤ 1 for all possible values, which is clear because 0 ≤ 2/(n(n+1)) ≤ 1 for any fixed positive integer n;
ii) Σ_{x,y} fXY(x, y) = 1. For the last property, we calculate:

Σ_{x,y} fXY(x, y) = Σ_{x,y} 2/(n(n+1)) = (2/(n(n+1))) Σ_{x,y} 1.

To find the value of Σ_{x,y} 1, where 1 ≤ y ≤ x ≤ n, we calculate the double
sum as a repeated sum (similar to how we calculate the value of a double integral
by integrating first in one variable and then in the other), yielding

Σ_{x,y} 1 = Σ_{x=1}^n Σ_{y=1}^x 1 = Σ_{x=1}^n x = 1 + 2 + ... + n = n(n + 1)/2.

It implies that

Σ_{x,y} fXY(x, y) = (2/(n(n+1))) Σ_{x,y} 1 = (2/(n(n+1))) · n(n+1)/2 = 1.

b) To calculate the marginal densities, we have to use the definition of a
marginal density plus the argument we used in a) for calculating a double
sum. For a fixed value of x, where x = 1, 2, ..., n, we have that

fX(x) = Σ_y fXY(x, y) = (2/(n(n+1))) Σ_{y=1}^x 1 = 2x / (n(n + 1)).

Similarly, for a fixed value of y, where y = 1, 2, ..., n, we have that

fY(y) = Σ_x fXY(x, y) = (2/(n(n+1))) Σ_{x=y}^n 1 = (2/(n(n+1))) (n − (y − 1)) = 2(n − y + 1) / (n(n + 1)).
Example 2. The joint density for (X, Y) is given by

fXY(x, y) = x³y³/16,  0 ≤ x ≤ 2, 0 ≤ y ≤ 2.

a) Find the marginal densities for X and Y;
b) Are X and Y independent?
c) Find P(X ≤ 1).
Solution: The problem is similar to Example 1, with the difference that the
vector (X, Y) here is continuous, so where needed we have to
use integration instead of summation.
a) By definition, for any fixed 0 ≤ x ≤ 2, one has that

fX(x) = ∫_0^2 fXY(x, y) dy = ∫_0^2 x³y³/16 dy = x³ · (y⁴/64)|_0^2 = x³/4.
By symmetry,

fY(y) = ∫_0^2 fXY(x, y) dx = y³/4,  0 ≤ y ≤ 2.
b) By the definition of independence, it must hold for all possible values
of x and y that

fXY(x, y) = fX(x)fY(y),

which is obviously the case here, since fX(x)fY(y) = (x³/4)(y³/4) = x³y³/16 = fXY(x, y). Thus, X and Y are independent.
c) P(X ≤ 1) = ∫_0^1 fX(x) dx = ∫_0^1 x³/4 dx = (x⁴/16)|_0^1 = 1/16.
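The answers in Example 2 can also be checked numerically with a crude stdlib-only midpoint rule (a sketch, not part of the lecture; the helper `double_integral` and the grid size `n=400` are my own choices):

```python
import math

# Joint density of Example 2: fXY(x, y) = x^3 y^3 / 16 on [0, 2] x [0, 2].
def f_xy(x, y):
    return x**3 * y**3 / 16

# Midpoint-rule approximation of a double integral over [ax, bx] x [ay, by].
def double_integral(f, ax, bx, ay, by, n=400):
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

# Characteristic property: the density integrates to 1 over the whole square.
assert math.isclose(double_integral(f_xy, 0, 2, 0, 2), 1.0, rel_tol=1e-4)

# P(X <= 1) = integral of fXY over [0, 1] x [0, 2] = 1/16.
p = double_integral(f_xy, 0, 1, 0, 2)
print(round(p, 4))  # 0.0625
```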
Numerical characteristics of a random vector:
First, we state an important observation: For any real-valued function of
two variables h(x, y) it holds:

E(h(X, Y)) = ∫∫ h(x, y) fXY(x, y) dx dy

in a continuous case, or

E(h(X, Y)) = Σ_{x,y} h(x, y) fXY(x, y)

in a discrete case.
In particular: If A is any subset of IR² (not necessarily a rectangle of
the form [a, b] × [c, d]), then

P((X, Y) ∈ A) = ∫∫_A fXY(x, y) dx dy

or

P((X, Y) ∈ A) = Σ_{(x,y)∈A} fXY(x, y).
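As a small illustration (my own example, reusing the pmf of Example 1), both formulas reduce to plain double sums in the discrete case: E(XY) with h(x, y) = xy, and P(X = Y) for the non-rectangular set A = {(x, y) : x = y}:

```python
from fractions import Fraction as F

n = 3
c = F(2, n * (n + 1))  # Example 1 pmf: fXY(x, y) = 2/(n(n+1)) on 1 <= y <= x <= n

# E(XY): double sum of x * y * fXY(x, y) over the support.
e_xy = sum(x * y * c for x in range(1, n + 1) for y in range(1, x + 1))
print(e_xy)  # 25/6

# P(X = Y): sum of fXY over the non-rectangular set A = {(x, y): x = y}.
p_diag = sum(c for x in range(1, n + 1))
print(p_diag)  # 1/2
```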
