
6.041/6.431 Probabilistic Systems Analysis
Quiz II Review, Fall 2010

1  Probability Density Functions (PDF)

For a continuous RV X with PDF f_X(x):

  P(a \le X \le b) = \int_a^b f_X(x)\,dx
  P(X \in A) = \int_A f_X(x)\,dx

Properties:
- Nonnegativity: f_X(x) \ge 0 for all x
- Normalization: \int_{-\infty}^{\infty} f_X(x)\,dx = 1

2  PDF Interpretation

Caution: f_X(x) \ne P(X = x). If X is continuous, P(X = x) = 0 for all x,
and f_X(x) can exceed 1.

Interpretation: probability per unit length, for small lengths around x:
  P(x \le X \le x + \delta) \approx f_X(x)\,\delta

3  Mean and Variance of a Continuous RV

  E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx
  Var(X) = E[(X - E[X])^2]
         = \int_{-\infty}^{\infty} (x - E[X])^2 f_X(x)\,dx
         = E[X^2] - (E[X])^2   (\ge 0)
  E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx
  E[aX + b] = a E[X] + b
  Var(aX + b) = a^2 Var(X)
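As a quick sanity check of these definitions, the following sketch integrates a simple, arbitrarily chosen PDF, f(x) = 2x on [0, 1], with a midpoint Riemann sum and confirms Var(X) = E[X^2] - (E[X])^2 (here E[X] = 2/3 and Var(X) = 1/18); the density and grid size are illustrative, not from the notes.

```python
# Numerical check of E[X] and Var(X) = E[X^2] - (E[X])^2 for the
# illustrative PDF f(x) = 2x on [0, 1] (a valid density: it is
# nonnegative and integrates to 1).
N = 100_000
dx = 1.0 / N
xs = [(i + 0.5) * dx for i in range(N)]  # midpoint rule

f = lambda x: 2 * x
mean = sum(x * f(x) * dx for x in xs)       # E[X]   = 2/3
second = sum(x**2 * f(x) * dx for x in xs)  # E[X^2] = 1/2
var = second - mean**2                      # 1/2 - 4/9 = 1/18
```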

4  Cumulative Distribution Functions

  F_X(x) = P(X \le x)

Definition: monotonically increasing from 0 (at -\infty) to 1 (at +\infty).

Continuous RV (CDF is continuous in x):
  F_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(t)\,dt
  f_X(x) = dF_X(x)/dx

Discrete RV (CDF is piecewise constant):
  F_X(x) = P(X \le x) = \sum_{k \le x} p_X(k)
  p_X(k) = F_X(k) - F_X(k - 1)

5  Uniform Random Variable

If X is a uniform random variable over the interval [a, b]:

  f_X(x) = 1/(b - a)  if a \le x \le b,   0 otherwise

  F_X(x) = 0                if x < a
           (x - a)/(b - a)  if a \le x \le b
           1                if x > b

  E[X] = (a + b)/2,   Var(X) = (b - a)^2 / 12
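A minimal numerical check of the uniform mean and variance formulas, using the arbitrary interval [2, 5] and a midpoint Riemann sum over the constant density:

```python
# Check E[X] = (a+b)/2 and Var(X) = (b-a)^2/12 for X uniform on [2, 5]
# by numerically integrating against the constant density 1/(b-a).
a, b = 2.0, 5.0
N = 100_000
dx = (b - a) / N
xs = [a + (i + 0.5) * dx for i in range(N)]  # midpoint rule
pdf = 1.0 / (b - a)

mean = sum(x * pdf * dx for x in xs)             # (a+b)/2   = 3.5
var = sum((x - mean)**2 * pdf * dx for x in xs)  # (b-a)^2/12 = 0.75
```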

6  Exponential Random Variable

X is an exponential random variable with parameter \lambda:
  f_X(x) = \lambda e^{-\lambda x}  if x \ge 0,   0 otherwise
  F_X(x) = 1 - e^{-\lambda x}      if x \ge 0,   0 otherwise
  E[X] = 1/\lambda,   Var(X) = 1/\lambda^2

Memoryless property: given that X > t, X - t is an exponential RV with
parameter \lambda.

7  Normal/Gaussian Random Variables

General normal RV N(\mu, \sigma^2):
  f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x - \mu)^2 / 2\sigma^2}
  E[X] = \mu,   Var(X) = \sigma^2

Property: if X \sim N(\mu, \sigma^2) and Y = aX + b, then
Y \sim N(a\mu + b, a^2\sigma^2).
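The memoryless property can be seen directly from the exponential survival function P(X > x) = e^{-\lambda x}: conditioning on X > s leaves the tail unchanged, P(X > s + t | X > s) = P(X > t). A small sketch with arbitrary values of \lambda, s, t:

```python
import math

# Memoryless property of an exponential RV with rate lam:
# P(X > s + t | X > s) should equal P(X > t).
# The specific numbers below are arbitrary, for illustration only.
lam, s, t = 0.7, 1.3, 2.1
survival = lambda x: math.exp(-lam * x)  # P(X > x) = e^{-lam * x}

conditional = survival(s + t) / survival(s)  # P(X > s+t | X > s)
unconditional = survival(t)                  # P(X > t)
```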

8  Normal CDF

Standard normal RV: N(0, 1).
The CDF of a standard normal RV Y at y is \Phi(y), given in tables for
y \ge 0. For y < 0, use the symmetry result: \Phi(-y) = 1 - \Phi(y).

To evaluate the CDF of a general normal RV, standardize it, i.e. express
the event in terms of a standard normal: if X \sim N(\mu, \sigma^2), then
(X - \mu)/\sigma \sim N(0, 1) and
  P(X \le x) = P((X - \mu)/\sigma \le (x - \mu)/\sigma)
             = \Phi((x - \mu)/\sigma)

9  Joint PDF

Joint PDF of two continuous RVs X and Y: f_{X,Y}(x, y), with
  P(A) = \iint_A f_{X,Y}(x, y)\,dx\,dy

Marginal PDF: f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy
Expectation:  E[g(X, Y)] = \iint g(x, y) f_{X,Y}(x, y)\,dx\,dy
Joint CDF:    F_{X,Y}(x, y) = P(X \le x, Y \le y)
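In code, the table lookup for \Phi can be replaced by the error function, \Phi(y) = (1 + erf(y/\sqrt{2}))/2. The sketch below checks the symmetry identity and standardizes an arbitrary example, X ~ N(10, 4):

```python
import math

# Standard normal CDF via the error function:
# Phi(y) = (1 + erf(y / sqrt(2))) / 2.
def phi(y):
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))

# Symmetry: Phi(-y) = 1 - Phi(y)
sym_gap = abs(phi(-1.5) - (1.0 - phi(1.5)))

# Standardizing X ~ N(mu=10, sigma^2=4) -- illustrative parameters:
# P(X <= 12) = Phi((12 - 10)/2) = Phi(1), about 0.8413.
mu, sigma = 10.0, 2.0
p = phi((12.0 - mu) / sigma)
```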

10  Independence

By definition, X, Y independent \iff f_{X,Y}(x, y) = f_X(x) f_Y(y) for
all (x, y).

If X and Y are independent:
- E[XY] = E[X] E[Y]
- g(X) and h(Y) are independent
- E[g(X) h(Y)] = E[g(X)] E[h(Y)]

11  Conditioning on an Event

Let X be a continuous RV and A be an event with P(A) > 0:

  f_{X|A}(x) = f_X(x) / P(X \in A)  if x \in A,   0 otherwise
  P(X \in B | X \in A) = \int_B f_{X|A}(x)\,dx
  E[X | A] = \int x f_{X|A}(x)\,dx
  E[g(X) | A] = \int g(x) f_{X|A}(x)\,dx

12  Conditioning on a RV

X, Y continuous RVs:
  f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y)

If A_1, ..., A_n are disjoint events that form a partition of the sample
space:
  f_X(x) = \sum_{i=1}^n P(A_i) f_{X|A_i}(x)    (total probability theorem)
  E[X] = \sum_{i=1}^n P(A_i) E[X | A_i]        (total expectation theorem)
  E[g(X)] = \sum_{i=1}^n P(A_i) E[g(X) | A_i]

Continuous version of the total probability theorem:
  f_X(x) = \int_{-\infty}^{\infty} f_Y(y) f_{X|Y}(x|y)\,dy

Conditional expectation:
  E[X | Y = y] = \int x f_{X|Y}(x|y)\,dx
  E[g(X) | Y = y] = \int g(x) f_{X|Y}(x|y)\,dx
  E[g(X, Y) | Y = y] = \int g(x, y) f_{X|Y}(x|y)\,dx
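The total probability and total expectation theorems can be checked on a small two-event partition. The model below is made up for illustration: P(A1) = P(A2) = 1/2, with X | A1 uniform on [0, 1] and X | A2 uniform on [1, 3], so E[X] = 0.5(0.5) + 0.5(2) = 1.25.

```python
# Total probability / total expectation check on an illustrative mixture:
# A1, A2 partition the sample space with P(A1) = P(A2) = 0.5,
# X | A1 ~ Uniform[0, 1]  and  X | A2 ~ Uniform[1, 3].
p1, p2 = 0.5, 0.5

def f_given_a1(x): return 1.0 if 0.0 <= x <= 1.0 else 0.0
def f_given_a2(x): return 0.5 if 1.0 <= x <= 3.0 else 0.0

def f_x(x):  # total probability theorem: f_X = sum_i P(A_i) f_{X|A_i}
    return p1 * f_given_a1(x) + p2 * f_given_a2(x)

# E[X] by integrating x f_X(x) over [0, 3] (midpoint rule)
N = 300_000
dx = 3.0 / N
mean = sum(x * f_x(x) * dx for x in ((i + 0.5) * dx for i in range(N)))

# total expectation theorem: E[X] = P(A1) E[X|A1] + P(A2) E[X|A2]
mean_via_partition = p1 * 0.5 + p2 * 2.0  # = 1.25
```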

13  Continuous Bayes Rule

Total expectation theorem, continuous version:
  E[X] = \int E[X | Y = y] f_Y(y)\,dy
  E[g(X)] = \int E[g(X) | Y = y] f_Y(y)\,dy
  E[g(X, Y)] = \int E[g(X, Y) | Y = y] f_Y(y)\,dy

X, Y continuous RVs, N a discrete RV, A an event:

  f_{X|Y}(x|y) = f_{Y|X}(y|x) f_X(x) / f_Y(y)
               = f_{Y|X}(y|x) f_X(x) / \int f_{Y|X}(y|t) f_X(t)\,dt

  P(A | Y = y) = P(A) f_{Y|A}(y) / f_Y(y)
               = P(A) f_{Y|A}(y) / [f_{Y|A}(y) P(A) + f_{Y|A^c}(y) P(A^c)]

  P(N = n | Y = y) = p_N(n) f_{Y|N}(y|n) / f_Y(y)
                   = p_N(n) f_{Y|N}(y|n) / \sum_i p_N(i) f_{Y|N}(y|i)
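A minimal sketch of the discrete-N form of the rule, on a hypothetical model chosen for illustration: N is 0 or 1 with prior 1/2 each, and Y | N = n is normal with mean n and variance 1. The posterior weights each prior by the observation's density and normalizes by f_Y(y).

```python
import math

# Bayes rule for a discrete N observed through a continuous Y
# (hypothetical model: N in {0, 1}, prior 1/2 each; Y | N=n ~ N(n, 1)).
def normal_pdf(y, mu, sigma=1.0):
    return (math.exp(-((y - mu) ** 2) / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)))

prior = {0: 0.5, 1: 0.5}
y = 0.8  # arbitrary observed value

evidence = sum(prior[n] * normal_pdf(y, n) for n in prior)  # f_Y(y)
posterior = {n: prior[n] * normal_pdf(y, n) / evidence for n in prior}
```

Since y = 0.8 lies closer to mean 1 than to mean 0, the posterior shifts mass toward N = 1.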

14  Derived Distributions

Goal: the PDF of a function of a RV X with known PDF: Y = g(X).
Method:
- Get the CDF:
    F_Y(y) = P(Y \le y) = P(g(X) \le y)
           = \int_{\{x \,|\, g(x) \le y\}} f_X(x)\,dx
- Differentiate: f_Y(y) = dF_Y(y)/dy

Special case: if Y = g(X) = aX + b, then
  f_Y(y) = (1/|a|) f_X((y - b)/a)

15  Convolution

W = X + Y, with X, Y independent.

Discrete case:    p_W(w) = \sum_x p_X(x) p_Y(w - x)
Continuous case:  f_W(w) = \int_{-\infty}^{\infty} f_X(x) f_Y(w - x)\,dx

Graphical method: put the PMFs (or PDFs) on top of each other; flip the
PMF (or PDF) of Y; shift the flipped PMF (or PDF) of Y by w;
cross-multiply and add (or evaluate the integral).

In particular, if X and Y are independent and normal, then W = X + Y is
normal.
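The discrete convolution formula can be sketched with a standard example (not from the notes): two independent fair six-sided dice, where the sum W should satisfy p_W(7) = 6/36 = 1/6. Exact fractions avoid floating-point noise.

```python
from fractions import Fraction

# Discrete convolution p_W(w) = sum_x p_X(x) p_Y(w - x) for W = X + Y,
# with X, Y independent fair six-sided dice (illustrative example).
die = {k: Fraction(1, 6) for k in range(1, 7)}

p_w = {}
for x, px in die.items():
    for y, py in die.items():
        p_w[x + y] = p_w.get(x + y, Fraction(0)) + px * py
```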

16  Law of Iterated Expectations

E[X | Y = y] = f(y) is a number.
E[X | Y] = f(Y) is a random variable (the expectation is taken with
respect to X). To compute E[X | Y], first express E[X | Y = y] as a
function of y.

Law of iterated expectations: E[X] = E[E[X | Y]]
(equality between two real numbers)
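The identity E[X] = E[E[X | Y]] can be verified by brute force on a small joint PMF; the probability table below is made up for illustration.

```python
# Law of iterated expectations, E[X] = E[ E[X|Y] ], checked on a small
# illustrative joint PMF over (x, y) pairs.
joint = {(1, 0): 0.1, (2, 0): 0.3, (1, 1): 0.4, (2, 1): 0.2}

e_x = sum(x * p for (x, y), p in joint.items())  # direct E[X]

p_y = {}  # marginal PMF of Y
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p

# E[X | Y = y] as a function of y, then averaged over the marginal of Y
e_x_given = {y: sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y[y]
             for y in p_y}
e_iterated = sum(e_x_given[y] * p_y[y] for y in p_y)
```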

17  Law of Total Variance

Var(X | Y) is a random variable that is a function of Y (the variance is
taken with respect to X). To compute Var(X | Y), first express
Var(X | Y = y) = E[(X - E[X | Y = y])^2 | Y = y] as a function of y.

Law of total variance: Var(X) = E[Var(X | Y)] + Var(E[X | Y])
(equality between two real numbers)

18  Sum of a Random Number of i.i.d. RVs

N a discrete RV, X_i i.i.d. and independent of N, and
Y = X_1 + ... + X_N. Then:
  E[Y] = E[X] E[N]
  Var(Y) = E[N] Var(X) + (E[X])^2 Var(N)
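The random-sum formulas can be verified exactly on a toy model small enough to enumerate (numbers are illustrative): N is 1 or 2 with probability 1/2 each, and the X_i are i.i.d. Bernoulli(1/2), independent of N.

```python
from itertools import product

# Random sum Y = X_1 + ... + X_N with N in {1, 2} (prob 1/2 each) and
# X_i i.i.d. Bernoulli(1/2), independent of N -- toy numbers chosen so
# the full distribution of Y can be enumerated exactly.
p_n = {1: 0.5, 2: 0.5}
p_x = {0: 0.5, 1: 0.5}

dist_y = {}
for n, pn in p_n.items():
    for xs in product(p_x, repeat=n):  # all length-n outcome sequences
        prob = pn
        for x in xs:
            prob *= p_x[x]
        dist_y[sum(xs)] = dist_y.get(sum(xs), 0.0) + prob

e_y = sum(y * p for y, p in dist_y.items())
var_y = sum((y - e_y) ** 2 * p for y, p in dist_y.items())

# Formula side: E[Y] = E[X] E[N],  Var(Y) = E[N] Var(X) + E[X]^2 Var(N)
e_x, var_x = 0.5, 0.25
e_n, var_n = 1.5, 0.25
```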

19  Covariance and Correlation

  Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X] E[Y]

Correlation coefficient (dimensionless):
  \rho = Cov(X, Y) / (\sigma_X \sigma_Y),   \rho \in [-1, 1]

By definition, X, Y are uncorrelated \iff Cov(X, Y) = 0.
If X, Y are independent, then X and Y are uncorrelated
(the converse is not true).

In general, Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
If X and Y are uncorrelated, Cov(X, Y) = 0 and
Var(X + Y) = Var(X) + Var(Y).

\rho = 0 \iff X and Y are uncorrelated.
|\rho| = 1 \iff X - E[X] = c[Y - E[Y]] (linearly related)
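These definitions, and the variance-of-a-sum identity, can be checked on a small joint PMF; the probability table is made up for illustration.

```python
import math

# Covariance, correlation, and Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y)
# on a small illustrative joint PMF over (x, y) pairs.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# expectation of any g(x, y) under the joint PMF
e = lambda g: sum(g(x, y) * p for (x, y), p in joint.items())

ex, ey = e(lambda x, y: x), e(lambda x, y: y)
var_x = e(lambda x, y: (x - ex) ** 2)
var_y = e(lambda x, y: (y - ey) ** 2)
cov = e(lambda x, y: (x - ex) * (y - ey))         # = E[XY] - E[X]E[Y]
rho = cov / math.sqrt(var_x * var_y)              # in [-1, 1]
var_sum = e(lambda x, y: (x + y - ex - ey) ** 2)  # Var(X + Y)
```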
