Lect04.ppt
Contents
- Basic concepts
- Discrete random variables
- Discrete distributions (nbr distributions)
- Continuous random variables
- Continuous distributions (time distributions)
- Other random variables
Combination of events

Union "A or B": A ∪ B = {ω ∈ Ω | ω ∈ A or ω ∈ B}
Intersection "A and B": A ∩ B = {ω ∈ Ω | ω ∈ A and ω ∈ B}
Complement "not A": Aᶜ = {ω ∈ Ω | ω ∉ A}
Events A and B are disjoint if A ∩ B = ∅
A set of events {B1, B2, ...} is a partition of event A if
(i) Bi ∩ Bj = ∅ for all i ≠ j
(ii) ∪i Bi = A
[Figure: event A partitioned into B1, B2, B3]
Probability
Probability of event A is denoted by P(A), P(A) ∈ [0, 1]
Probability measure P is thus a real-valued set function defined on the set of events F, P: F → [0, 1]
Properties:
(i) 0 ≤ P(A) ≤ 1
(ii) P(∅) = 0
(iii) P(Ω) = 1
(iv) P(Aᶜ) = 1 − P(A)
(v) P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
(vi) A ∩ B = ∅ ⇒ P(A ∪ B) = P(A) + P(B)
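Properties (iv)-(vi) can be checked concretely on a finite sample space. A minimal sketch (the fair die and the events A, B, C are illustrative choices, not from the slides), using exact rational arithmetic:

```python
from fractions import Fraction

# Sample space of a fair die; every outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(A) for an event A, a subset of the sample space."""
    return Fraction(len(event & omega), len(omega))

A = {1, 2, 3}   # "small number"
B = {2, 4, 6}   # "even number"
C = {5}         # disjoint from A

# (iv) complement rule
assert prob(omega - A) == 1 - prob(A)
# (v) inclusion-exclusion
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)
# (vi) additivity for disjoint events
assert A & C == set()
assert prob(A | C) == prob(A) + prob(C)
```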
Conditional probability
Assume that P(B) > 0
Definition: The conditional probability of event A given that event B occurred is defined as

P(A | B) = P(A ∩ B) / P(B)

It follows that

P(A ∩ B) = P(B) P(A | B) = P(A) P(B | A)

Let {B1, B2, ...} be a partition of the sample space Ω. Then

P(A) = Σi P(A ∩ Bi)

Assume further that P(Bi) > 0 for all i. Then (by slide 6)

(vii) P(A) = Σi P(Bi) P(A | Bi)

This is the theorem of total probability
[Figure: event A intersecting a partition B1, B2, B3, B4 of the sample space]
7
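As a quick numeric check of (vii) (the partition probabilities and conditional probabilities below are illustrative values, not from the slides):

```python
from fractions import Fraction

# A partition B1, B2, B3 of the sample space with P(Bi),
# and the conditional probabilities P(A | Bi).
P_B = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
P_A_given_B = [Fraction(1, 10), Fraction(2, 10), Fraction(3, 10)]

# Partition check: the P(Bi) sum to one.
assert sum(P_B) == 1

# (vii) theorem of total probability: P(A) = sum_i P(Bi) P(A | Bi)
P_A = sum(pb * pa for pb, pa in zip(P_B, P_A_given_B))
```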
Bayes theorem
Let {Bi} be a partition of the sample space Ω
Assume that P(A) > 0 and P(Bi) > 0 for all i. Then (by slide 6)

P(Bi | A) = P(A ∩ Bi) / P(A) = P(Bi) P(A | Bi) / P(A)

Furthermore, by the theorem of total probability,

P(Bi | A) = P(Bi) P(A | Bi) / Σj P(Bj) P(A | Bj)

This is Bayes theorem
Probabilities P(Bi | A) are called a posteriori probabilities of events Bi (given that the event A occurred)
8
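A small worked instance of Bayes theorem (the two-event partition and its numbers are a hypothetical diagnostic-style example, not from the slides):

```python
from fractions import Fraction

# Partition: B1 = "condition present", B2 = "condition absent".
P_B = {"B1": Fraction(1, 100), "B2": Fraction(99, 100)}
# P(A | Bi), where A = "alarm raised".
P_A_given_B = {"B1": Fraction(99, 100), "B2": Fraction(5, 100)}

def posterior(i):
    """Bayes theorem: P(Bi | A) = P(Bi) P(A|Bi) / sum_j P(Bj) P(A|Bj)."""
    total = sum(P_B[j] * P_A_given_B[j] for j in P_B)  # total probability
    return P_B[i] * P_A_given_B[i] / total

# The a posteriori probabilities sum to one over the partition.
assert sum(posterior(i) for i in P_B) == 1
```

Note how small P(B1) keeps the posterior modest even though P(A | B1) is large.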
Independence of events

Definition: Events A and B are independent if

P(A ∩ B) = P(A) P(B)

It follows that

P(A | B) = P(A ∩ B) / P(B) = P(A) P(B) / P(B) = P(A)

Correspondingly:

P(B | A) = P(A ∩ B) / P(A) = P(A) P(B) / P(A) = P(B)
9
Random variables
Definition: A real-valued random variable X is a real-valued and measurable function defined on the sample space Ω, X: Ω → ℝ
Each sample point ω ∈ Ω is associated with a real number X(ω)
Measurability means that all sets of the type

{X ≤ x} := {ω ∈ Ω | X(ω) ≤ x}

belong to the set of events F, that is

{X ≤ x} ∈ F

The probability of such an event is denoted by P{X ≤ x}
10
Example
A coin is tossed three times
Sample space: Ω = {(ω1, ω2, ω3) | ωi ∈ {H, T}}, with 8 sample points
Each sample point ω is associated with a real number X(ω), e.g. X(ω) = the number of heads in ω
11
Indicators of events
Let A be an arbitrary event
1A(ω) = 1, if ω ∈ A; 0, if ω ∉ A

Clearly:

P{1A = 1} = P(A)
P{1A = 0} = P(Aᶜ) = 1 − P(A)
12
Cumulative distribution function (cdf)

Definition: The cumulative distribution function (cdf) of X is

FX(x) = P{X ≤ x}

The cdf determines the distribution of the random variable, that is: the probabilities P{X ∈ B}, where B ⊂ ℝ and {X ∈ B} ∈ F
Properties:
(i) FX is non-decreasing
(ii) FX is continuous from the right
(iii) FX(−∞) = 0
(iv) FX(∞) = 1
[Figure: a non-decreasing cdf FX(x) rising from 0 to 1]
13
Independence of random variables

Definition: Random variables X and Y are independent if for all x and y

P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y}

Definition: Random variables X1, ..., Xn are totally independent if for all i and xi

P{X1 ≤ x1, ..., Xn ≤ xn} = P{X1 ≤ x1} ⋯ P{Xn ≤ xn}
14
Maximum and minimum

Denote: Xmax := max{X1, ..., Xn}. If X1, ..., Xn are totally independent, then

P{Xmax ≤ x} = P{X1 ≤ x, ..., Xn ≤ x} = P{X1 ≤ x} ⋯ P{Xn ≤ x}

Denote: Xmin := min{X1, ..., Xn}. Then, correspondingly,

P{Xmin > x} = P{X1 > x, ..., Xn > x} = P{X1 > x} ⋯ P{Xn > x}
15
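Both product formulas can be checked by simulation. A sketch assuming Xi ∼ U(0,1) (an illustrative choice, so that P{Xi ≤ x} = x):

```python
import random

random.seed(1)

n, trials, x = 3, 200_000, 0.5
count_max = count_min = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]  # Xi ~ U(0,1), independent
    count_max += max(xs) <= x                  # event {Xmax <= x}
    count_min += min(xs) > x                   # event {Xmin > x}

p_max = count_max / trials   # should estimate x**n
p_min = count_min / trials   # should estimate (1 - x)**n
```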
Discrete random variables

Definition: A random variable X is discrete if there is a countable set SX ⊂ ℝ (the value set) such that

P{X ∈ SX} = 1

Point probabilities

Let X be a discrete random variable
The distribution of X is determined by the point probabilities pi,

pi := P{X = xi}, xi ∈ SX

It follows that Σi pi = 1
Definition: The probability mass function (pmf) of X is a function pX: ℝ → [0, 1] defined as follows:

pX(x) := P{X = x} = pi, if x = xi ∈ SX; 0, if x ∉ SX

The cdf is in this case a step function:

FX(x) = P{X ≤ x} = Σ{i: xi ≤ x} pi
18
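The pmf-to-cdf relationship above can be sketched directly in code (the point probabilities below are illustrative values, not from the slides):

```python
# A discrete distribution given by its point probabilities pi = P{X = xi}.
points = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}

def pmf(x):
    """pX(x): pi on the value set, 0 elsewhere."""
    return points.get(x, 0.0)

def cdf(x):
    """Step function FX(x) = sum of pi over {i: xi <= x}."""
    return sum(p for xi, p in points.items() if xi <= x)

# The point probabilities sum to one.
assert abs(sum(points.values()) - 1.0) < 1e-12
```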
Example
[Figure: point probabilities pX(x) at x1, x2, x3, x4, and the corresponding step-function cdf FX(x)]
Definition: Discrete random variables X and Y are independent if for all xi and yj

P{X = xi, Y = yj} = P{X = xi} P{Y = yj}
20
Expectation
Definition: The expectation (mean value) of X is defined by
μX := E[X] := Σ{x ∈ SX} P{X = x} x = Σ{x ∈ SX} pX(x) x = Σi pi xi

Note 1: The expectation exists only if Σi pi |xi| < ∞
Properties:
(i) c ∈ ℝ ⇒ E[cX] = c E[X]
(ii) E[X + Y] = E[X] + E[Y]
(iii) X and Y independent ⇒ E[XY] = E[X] E[Y]
21
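A minimal sketch of the definition and of property (i), using an illustrative three-point distribution (not from the slides):

```python
from fractions import Fraction

# E[X] = sum_i pi * xi for a discrete distribution.
pmf_X = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def expectation(pmf):
    return sum(p * x for x, p in pmf.items())

E_X = expectation(pmf_X)

# (i) E[cX] = c E[X]: scaling every value by c scales the mean by c.
pmf_3X = {3 * x: p for x, p in pmf_X.items()}
assert expectation(pmf_3X) == 3 * E_X
```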
Variance
Definition: The variance of X is defined by
σX² := D²[X] := Var[X] := E[(X − E[X])²]

Useful formula (prove!):

D²[X] = E[X²] − E[X]²

Properties:
(i) c ∈ ℝ ⇒ D²[cX] = c² D²[X]
(ii) X and Y independent ⇒ D²[X + Y] = D²[X] + D²[Y]
22
Covariance
Definition: The covariance between X and Y is defined by
Cov[X, Y] := E[XY] − E[X] E[Y]

Properties:
(i) Cov[X, X] = Var[X]
(ii) Cov[X, Y] = Cov[Y, X]
(iii) Cov[X + Y, Z] = Cov[X, Z] + Cov[Y, Z]
(iv) X and Y independent ⇒ Cov[X, Y] = 0
23
Other distribution parameters

Standard deviation: σX := D[X] := √(D²[X]) = √(Var[X])
Coefficient of variation: cX := C[X] := D[X] / E[X]
k-th moment: μX(k) := E[X^k]
24
Sample mean

Let X1, ..., Xn be independent and identically distributed (i.i.d.) random variables with mean μ and variance σ². The sample mean is defined as

X̄n := (1/n) Σ{i=1..n} Xi

Then (prove!)

E[X̄n] = μ
D²[X̄n] = σ²/n
D[X̄n] = σ/√n
25
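A simulation sketch of E[X̄n] = μ and D²[X̄n] = σ²/n. The choice Xi ∼ U(0,1), with μ = 1/2 and σ² = 1/12, is illustrative:

```python
import random
import statistics

random.seed(2)

n, reps = 25, 20_000
# Draw `reps` independent sample means, each over n i.i.d. U(0,1) variables.
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(reps)]

est_mean = statistics.fmean(means)     # should be close to mu = 1/2
est_var = statistics.pvariance(means)  # should be close to sigma^2/n = (1/12)/25
```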
Law of large numbers

Weak law of large numbers: for all ε > 0,

P{|X̄n − μ| > ε} → 0, as n → ∞

Strong law of large numbers: with probability 1,

X̄n → μ, as n → ∞
26
Bernoulli distribution
X ∼ Bernoulli(p), p ∈ (0, 1): the outcome of a single experiment with success probability p
Value set: SX = {0, 1}
Point probabilities:

P{X = 0} = 1 − p, P{X = 1} = p

Mean value: E[X] = (1 − p)·0 + p·1 = p
Variance: D²[X] = E[X²] − E[X]² = p − p² = p(1 − p)

Binomial distribution
X ∼ Bin(n, p): X = X1 + ... + Xn, the number of successes in n independent experiments, Xi ∼ Bernoulli(p)
Value set: SX = {0, 1, ..., n}
Point probabilities:

P{X = i} = (n choose i) p^i (1 − p)^(n−i)

Mean value: E[X] = E[X1] + ... + E[Xn] = np
Variance: D²[X] = D²[X1] + ... + D²[Xn] = np(1 − p) (independence!)
29
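The binomial point probabilities and the moment formulas can be checked numerically (the parameter values n = 10, p = 0.3 are an illustrative choice):

```python
from math import comb

# P{X = i} = C(n, i) p^i (1-p)^(n-i)
def binom_pmf(n, p, i):
    return comb(n, i) * p**i * (1 - p) ** (n - i)

n, p = 10, 0.3
probs = [binom_pmf(n, p, i) for i in range(n + 1)]

# Mean and variance computed directly from the point probabilities.
mean = sum(i * q for i, q in enumerate(probs))
var = sum(i * i * q for i, q in enumerate(probs)) - mean**2
```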
Geometric distribution
X ∼ Geom(p), p ∈ (0, 1)
p = probability of success in any single experiment
Value set: SX = {0, 1, ...}
Point probabilities:

P{X = i} = p^i (1 − p)

Mean value: E[X] = Σi i p^i (1 − p) = p/(1 − p)
Second moment: E[X²] = Σi i² p^i (1 − p) = 2(p/(1 − p))² + p/(1 − p)
Variance: D²[X] = E[X²] − E[X]² = p/(1 − p)²
30
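Truncating the series above gives a quick numeric check of the mean and variance formulas (p = 0.4 is an illustrative value; the truncation point is chosen so the tail is negligible):

```python
# P{X = i} = p^i (1 - p), i = 0, 1, ...
p = 0.4
terms = 200  # truncation point; p**200 is astronomically small

mean = sum(i * p**i * (1 - p) for i in range(terms))
second = sum(i * i * p**i * (1 - p) for i in range(terms))
var = second - mean**2
```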
Memoryless property:

P{X ≥ i + j | X ≥ i} = P{X ≥ j}

Prove!
Tip: Prove first that P{X ≥ i} = p^i
31
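The memoryless property can be verified numerically from the point probabilities (p = 0.4 is an illustrative value; the tail P{X ≥ i} is computed by truncated summation rather than the closed form, so the check is not circular):

```python
p = 0.4

def tail(i):
    """P{X >= i} = sum_{k >= i} p^k (1-p), truncated; equals p^i."""
    return sum(p**k * (1 - p) for k in range(i, 400))

# P{X >= i+j | X >= i} = P{X >= j} for all small i, j.
for i in range(5):
    for j in range(5):
        cond = tail(i + j) / tail(i)
        assert abs(cond - tail(j)) < 1e-9
```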
Let X1 ∼ Geom(p1) and X2 ∼ Geom(p2) be independent, and denote Xmin := min{X1, X2}. Then

P{Xmin = Xi} = (1 − pi) / (1 − p1 p2), i ∈ {1, 2}

Prove!
32
Poisson distribution
X ∼ Poisson(a), a > 0
Value set: SX = {0, 1, ...}
Point probabilities:

P{X = i} = (a^i / i!) e^(−a)
33
Example
Assume that
- 200 subscribers are connected to a local exchange
- each subscriber's characteristic traffic is 0.01 erlang
- subscribers behave independently

Then the number of active calls X ∼ Bin(200, 0.01)
Corresponding Poisson approximation: X ≈ Poisson(2.0)
Point probabilities:

i              0      1      2      3      4      5
Bin(200,0.01)  .1326  .2679  .2693  .1795  .0893  .0354
Poisson(2.0)   .1353  .2701  .2701  .1804  .0902  .0361
34
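The two rows can be recomputed directly; a sketch (rounding in the last digit may differ slightly from the printed table, but the closeness of the two distributions is what matters):

```python
from math import comb, exp, factorial

n, p, a = 200, 0.01, 2.0

def binom_pmf(i):
    """Bin(200, 0.01) point probability."""
    return comb(n, i) * p**i * (1 - p) ** (n - i)

def poisson_pmf(i):
    """Poisson(2.0) point probability."""
    return a**i / factorial(i) * exp(-a)

binom_row = [round(binom_pmf(i), 4) for i in range(6)]
poisson_row = [round(poisson_pmf(i), 4) for i in range(6)]
```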
Properties

(i) Sum: Let X1 ∼ Poisson(a1) and X2 ∼ Poisson(a2) be independent. Then

X1 + X2 ∼ Poisson(a1 + a2)

(ii) Random sample: Let X ∼ Poisson(a) denote the number of elements in a set, and Y denote the size of a random sample of this set (each element taken independently with probability p). Then

Y ∼ Poisson(pa)

(iii) Random sorting: Let X and Y be as in (ii), and Z = X − Y. Then Y and Z are independent (given that X is unknown) and

Z ∼ Poisson((1 − p)a)
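Property (ii) can be illustrated by simulation: thin a Poisson count by keeping each element with probability p, and check that the kept count has Poisson-like mean and variance (equal to pa). The parameter values and the Knuth-style sampler are illustrative choices:

```python
import math
import random

random.seed(3)

def poisson_sample(a):
    """Knuth's method: multiply uniforms until the product drops below e^-a."""
    L = math.exp(-a)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

a, p, reps = 4.0, 0.25, 100_000
ys = []
for _ in range(reps):
    x = poisson_sample(a)                                # X ~ Poisson(a)
    ys.append(sum(random.random() < p for _ in range(x)))  # thinned count Y

mean_y = sum(ys) / reps                       # should be close to p*a = 1.0
var_y = sum(y * y for y in ys) / reps - mean_y**2  # Poisson: variance = mean
```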
Continuous random variables

Definition: A random variable X is continuous if there is an integrable function fX ≥ 0 such that for all x

FX(x) := P{X ≤ x} = ∫_{−∞}^{x} fX(y) dy

The function fX is called the probability density function (pdf)
The set SX, where fX > 0, is called the value set
Properties:
(i) P{X = x} = 0 for all x
(ii) P{a < X < b} = P{a ≤ X ≤ b} = ∫_{a}^{b} fX(x) dx
(iii) P{X ∈ A} = ∫_{A} fX(x) dx
(iv) P{X ∈ ℝ} = ∫_{−∞}^{∞} fX(x) dx = ∫_{SX} fX(x) dx = 1
37
Example
[Figure: a pdf fX(x) positive on the interval [x1, x3], and the corresponding continuous cdf FX(x)]
SX = [x1, x3]
38
Expectation

Definition: The expectation (mean value) of X is defined by

μX := E[X] := ∫_{−∞}^{∞} fX(x) x dx

Note 1: The expectation exists only if ∫_{−∞}^{∞} fX(x) |x| dx < ∞
Note 2: If ∫_{−∞}^{∞} fX(x) x dx = ∞, then we may denote E[X] = ∞
The expectation has the same properties as in the discrete case (see slide 21)
The other distribution parameters (variance, covariance, ...) are defined just as in the discrete case
These parameters have the same properties as in the discrete case (see slides 22-24)
39
Uniform distribution
X ∼ U(a, b), a < b
Value set: SX = (a, b)
Probability density function (pdf):

fX(x) = 1/(b − a), x ∈ (a, b)

Cumulative distribution function (cdf):

FX(x) := P{X ≤ x} = (x − a)/(b − a), x ∈ (a, b)

Mean value: E[X] = ∫_{a}^{b} x/(b − a) dx = (a + b)/2
Second moment: E[X²] = ∫_{a}^{b} x²/(b − a) dx = (a² + ab + b²)/3
Variance: D²[X] = E[X²] − E[X]² = (b − a)²/12

41
Exponential distribution
X ∼ Exp(λ), λ > 0
Value set: SX = (0, ∞)
Probability density function (pdf):

fX(x) = λ e^(−λx), x > 0

Cumulative distribution function (cdf):

FX(x) = P{X ≤ x} = 1 − e^(−λx), x > 0

Mean value: E[X] = ∫_{0}^{∞} λx e^(−λx) dx = 1/λ
Second moment: E[X²] = ∫_{0}^{∞} λx² e^(−λx) dx = 2/λ²
Variance: D²[X] = E[X²] − E[X]² = 1/λ²
42
Memoryless property: P{X > x + y | X > x} = P{X > y} for all x, y ≥ 0

Application:
Assume that the call holding time is exponentially distributed with mean h (min). Consider a call that has already lasted x minutes. Due to the memoryless property, this gives no information about the length of the remaining holding time: it is distributed as the original holding time and, on average, still lasts h minutes!
43
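The application above can be illustrated by simulation: among holding times that exceed x, the remaining time still has mean h (the values h = 3 and x = 2 are illustrative):

```python
import random

random.seed(4)

h, x, reps = 3.0, 2.0, 200_000
residuals = []
for _ in range(reps):
    t = random.expovariate(1 / h)   # holding time, Exp with mean h
    if t > x:
        residuals.append(t - x)     # remaining holding time of a survivor

# By memorylessness, the residuals are again Exp with mean h.
mean_residual = sum(residuals) / len(residuals)
```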
Let X1 ∼ Exp(λ1) and X2 ∼ Exp(λ2) be independent, and denote Xmin := min{X1, X2}. Then

P{Xmin = Xi} = λi / (λ1 + λ2), i ∈ {1, 2}
Prove!
Tip: See slide 15
44
Standard normal distribution
X ∼ N(0, 1)
Probability density function (pdf):

fX(x) = φ(x) := (1/√(2π)) e^(−x²/2)

Cumulative distribution function (cdf):

FX(x) := P{X ≤ x} = Φ(x) := ∫_{−∞}^{x} φ(y) dy
45
Normal distribution
X ∼ N(μ, σ²), μ ∈ ℝ, σ > 0
Definition: X ∼ N(μ, σ²) if (X − μ)/σ ∼ N(0, 1)
Cumulative distribution function (cdf):

FX(x) := P{X ≤ x} = P{(X − μ)/σ ≤ (x − μ)/σ} = Φ((x − μ)/σ)
46
Mean value: E[X] = μ + σ E[(X − μ)/σ] = μ (symmetric pdf around μ)
Variance: D²[X] = σ² D²[(X − μ)/σ] = σ²
Linear transformation: Let X ∼ N(μ, σ²) and α, β ∈ ℝ. Then

Y := αX + β ∼ N(αμ + β, α²σ²)

Sample mean: Let Xi ∼ N(μ, σ²), i = 1, ..., n, be independent. Then

X̄n := (1/n) Σ{i=1..n} Xi ∼ N(μ, σ²/n)
47
Central limit theorem: Let X1, ..., Xn be i.i.d. with mean μ and variance σ². Then

(X̄n − μ) / (σ/√n) → N(0, 1), as n → ∞

It follows that, approximately in distribution,

X̄n ≈ N(μ, σ²/n)
48
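A simulation sketch of the CLT: standardized sample means of i.i.d. U(0,1) variables (an illustrative choice, with μ = 1/2 and σ² = 1/12) behave approximately like N(0,1) already for moderate n:

```python
import random
import statistics

random.seed(5)

n, reps = 30, 50_000
mu, sigma = 0.5, (1 / 12) ** 0.5

zs = []
for _ in range(reps):
    xbar = statistics.fmean(random.random() for _ in range(n))
    zs.append((xbar - mu) / (sigma / n**0.5))  # (Xn - mu) / (sigma / sqrt(n))

mean_z = statistics.fmean(zs)      # close to 0
var_z = statistics.pvariance(zs)   # close to 1
# Empirical P{Z < 1.96}, close to Phi(1.96) ~ 0.975 for a standard normal.
frac_below_196 = sum(z < 1.96 for z in zs) / reps
```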
Example:
The customer waiting time W in an M/M/1 queue has an atom at zero (P{W = 0} = 1 − ρ > 0, where ρ < 1 is the load) but otherwise the distribution is continuous
[Figure: the cdf FW(x) jumps to 1 − ρ at x = 0 and then increases continuously towards 1]
50