1. 0 ≤ F(x) ≤ 1
CHAPTER 4. DISCRETE RANDOM VARIABLES 2
Example 4.1
Suppose an insurance company pays the amount of 1,000 for lost luggage on
an airplane trip. From past experience, it is known that the company pays this
amount in 1 out of 200 policies it sells. What premium should the company charge?
Solution
x       0       1,000   Total
f(x)    0.995   0.005   1

Then the expected loss to the company is: E[X] = 0 · 0.995 + 1,000 · 0.005 = 5. Thus, the company must charge 5 to break even.
Properties of mean:
1. E[cX] = cE[X]
Proof
E[cX] = Σ_{i=1}^n c x_i f(x_i) = c Σ_{i=1}^n x_i f(x_i) = cE[X]
2. E[cX + d] = cE[X] + d
Proof
E[cX + d] = Σ_{i=1}^n (c x_i + d) f(x_i) = Σ_{i=1}^n c x_i f(x_i) + Σ_{i=1}^n d f(x_i)
= c Σ_{i=1}^n x_i f(x_i) + d Σ_{i=1}^n f(x_i) = cE[X] + d
The variance accounts for the difference in spread of the distributions of random variables. More generally, for a r.v. X taking on finitely many values x_1, ..., x_n with respective probabilities f(x_1), ..., f(x_n), the variance is:
Var[X] = Σ_{i=1}^n (x_i − E[X])² f(x_i)
and represents the sum of the weighted squared distances of the points x_i, i = 1, ..., n, from the center of location of the distribution, E[X]. Thus, the further from E[X] the x_i are located, the larger the variance, and vice versa. Because of this characteristic property, the variance is referred to as a measure of dispersion of the underlying distribution.
The positive square root of the V ar[X] is called the standard deviation (s.d.)
of X. Unlike the variance, the s.d. is measured in the same units as X.
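As a minimal illustration of these definitions in code (reusing the numbers from Example 4.1; the helper names are my own):

```python
from math import sqrt

def mean(pmf):
    # E[X] = sum of x * f(x) over the support
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    # Var[X] = sum of (x - E[X])^2 * f(x) over the support
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

# Example 4.1: the insurance payout r.v.
pmf = {0: 0.995, 1000: 0.005}
m = mean(pmf)        # the break-even premium, 5
v = variance(pmf)    # ~4975
sd = sqrt(v)         # ~70.5, in the same units as X
```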
Properties of variance:
1. Var[X] = E[X²] − (E[X])²
Proof
Var[X] = E[(X − E[X])²] = Σ_{i=1}^n (x_i − E[X])² f(x_i)
= Σ_{i=1}^n (x_i² − 2 x_i E[X] + E[X]²) f(x_i)
= Σ_{i=1}^n x_i² f(x_i) − 2E[X] Σ_{i=1}^n x_i f(x_i) + E[X]² Σ_{i=1}^n f(x_i)
= E[X²] − 2E[X]·E[X] + E[X]² = E[X²] − (E[X])²
2. Var[cX] = c² Var[X]
Proof
Var[cX] = E[(cX − cE[X])²] = c² E[(X − E[X])²] = c² Var[X]
3. Var[cX + d] = c² Var[X]
Proof
Since E[cX + d] = cE[X] + d, Var[cX + d] = E[(cX + d − cE[X] − d)²] = c² E[(X − E[X])²] = c² Var[X]; the constant d shifts the distribution but does not affect its spread.
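These two properties are easy to sanity-check numerically; a sketch with an arbitrary small pmf (the values and constants below are made up purely for illustration):

```python
def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

def affine(pmf, c, d):
    # pmf of the r.v. cX + d: values are transformed, probabilities are not
    return {c * x + d: p for x, p in pmf.items()}

pmf = {1: 0.2, 2: 0.5, 4: 0.3}
c, d = 3.0, 7.0
lhs = variance(affine(pmf, c, d))   # Var[cX + d]
rhs = c ** 2 * variance(pmf)        # c^2 Var[X]: the shift d drops out
```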
Example 4.2
Solution
1.
Example 4.3
The number of light switch turn-ons at which the first failure occurs is a r.v. X whose p.d.f. is given by: f(x) = c (9/10)^{x−1}, x = 1, 2, ... (and 0 otherwise).
(a) Determine the constant c.
(b) Calculate the probability that the first failure will not occur until the 10th turn-on.
(c) Determine the corresponding c.d.f. F.
Solution
(a) The constant c is determined through the relationship Σ_x f(x) = 1, or Σ_{x=1}^∞ c (9/10)^{x−1} = 1. However, Σ_{x=1}^∞ c (9/10)^{x−1} = c Σ_{x=1}^∞ (9/10)^{x−1} = c · 1/(1 − 9/10) = 10c, so that c = 1/10.
(b) P(X > 10) = P(X ≥ 11) = c Σ_{x=11}^∞ (9/10)^{x−1} = c [(9/10)^{10} + (9/10)^{11} + ...]
= c · (9/10)^{10}/(1 − 9/10) = c · 10 · (9/10)^{10} = (0.9)^{10} ≈ 0.349.
(c) First, for x < 1, F(x) = 0. Next, for x ≥ 1, F(x) = Σ_{t=1}^{x} c (9/10)^{t−1} = 1 − Σ_{t=x+1}^{∞} c (9/10)^{t−1} = 1 − c · (9/10)^x/(1 − 9/10) = 1 − (9/10)^x.
Thus, F(x) = 0 for x < 1, and F(x) = 1 − (9/10)^x for x ≥ 1.
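A quick numerical check of this solution (truncating the infinite sums at a point where the geometric terms are negligible):

```python
c = 1 / 10
f = lambda x: c * (9 / 10) ** (x - 1)   # the p.d.f. of Example 4.3

# (a) the probabilities sum to 1
total = sum(f(x) for x in range(1, 500))
# (b) P(X > 10) = P(X >= 11) = (0.9)^10
tail = sum(f(x) for x in range(11, 500))
# (c) F(x) = 1 - (9/10)^x for integer x >= 1
F = lambda x: 0.0 if x < 1 else 1 - (9 / 10) ** x
```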
Exercises
Exercise 4.1
A roulette wheel consists of 18 black slots, 18 red slots, and 2 green slots.
If a gambler bets 10 on red, what is the gambler's expected gain or loss?
Solution
The wheel has 38 slots, 18 of them red, so a 10 bet on red wins 10 with probability 18/38 and loses 10 with probability 20/38. The expected gain is therefore 10 · (18/38) − 10 · (20/38) = −10/19 ≈ −0.53, i.e. an expected loss of about 0.53.
Exercise 4.2
Solution
(a) We need two relations, which are provided by: ∫_0^1 (cx + d)dx = 1 and ∫_{1/2}^1 (cx + d)dx = 1/3, or: c + 2d = 2 and 9c + 12d = 8, and hence c = −4/3, d = 5/3.
(b) For 0 ≤ x ≤ 1, F(x) = ∫_0^x (−(4/3)t + 5/3)dt = −2x²/3 + 5x/3. Thus,
F(x) = 0 for x < 0; F(x) = −2x²/3 + 5x/3 for 0 ≤ x ≤ 1; F(x) = 1 for x > 1.   (4.1)
P(X = 0) = 1 − p
P(X = 1) = p
where p represents the probability of a successful outcome. A random vari-
able that follows the probability density function given above for 0 < p < 1
is called a Bernoulli random variable.
Now suppose we repeat this experiment for n trials, where each trial is in-
dependent (the outcome from one trial does not influence the outcome of
another) and results in a success with probability p. If X denotes the num-
ber of successes in these n trials, then X follows the binomial distribution
which we will denote X ∼ B(n, p).
To calculate a binomial probability, we use the following formula:
P(X = x) = C(n, x) p^x (1 − p)^{n−x},  x = 0, 1, 2, . . . , n,
where
C(n, x) = n!/(x!(n − x)!).
For selected n and p, the c.d.f. F(x) = Σ_{j=0}^{x} C(n, j) p^j (1 − p)^{n−j} is given in Table 1 of the NCST.
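A direct transcription of this formula (a sketch using only the standard library; `math.comb` computes C(n, x)):

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def binom_cdf(x, n, p):
    # F(x) = sum of the pmf for j = 0, ..., x
    return sum(binom_pmf(j, n, p) for j in range(x + 1))
```

For instance, binom_cdf(3, 20, 0.2) reproduces the tabulated value 0.4114 used in Example 4.4.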
Example 4.4
Solution
X ∼ B(20, 0.2)
From Table 1, P(X ≤ 3) = 0.4114.
Answer: 0.4114
E[X] = np,
and
Var[X] = np(1 − p).
Proof
For expectation:
E[X] = Σ_i x_i f(x_i) = Σ_{i=0}^n i f(i) = Σ_{i=0}^n i C(n, i) p^i (1 − p)^{n−i}
= Σ_{i=1}^n i · n!/(i!(n − i)!) · p^i (1 − p)^{n−i} = np Σ_{i=1}^n (n − 1)!/((i − 1)!(n − i)!) · p^{i−1} (1 − p)^{n−i}
= np Σ_{s=0}^m m!/(s!(m − s)!) · p^s (1 − p)^{m−s} = np Σ_{s=0}^m C(m, s) p^s (1 − p)^{m−s}   (with m = n − 1, s = i − 1)
= np · 1 = np
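The identities E[X] = np and Var[X] = np(1 − p) can also be checked numerically for any particular n and p; e.g. (the values n = 12, p = 0.3 are arbitrary):

```python
from math import comb

n, p = 12, 0.3

def pmf(x):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

ex = sum(x * pmf(x) for x in range(n + 1))               # E[X] = n*p
var = sum((x - ex) ** 2 * pmf(x) for x in range(n + 1))  # Var[X] = n*p*(1-p)
```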
For variance: similarly, E[X(X − 1)] = n(n − 1)p², so that
Var[X] = E[X²] − (E[X])² = E[X(X − 1)] + E[X] − (E[X])² = n(n − 1)p² + np − (np)² = np(1 − p).
Exercises
Exercise 4.3
Solution
Let X be the number of those favoring the proposal; then X ∼ B(15, 0.4375).
(a) P(X ≥ 5) = 1 − P(X ≤ 4) = 1 − [P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)]
= 1 − [C(15, 0) · 0.4375^0 · 0.5625^{15} + C(15, 1) · 0.4375 · 0.5625^{14} + C(15, 2) · 0.4375^2 · 0.5625^{13} + C(15, 3) · 0.4375^3 · 0.5625^{12} + C(15, 4) · 0.4375^4 · 0.5625^{11}] = 0.859;
(b) P(X ≥ 8) = 1 − P(X ≤ 7) = 1 − [P(X ≤ 4) + P(X = 5) + P(X = 6) + P(X = 7)]
= 1 − [(1 − 0.859) + C(15, 5) · 0.4375^5 · 0.5625^{10} + C(15, 6) · 0.4375^6 · 0.5625^9 + C(15, 7) · 0.4375^7 · 0.5625^8] = 0.3106.
Exercise 4.4
Suppose you are throwing darts at a target and that you hit the bull's eye with probability p. It is assumed that the trials are independent and that p remains constant throughout.
(a) If you throw darts 100 times, what is the probability that you hit the bull's eye at least 40 times?
(b) What does this expression become for p = 0.25?
(c) What is the expected number of hits, and what is the s.d. around this
expected number?
Solution
Let X be the number of times the bull's eye is hit; then X ∼ B(100, p).
(a) P(X ≥ 40) = Σ_{x=40}^{100} C(100, x) p^x (1 − p)^{100−x}.
(b) For p = 0.25 this becomes P(X ≥ 40) = Σ_{x=40}^{100} C(100, x) (0.25)^x (0.75)^{100−x}.
(c) E[X] = 100p and s.d.(X) = √(100 p(1 − p)); for p = 0.25 these are 25 and √18.75 ≈ 4.33.
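The tail sum in (b) is tedious by hand but trivial in code; a sketch (the truncation bound and names are mine):

```python
from math import comb, sqrt

n, p = 100, 0.25

def pmf(x):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# (b) P(X >= 40) when p = 0.25: 40 is far in the upper tail, so this is small
p_at_least_40 = sum(pmf(x) for x in range(40, n + 1))
# (c) expected number of hits and the s.d. around it
ex = n * p                    # 25.0
sd = sqrt(n * p * (1 - p))    # ~4.33
```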
A r.v. X is said to follow the Poisson distribution with parameter λ > 0 if its p.d.f. is:
P(X = x) = (λ^x / x!) e^{−λ},  x = 0, 1, . . .
The expected value and variance of a Poisson random variable are both λ, thus,
E[X] = λ,
and
Var[X] = λ.
Example 4.5
Solution
P(X ≤ 3) = Σ_{k=0}^{3} P(X = k) = e^{−4} + 4e^{−4} + (4²/2!) e^{−4} + (4³/3!) e^{−4} = 0.4335
Answer: 0.4335
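The same computation in code, summing the Poisson pmf with λ = 4 (a minimal sketch):

```python
from math import exp, factorial

def pois_pmf(k, lam):
    # P(X = k) = (lam^k / k!) e^(-lam)
    return lam ** k / factorial(k) * exp(-lam)

p_le_3 = sum(pois_pmf(k, 4.0) for k in range(4))   # P(X <= 3), ~0.4335
```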
Table 2 of the NCST gives the probability that a Poisson random variable with mean λ will be less than or equal to r. These tables can be used in the
same way as the binomial tables.
Exercises
Exercise 4.5
Solution
Answer: 0.342.
Exercise 4.6
Solution
Answer: 0.82.
The probability generating function (pgf) of a discrete r.v. X is defined as
G_X(s) = E[s^X] = Σ_x s^x P(X = x)
for any real s for which the sum converges. If we expand the sum we see that the coefficient of s^x is the probability that X = x. The quantity s is a dummy variable; we could have used any letter we chose. Note that G_X(1) = Σ_x P(X = x) = 1.
Now we can see why G_X is useful. If we differentiate it with respect to s,
G'_X(s) = Σ_x x s^{x−1} P(X = x),
and put s = 1:
G'_X(1) = Σ_x x P(X = x) = E[X].
Differentiating again,
G''_X(s) = Σ_x x(x − 1) s^{x−2} P(X = x),
and put s = 1:
G''_X(1) = Σ_x x(x − 1) P(X = x) = E[X(X − 1)].
Now
Var[X] = E[X²] − (E[X])² = E[X(X − 1)] + E[X] − (E[X])² = G''_X(1) + G'_X(1) − (G'_X(1))².
In general,
E[X(X − 1)(X − 2)...(X − k + 1)] = d^k G_X(1)/ds^k,
where d^k G_X(s)/ds^k is the kth derivative of G_X(s) with respect to s, evaluated at s = 1.
Note that the probability generating function is only going to be useful if we can simplify Σ_x s^x P(X = x).
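When X takes finitely many values, G_X(s) is a polynomial and the derivatives at s = 1 reduce to finite sums, which makes these identities easy to verify exactly. A sketch (the fair six-sided die pmf is purely an illustration, not from the text):

```python
from fractions import Fraction

# pmf of a fair six-sided die: P(X = x) = 1/6 for x = 1, ..., 6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def G(s):
    # G_X(s) = sum over x of s^x P(X = x)
    return sum(s ** x * p for x, p in pmf.items())

def G1_at_1():
    # G'_X(1) = sum over x of x P(X = x) = E[X]
    return sum(x * p for x, p in pmf.items())

def G2_at_1():
    # G''_X(1) = sum over x of x(x - 1) P(X = x) = E[X(X - 1)]
    return sum(x * (x - 1) * p for x, p in pmf.items())

ex = G1_at_1()                  # E[X] = 7/2
var = G2_at_1() + ex - ex ** 2  # E[X(X-1)] + E[X] - E[X]^2 = 35/12
```

Using `Fraction` keeps the arithmetic exact, so the factorial-moment identities can be checked with equality rather than floating-point tolerance.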
Example 4.6
Solution
(a) G_X(s) = E[s^X] = Σ_{x=0}^∞ s^x P(X = x) = Σ_{x=0}^∞ s^x · (a^{−1})^x e^{−a^{−1}} / x! = e^{−1/a} Σ_{x=0}^∞ (s/a)^x / x!
= e^{−1/a} e^{s/a} = e^{(1/a)(s−1)}
(b) G'_X(s) = (1/a) e^{(1/a)(s−1)}, so
E[X] = G'_X(1) = 1/a
(c) G''_X(s) = (1/a²) e^{(1/a)(s−1)}, so G''_X(1) = 1/a², and
Var[X] = G''_X(1) + G'_X(1) − (G'_X(1))² = 1/a² + 1/a − 1/a² = 1/a
Exercises
Exercise 4.7
Solution
G_X(s) = E[s^X] = Σ_{x=0}^n s^x C(n, x) p^x (1 − p)^{n−x} = Σ_{x=0}^n C(n, x) (ps)^x (1 − p)^{n−x} = (ps + (1 − p))^n
G'_X(s) = n(ps + (1 − p))^{n−1} p
E[X] = G'_X(1) = n(p + (1 − p))^{n−1} p = np
G''_X(s) = n(n − 1)(ps + (1 − p))^{n−2} p²
G''_X(1) = n(n − 1)(p + (1 − p))^{n−2} p² = n(n − 1)p²
Var[X] = G''_X(1) + G'_X(1) − (G'_X(1))² = n(n − 1)p² + np − (np)² = np(1 − p)
Exercise 4.8
Find the probability generating function for this distribution and hence show
that it has mean (1 − p)/p and find the variance.
Solution
G_X(s) = E[s^X] = Σ_{x=0}^∞ s^x p(1 − p)^x = p Σ_{x=0}^∞ (s(1 − p))^x = p / (1 − s(1 − p))
G'_X(s) = p(1 − p) / (1 − s(1 − p))²
E[X] = G'_X(1) = (1 − p)/p
G''_X(s) = 2p(1 − p)² / (1 − s(1 − p))³
G''_X(1) = 2(1 − p)²/p²
Var[X] = G''_X(1) + G'_X(1) − (G'_X(1))² = 2(1 − p)²/p² + (1 − p)/p − (1 − p)²/p² = (1 − p)/p²
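A truncated-sum check of this result (p = 0.3 is an arbitrary choice; the tail beyond the truncation point is negligible):

```python
p = 0.3
N = 2000   # (1 - p)^N is vanishingly small, so the truncation error is negligible

# geometric pmf of Exercise 4.8: P(X = x) = p (1 - p)^x, x = 0, 1, ...
probs = [p * (1 - p) ** x for x in range(N)]
ex = sum(x * f for x, f in enumerate(probs))                   # ~ (1 - p)/p
var = sum(x * x * f for x, f in enumerate(probs)) - ex ** 2    # ~ (1 - p)/p^2
```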