
ECE342- Probability for Electrical & Computer Engineers

C. Tellambura and M. Ardakani

Winter 2013. Copyright 2013 C. Tellambura and M. Ardakani. All rights reserved.

Contents

1 Basics of Probability Theory
  1.1 Set theory
    1.1.1 Basic Set Operations
    1.1.2 Algebra of Sets
  1.2 Applying Set Theory to Probability
  1.3 Probability Axioms
  1.4 Some Consequences of Probability Axioms
  1.5 Conditional probability
  1.6 Independence
  1.7 Sequential experiments and tree diagrams
  1.8 Counting Methods
  1.9 Reliability Problems
  1.10 Illustrated Problems
  1.11 Solutions for the Illustrated Problems
  1.12 Drill Problems

2 Discrete Random Variables
  2.1 Definitions
  2.2 Probability Mass Function
  2.3 Cumulative Distribution Function (CDF)
  2.4 Families of Discrete RVs
  2.5 Averages
  2.6 Function of a Random Variable
  2.7 Expected Value of a Function of a Random Variable
  2.8 Variance and Standard Deviation
  2.9 Conditional Probability Mass Function
  2.10 Basics of Information Theory
  2.11 Illustrated Problems
  2.12 Solutions for the Illustrated Problems
  2.13 Drill Problems

3 Continuous Random Variables
  3.1 Cumulative Distribution Function
  3.2 Probability Density Function
  3.3 Expected Values
  3.4 Families of Continuous Random Variables
  3.5 Gaussian Random Variables
  3.6 Functions of Random Variables
  3.7 Conditioning a Continuous RV
  3.8 Illustrated Problems
  3.9 Solutions for the Illustrated Problems
  3.10 Drill Problems

4 Pairs of Random Variables
  4.1 Joint Probability Mass Function
  4.2 Marginal PMFs
  4.3 Joint Probability Density Function
  4.4 Marginal PDFs
  4.5 Functions of Two Random Variables
  4.6 Expected Values
  4.7 Conditioning by an Event
  4.8 Conditioning by an RV
  4.9 Independent Random Variables
  4.10 Bivariate Gaussian Random Variables
  4.11 Illustrated Problems
  4.12 Solutions for the Illustrated Problems
  4.13 Drill Problems

5 Sums of Random Variables
  5.1 Summary
    5.1.1 PDF of sum of two RVs
    5.1.2 Expected values of sums
    5.1.3 Moment Generating Function (MGF)
  5.2 Illustrated Problems
  5.3 Solutions for the Illustrated Problems
  5.4 Drill Problems

A 2009 Quizzes (Quiz Numbers 1-8)
B 2009 Quizzes: Solutions (Quiz Numbers 1-8)
C 2010 Quizzes (Quiz Numbers 1-7)
D 2010 Quizzes: Solutions (Quiz Numbers 1-7)
E 2011 Quizzes (Quiz Numbers 1-6)
F 2011 Quizzes: Solutions (Quiz Numbers 1, 2, 3, 5, 6)

Chapter 1 Basics of Probability Theory


Goals of ECE342
- Introduce the basics of probability theory.
- Apply probability theory to solve engineering problems.
- Develop intuition into how the theory applies to practical situations.

1.1 Set theory

A set can be described by the tabular method or the description method. Two special sets: (1) the universal set S and (2) the null set ∅.

1.1.1 Basic Set Operations

|A|: cardinality of A.
A ∪ B = {x | x ∈ A or x ∈ B}: union - either A or B occurs, or both occur.
A ∩ B = {x | x ∈ A and x ∈ B}: intersection - both A and B occur.
A - B = {x | x ∈ A and x ∉ B}: set difference.
Aᶜ = {x | x ∈ S and x ∉ A}: complement of A.
A1 ∪ A2 ∪ … ∪ An: union of n ≥ 2 events - one or more of the Ak occur.
A1 ∩ A2 ∩ … ∩ An: intersection of n ≥ 2 events - all Ak occur simultaneously.
Definition 1.1: A and B are disjoint if A ∩ B = ∅.
Definition 1.2: A collection of events A1, A2, …, An (n ≥ 2) is mutually exclusive if all pairs Ai and Aj (i ≠ j) are disjoint.
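These operations map directly onto Python's built-in set type. The following sketch (the sets S, A and B are made-up illustrations, not from the notes) checks each definition on a small universal set:

```python
S = {1, 2, 3, 4, 5, 6}          # universal set
A = {1, 2, 5}
B = {2, 4, 6}

print(len(A))                   # |A|, the cardinality -> 3
print(A | B)                    # union A ∪ B
print(A & B)                    # intersection A ∩ B -> {2}
print(A - B)                    # set difference A - B
print(S - A)                    # complement of A relative to S

# Definition 1.1: A and B are disjoint iff their intersection is empty.
print(A.isdisjoint(B))          # -> False, since 2 is in both
```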


1.1.2 Algebra of Sets

1. Union and intersection are commutative.
2. Union and intersection are distributive.
3. (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ - De Morgan's law.
4. Duality Principle.

1.2 Applying Set Theory to Probability

Definition 1.3: An experiment consists of a procedure and observations.
Definition 1.4: An outcome is any possible observation of an experiment.
Definition 1.5: The sample space S of an experiment is the finest-grain, mutually exclusive, collectively exhaustive set of all possible outcomes.
Definition 1.6: An event is a set of outcomes of an experiment.
Definition 1.7: A set of mutually exclusive sets (events) whose union equals the sample space is an event space of S. Mathematically, Bi ∩ Bj = ∅ for all i ≠ j and B1 ∪ B2 ∪ … ∪ Bn = S.
Theorem 1.1: For an event space B = {B1, B2, …, Bn} and any event A ⊂ S, let Ci = A ∩ Bi, i = 1, 2, …, n. For i ≠ j, the events Ci and Cj are mutually exclusive, i.e., Ci ∩ Cj = ∅, and A = C1 ∪ C2 ∪ … ∪ Cn.

1.3 Probability Axioms

Definition 1.8 (Axioms of Probability): A probability measure P[·] is a function that maps events in S to real numbers such that:
Axiom 1. For any event A, P[A] ≥ 0.
Axiom 2. P[S] = 1.
Axiom 3. For any countable collection A1, A2, … of mutually exclusive events, P[A1 ∪ A2 ∪ ⋯] = P[A1] + P[A2] + ⋯

Theorem 1.2: If A = A1 ∪ A2 ∪ ⋯ ∪ Am and Ai ∩ Aj = ∅ for i ≠ j, then P[A] = Σ_{i=1}^{m} P[Ai].
Theorem 1.3: The probability of an event B = {s1, s2, …, sm} is the sum of the probabilities of the outcomes in the event, i.e., P[B] = Σ_{i=1}^{m} P[{si}].

1.4 Some Consequences of Probability Axioms

Theorem 1.4: The probability measure P[·] satisfies
1. P[∅] = 0.
2. P[Aᶜ] = 1 - P[A].
3. For any A and B (not necessarily disjoint), P[A ∪ B] = P[A] + P[B] - P[A ∩ B].
4. If A ⊂ B, then P[A] ≤ P[B].
Theorem 1.5: For any event A and event space B = {B1, B2, …, Bm}, P[A] = Σ_{i=1}^{m} P[A ∩ Bi].

1.5 Conditional probability

The probability in Section 1.3 is also called a priori probability. If an event has happened, this information can be used to update the a priori probability.
Definition 1.9: The conditional probability of event A given B is P[A|B] = P[A ∩ B] / P[B].
To calculate P[A|B], find P[A ∩ B] and P[B] first.
Theorem 1.6 (Law of total probability): For an event space {B1, B2, …, Bm} with P[Bi] > 0 for all i, P[A] = Σ_{i=1}^{m} P[A|Bi] P[Bi].
Theorem 1.7 (Bayes' Theorem): P[B|A] = P[A|B] P[B] / P[A].


Theorem 1.8 (Bayes' Theorem - Expanded Version): P[Bi|A] = P[A|Bi] P[Bi] / Σ_{j=1}^{m} P[A|Bj] P[Bj].
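The law of total probability and the expanded Bayes formula can be checked numerically with exact rational arithmetic. A minimal sketch; the priors and likelihoods below are hypothetical numbers chosen for illustration, not from the notes:

```python
from fractions import Fraction as F

# Hypothetical event space {B1, B2, B3} with priors P[Bi] and
# likelihoods P[A|Bi]; illustrative values only.
priors = [F(1, 2), F(1, 3), F(1, 6)]          # sums to 1
likelihoods = [F(1, 10), F(1, 5), F(1, 2)]    # P[A|Bi]

# Law of total probability (Theorem 1.6): P[A] = sum_i P[A|Bi] P[Bi]
p_a = sum(l * p for l, p in zip(likelihoods, priors))

# Expanded Bayes (Theorem 1.8): P[Bi|A] = P[A|Bi] P[Bi] / P[A]
posteriors = [l * p / p_a for l, p in zip(likelihoods, priors)]

print(p_a)            # -> 1/5
print(posteriors)     # posterior over the event space, sums to 1
assert sum(posteriors) == 1
```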

1.6 Independence

Definition 1.10: Events A and B are independent if and only if P[A ∩ B] = P[A] P[B].
Relationship with conditional probability: P[A|B] = P[A] and P[B|A] = P[B] when A and B are independent.
Definition 1.11: Events A, B and C are independent if and only if
P[A ∩ B] = P[A] P[B]
P[B ∩ C] = P[B] P[C]
P[A ∩ C] = P[A] P[C]
P[A ∩ B ∩ C] = P[A] P[B] P[C].

1.7 Sequential experiments and tree diagrams

Many experiments consist of a sequence of trials (subexperiments) and can be visualized as multiple-stage experiments, which are conveniently represented by tree diagrams. The law of total probability is used with tree diagrams to compute event probabilities for such experiments.

1.8 Counting Methods

Definition 1.12: If task A can be done in n ways and task B in k ways, then A and B can be done in nk ways.
Definition 1.13: If task A can be done in n ways and task B in k ways, then either A or B can be done in n + k ways.
Here are some important cases:
- The number of ways to choose k objects out of n distinguishable objects (with replacement and with ordering) is nᵏ.
- The number of ways to choose k objects out of n distinguishable objects (without replacement and with ordering) is n(n-1)⋯(n-k+1).
- The number of ways to choose k objects out of n distinguishable objects (without replacement and without ordering) is (n choose k) = n! / (k! (n-k)!).
- The number of permutations of n objects of which n1 are alike, n2 are alike, …, nR are alike is n! / (n1! n2! ⋯ nR!).
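Python's standard library implements all four counting cases directly. An illustrative sketch (the values of n and k are arbitrary choices, not from the notes):

```python
from math import comb, perm, factorial

n, k = 10, 3
print(n ** k)            # with replacement, ordered: 1000
print(perm(n, k))        # without replacement, ordered: 10*9*8 = 720
print(comb(n, k))        # without replacement, unordered: 120

# Permutations with repeated groups, e.g. the 7 letters of "toronto"
# (3 o's, 2 t's, 1 r, 1 n): 7! / (3! 2! 1! 1!)
print(factorial(7) // (factorial(3) * factorial(2)))   # -> 420
```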

1.9 Reliability Problems

For n independent subsystems in series: P[W] = Π_{i=1}^{n} P[Wi].
For n independent subsystems in parallel: P[W] = 1 - Π_{i=1}^{n} (1 - P[Wi]).
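These two formulas reduce to short helper functions. A sketch under the stated independence assumption; the function names and the 0.9 component reliabilities are illustrative, not from the notes:

```python
from math import prod

def series_reliability(p):
    """P[W] for independent subsystems in series: all must work."""
    return prod(p)

def parallel_reliability(p):
    """P[W] for independent subsystems in parallel: at least one works."""
    return 1 - prod(1 - pi for pi in p)

# e.g. three components that each work with probability 0.9
print(series_reliability([0.9, 0.9, 0.9]))     # 0.9**3 = 0.729
print(parallel_reliability([0.9, 0.9, 0.9]))   # 1 - 0.1**3 = 0.999
```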

1.10 Illustrated Problems

1. True or False. Explain your answer in one line.
a) If A = {x² | 0 < x < 2, x ∈ ℝ} and B = {2x | 0 < x < 2, x ∈ ℝ} then A = B.
b) If A ⊂ B then A ∪ B = A.
c) If A ⊂ B and B ⊂ C then A ⊂ C.
d) For any A, B and C, A ∩ B ⊂ A ∪ C.
e) There exists a set A for which (A ∪ Sᶜ)ᶜ = A (S is the universal set).
f) For a sample space S and two events A and C, define B1 = A ∩ C, B2 = Aᶜ ∩ C, B3 = A ∩ Cᶜ and B4 = Aᶜ ∩ Cᶜ. Then {B1, B2, B3, B4} is an event space.

2. Using the algebra of sets, prove
a) A - (B ∩ C) = (A - B) ∪ (A - C),
b) A - (A ∩ B) = A - B.
3. Sketch A - B for a) A ⊂ B, b) B ⊂ A, c) A and B disjoint.

4. Consider the following subsets of S = {1, 2, 3, 4, 5, 6}: R1 = {1, 2, 5}, R2 = {3, 4, 5, 6}, R3 = {2, 4, 6}, R4 = {1, 3, 6}, R5 = {1, 3, 5}. Find:
a) R1 ∪ R2
b) R4 ∩ R5
c) R5ᶜ
d) (R1 ∪ R2) ∩ R3
e) R1ᶜ ∪ (R4 ∩ R5)
f) (R1 ∩ (R2 ∪ R3))ᶜ
g) ((R1 ∪ R2ᶜ) ∩ (R4 ∪ R5ᶜ))ᶜ
h) Write down a suitable event space.
5. Express the following sets in ℝ as a single interval:
a) ((-∞, 1) ∪ (4, ∞))ᶜ, b) [0, 1] ∩ [0.5, 2], c) [-1, 0] ∪ [0, 1].
6. By drawing a suitable Venn diagram, convince yourself of the following:
a) A ∪ (A ∩ B) = A, b) A ∩ (A ∪ B) = A.
7. Three telephone lines are monitored. At a given time, each telephone line can be in one of three modes: (1) Voice Mode, i.e., the line is busy and someone is speaking; (2) Data Mode, i.e., the line is busy with a modem or fax signal; and (3) Inactive Mode, i.e., the line is not busy. We denote these three modes by V, D and I respectively. For example, if the first and second lines are in Data Mode and the third line is in Inactive Mode, the observation is DDI.
a) Write the elements of the event A = {at least two Voice Modes}.
b) Write the elements of B = {number of Data Modes > 1 + number of Voice Modes}.
8. The data packets that arrive at an Internet switch are buffered to be processed. When the buffer is full, the arriving packet is dropped and the transmission must be repeated. To study this system, at the arrival time of any new packet we observe the number of packets already stored in the buffer. Since the switch can buffer a maximum of 5 packets, the used buffer at any given time is 0, 1, 2, 3, 4 or 5 packets. Thus the sample space for this experiment is S = {0, 1, 2, 3, 4, 5}. This experiment is repeated 500 times and the following data is recorded.

Used buffer                 0    1    2    3    4    5
Number of times observed  112  119  131   85   43   10

The relative frequency of an event A is defined as nA/n, where nA is the number of times A occurs and n is the total number of observations.
a) Consider the following three mutually exclusive events: A = {0, 1, 2}, B = {3, 4}, C = {5}. Find the relative frequency of each event.
b) Show that the relative frequency of A ∪ B ∪ C is equal to the sum of the relative frequencies of A, B and C.
9. Consider an elevator in a building with four stories, 1-4, with 1 being the ground floor. Three people enter the elevator on floor 1 and push buttons for their destination floors. Let the outcomes be the possible stopping patterns for all passengers to leave the elevator on the way up. For example, 2-2-4 means the elevator stops on floors 2 and 4. Therefore, 2-2-4 is an outcome in S.
a) List the sample space, S, with its elements (outcomes).
b) Consider all outcomes equally likely. What is the probability of each outcome?
c) Let E = {stops only on even floors} and T = {stops only twice}. Find P[E] and P[T].
d) Find P[E ∩ T].
e) Find P[E ∪ T].
f) Is P[E ∪ T] = P[E] + P[T]? Does this contradict the third axiom of probability?
10. This problem requires the use of event spaces. Consider a random experiment and four events A, B, C, and D such that A and B form an event space and also C and D form an event space. Furthermore, P[A ∩ C] = 0.3 and P[B ∩ D] = 0.25.
a) Find P[A ∪ C].
b) If P[D] = 0.58, find P[A].
11. Prove the following inequalities:
a) P[A ∪ B] ≤ P[A] + P[B].
b) P[A ∩ B] ≥ P[A] + P[B] - 1.


12. This problem requires the law of total probability and conditional probability. A study of the relation between family size and the number of cars reveals the following probabilities.

                           Number of cars
Family size                   0     1     2    More than 2
S: Small (2 or less)        0.04  0.14  0.02      0.00
M: Medium (3, 4 or 5)       0.02  0.33  0.23      0.02
L: Large (more than 5)      0.01  0.03  0.13      0.03

Answer the following questions:
a) What is the probability of a random family having less than 2 cars?
b) Given that a family has more than 2 cars, what is the probability that this family is large?
c) Given that a family has less than 2 cars, what is the probability that this family is large?
d) Given that the family size is not medium, what is the probability of having one car?
13. A communication channel model is shown in Fig. 1.1. The input is either 0 or 1, and the output is 0, 1 or X, where X represents a bit that is lost and does not arrive at the channel output. Also, due to noise and other imperfections, the channel may transmit a bit in error. When Input = 0, the correct output (Output = 0) occurs with probability 0.8, the incorrect output (Output = 1) with probability 0.1, and the bit is lost (Output = X) with probability 0.1. When Input = 1, the correct output (Output = 1) occurs with probability 0.7, the wrong output (Output = 0) with probability 0.2, and the bit is lost (Output = X) with probability 0.1. Assume that the inputs 0 and 1 are equally likely (i.e. P[0] = P[1]).
a) If Output = 1, what is the probability of Input = 1?
b) If the output is X, what is the probability of Input = 1, and what is the probability of Input = 0?
c) Repeat part a), but this time assume that the inputs are not equally likely and P[0] = 3P[1].
14. This problem requires Bayes' theorem. Considering all the other evidence, Sherlock was 60% certain that Jack is the criminal. This morning, he found


Figure 1.1: Communication Channel (transition diagram: inputs 0 and 1; outputs 0, 1 and X)

another piece of evidence proving that the criminal is left-handed. Dr. Watson just called and informed Sherlock that on average 20% of people are left-handed and that Jack is indeed left-handed. How certain of Jack's guilt should Sherlock be after receiving this call?
15. This problem requires Bayes' theorem. Two urns A and B each have 10 balls. Urn A has 3 green, 2 red and 5 white balls, and Urn B has 1 green, 6 red and 3 white balls. One urn is chosen at random (equally likely) and one ball is drawn from it (balls are also chosen equally likely).
a) What is the probability that this ball is red?
b) Given that the drawn ball is red, what is the probability that Urn A was selected?
c) Suppose the drawn ball is green. We now place this green ball in the other urn and draw a ball from that urn (the one that received the green ball). What is the probability that this ball is red?
16. Two urns are present: A with 1 blue and 6 red balls, and B with 6 blue and 1 red ball. Flip a coin. If the outcome is H, put one random ball from A in B; if the outcome is T, put one random ball from B in A. Now draw a ball from A. If blue, you win. If not, draw a ball from B; if blue you win, if red you lose. What is the probability of winning this game?
17. Two coins are in an urn. One is fair with P[H] = P[T] = 0.5, and one is biased with P[H] = 0.25 and P[T] = 0.75. One coin is chosen at random (equally likely) and is tossed three times.
a) Given that the biased coin is selected, what is the probability of TTT?
b) Given that the biased coin is selected and that the outcome of the first three tosses is TTT, what is the probability that the next toss is T?


Figure 1.2: for question 20.
c) This time, assume that we do not know which coin is selected. We observe that the first three outcomes are TTT. What is the probability that the next outcome is T?
d) Define two events. E1: the outcomes of the first three tosses are TTT; E2: the fourth toss is T. Are E1 and E2 independent?
e) Given that the biased coin is selected, are E1 and E2 independent?
18. Answer the following questions about rearranging the letters of the word "toronto".
a) How many different orders are there?
b) In how many of them does r appear before n?
c) In how many of them is the middle letter a consonant?
d) How many do not have any pair of consecutive o's?
19. Consider a class of 14 girls and 16 boys, where two of the girls are sisters. A team of 8 players is selected from this class at random.
a) What is the probability that the team consists of 4 girls and 4 boys?
b) What is the probability that the team is uni-gender (all boys or all girls)?
c) What is the probability that the number of girls is greater than the number of boys?
d) What is the probability that both sisters are on the team?
20. In the network (Fig. 1.2), a data packet is sent from A to B. In each step, the packet can be sent one block either to the right or up. Thus, a total of 9 steps are required to reach B.
a) How many paths are there from A to B?
b) If one of these paths is chosen randomly (equally likely), what is the probability that it passes through C?


Figure 1.3: Question 22.
21. A binary communication system transmits a signal X that is either a +2 voltage signal or a -2 voltage signal. These voltage signals are equally likely. A malicious channel reduces the magnitude of the received signal by the number of heads it counts in two tosses of a coin. Let Y be the resulting signal.
a) Describe the sample space in terms of input-output pairs.
b) Find the set of outcomes corresponding to the event "transmitted signal was definitely +2".
c) Describe in words the event corresponding to the outcome Y = 0.
d) Use a tree diagram to find the set of possible input-output pairs.
e) Find the probabilities of the input-output pairs.
f) Find the probabilities of the output values.
g) Find the probability that the input was X = +2 given that Y = k, for all possible values of k.
22. In a communication system the signal sent from point a to point b arrives along two paths in parallel (Fig. 1.3). Over each path the signal passes through two repeaters in series. Each repeater in Path 1 has a 0.05 probability of failing (because of an open circuit). This probability is 0.08 for each repeater on Path 2. All repeaters fail independently of each other.
a) Find the probability that the signal will not arrive at point b.

1.11 Solutions for the Illustrated Problems

1.


a) True. They both contain all real numbers between 0 and 4.
b) False. A ∪ B = B.
c) True. x ∈ A ⇒ x ∈ B ⇒ x ∈ C, therefore A ⊂ C.
d) True. Because (A ∩ B) ⊂ A and A ⊂ A ∪ C.
e) False. (A ∪ Sᶜ)ᶜ = (A ∪ ∅)ᶜ = Aᶜ. There is no set A such that A = Aᶜ.


f) True. The Bi are mutually exclusive and collectively exhaustive.
2. a) Starting with the right-hand side:
(A - B) ∪ (A - C) = (A ∩ Bᶜ) ∪ (A ∩ Cᶜ) = A ∩ (Bᶜ ∪ Cᶜ) = A ∩ (B ∩ C)ᶜ = A - (B ∩ C).
Both sides are therefore equal.
b) A - (A ∩ B) = A ∩ (A ∩ B)ᶜ = A ∩ (Aᶜ ∪ Bᶜ) = (A ∩ Aᶜ) ∪ (A ∩ Bᶜ) = ∅ ∪ (A ∩ Bᶜ) = A ∩ Bᶜ = A - B.
3. a) The null set: when A ⊂ B, A - B = ∅.
b) When B ⊂ A, A - B is the part of A lying outside B (a ring around B in the usual Venn sketch).
c) When A and B are disjoint, A - B = A.

4.
a) R1 ∪ R2 = {1, 2, 3, 4, 5, 6}
b) R4 ∩ R5 = {1, 3}
c) R5ᶜ = {2, 4, 6}
d) (R1 ∪ R2) ∩ R3 = {2, 4, 6}
e) R1ᶜ ∪ (R4 ∩ R5) = {1, 3, 4, 6}
f) (R1 ∩ (R2 ∪ R3))ᶜ = {1, 3, 4, 6}
g) ((R1 ∪ R2ᶜ) ∩ (R4 ∪ R5ᶜ))ᶜ = {3, 4, 5, 6}
h) One solution is {1, 2, 3} and {4, 5, 6}, which partition S into two disjoint sets.
5.
a) ((-∞, 1) ∪ (4, ∞))ᶜ = [1, 4]
b) [0, 1] ∩ [0.5, 2] = [0.5, 1]
c) [-1, 0] ∪ [0, 1] = [-1, 1]
6. Try drawing Venn diagrams.
7. A = {VVI, VVD, VVV, VIV, VDV, IVV, DVV}
B = {DDD, DDI, DID, IDD}
8.
a) nA/n = (112 + 119 + 131)/500 = 0.724; nB/n = (85 + 43)/500 = 0.256; nC/n = 10/500 = 0.02.
b) n_{A∪B∪C}/n = 500/500 = 1, and nA/n + nB/n + nC/n = 0.724 + 0.256 + 0.02 = 1, so the two are equal.

9.
a) S = {2-2-2, 2-2-3, 2-2-4, 2-3-3, 2-3-4, 2-4-4, 3-3-3, 3-3-4, 3-4-4, 4-4-4}
b) There are 10 elements in S, so the probability of each outcome is 1/10. To be mathematically rigorous, one can define 10 mutually exclusive outcomes E1 = {2-2-2}, E2 = {2-2-3}, …, E10 = {4-4-4}. These outcomes are also collectively exhaustive. Thus, using the second and third axioms of probability, P[E1] + P[E2] + ⋯ + P[E10] = P[S] = 1. Since these outcomes are equally likely, each has P[Ei] = 1/10.
c) E = {2-2-2, 2-2-4, 2-4-4, 4-4-4} and T = {2-2-3, 2-2-4, 2-3-3, 2-4-4, 3-3-4, 3-4-4}. Thus P[E] = 4/10 and P[T] = 6/10.
d), e) E ∩ T = {2-2-4, 2-4-4} and E ∪ T = {2-2-2, 2-2-3, 2-2-4, 2-3-3, 2-4-4, 3-3-4, 3-4-4, 4-4-4}. Thus P[E ∩ T] = 2/10 and P[E ∪ T] = 8/10.
f) It can be seen that P[E ∪ T] ≠ P[E] + P[T]. This does not contradict the third axiom, because the third axiom applies only to mutually exclusive (here, disjoint) events, and E and T are not disjoint.

10. Note that A = Bᶜ and C = Dᶜ.
a) P[B ∩ D] = P[Aᶜ ∩ Cᶜ] = P[(A ∪ C)ᶜ] = 1 - P[A ∪ C], so P[A ∪ C] = 1 - 0.25 = 0.75.
b) P[A ∪ C] = P[A] + P[C] - P[A ∩ C], so P[A] = P[A ∪ C] - P[C] + P[A ∩ C]. With P[C] = 1 - P[D] = 0.42, we get P[A] = 0.75 - 0.42 + 0.3 = 0.63.

11.
a) P[A ∪ B] = P[A] + P[B] - P[A ∩ B] ≤ P[A] + P[B], since P[A ∩ B] ≥ 0. Notice that from a) it can easily be concluded that P[A ∪ B ∪ C ∪ ⋯] ≤ P[A] + P[B] + P[C] + ⋯
b) P[A ∪ B] = P[A] + P[B] - P[A ∩ B] ≤ 1, so P[A ∩ B] ≥ P[A] + P[B] - 1.

12. Define A to be the event that a random family has less than two cars, and let N be the number of cars.
a) P[A ∩ S] = P[(N=0) ∩ S] + P[(N=1) ∩ S] = 0.04 + 0.14 = 0.18
P[A ∩ M] = 0.02 + 0.33 = 0.35
P[A ∩ L] = 0.01 + 0.03 = 0.04
P[A] = P[A ∩ S] + P[A ∩ M] + P[A ∩ L] = 0.18 + 0.35 + 0.04 = 0.57
b) P[L | N > 2] = P[L ∩ (N > 2)] / P[N > 2] = 0.03 / (0.00 + 0.02 + 0.03) = 0.6
c) P[L | N < 2] = P[L ∩ (N < 2)] / P[N < 2] = (0.01 + 0.03) / 0.57 = 0.04/0.57 = 4/57 ≈ 0.07
d) P[N = 1 | S ∪ L] = (P[S ∩ (N=1)] + P[L ∩ (N=1)]) / (P[S] + P[L])
= (0.14 + 0.03) / ((0.04 + 0.14 + 0.02 + 0.00) + (0.01 + 0.03 + 0.13 + 0.03)) = 0.17/0.40 = 0.425
13.
a) P[in=1 | out=1] = P[out=1|in=1] P[in=1] / (P[out=1|in=1] P[in=1] + P[out=1|in=0] P[in=0]) = (0.7)(0.5) / ((0.7)(0.5) + (0.1)(0.5)) = 0.875
b) P[in=1 | out=X] = (0.1)(0.5) / ((0.1)(0.5) + (0.1)(0.5)) = 0.5, and P[in=0 | out=X] = 1 - P[in=1 | out=X] = 0.5.
c) P[0] + P[1] = 1 and P[0] = 3P[1] give P[1] = 0.25. Then P[in=1 | out=1] = (0.7)(0.25) / ((0.7)(0.25) + (0.1)(0.75)) = 0.7
14. First define the events C = {Jack is the criminal} and L = {Jack is left-handed}. By Bayes' rule, P[C | L] = P[L|C] P[C] / P[L] = (1)(0.6) / P[L]. By the law of total probability, P[L] = P[L|Cᶜ] P[Cᶜ] + P[L|C] P[C] = (0.2)(0.4) + (1)(0.6) = 0.68. Hence P[C | L] = 0.6/0.68 ≈ 0.88.
15.
a) P[red] = P[red|A] P[A] + P[red|B] P[B] = (2/10)(1/2) + (6/10)(1/2) = 0.4
b) P[A | red] = P[red|A] P[A] / P[red] = (0.2)(0.5)/0.4 = 0.25
c) Let A and B denote drawing the first ball from urn A and urn B, respectively. Then
P[A | green] = P[green|A] P[A] / (P[green|A] P[A] + P[green|B] P[B]) = (0.3)(0.5) / ((0.3)(0.5) + (0.1)(0.5)) = 0.75
P[B | green] = 0.25
P[red | green] = P[red | A, green] P[A | green] + P[red | B, green] P[B | green] = (6/11)(0.75) + (2/11)(0.25) = 5/11
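The Bayes computations in questions 13-15 are easy to cross-check by machine. A minimal sketch for the channel of question 13, using exact fractions; the dictionary layout and function name are illustrative choices, not from the notes:

```python
from fractions import Fraction as F

# Channel of question 13: transition probabilities P[out | in].
trans = {0: {0: F(8, 10), 1: F(1, 10), 'X': F(1, 10)},
         1: {0: F(2, 10), 1: F(7, 10), 'X': F(1, 10)}}
prior = {0: F(1, 2), 1: F(1, 2)}          # equally likely inputs

def posterior(inp, out):
    # Bayes: P[in | out] = P[out|in] P[in] / sum_j P[out|j] P[j]
    denom = sum(trans[j][out] * prior[j] for j in prior)
    return trans[inp][out] * prior[inp] / denom

print(posterior(1, 1))     # 7/8 = 0.875, part a)
print(posterior(1, 'X'))   # 1/2, part b)
```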

16. Condition first on the coin and then on which ball is transferred (see the tree diagram in Figure 1.4).
If the coin shows H (probability 1/2), a random ball moves from A to B:
- the blue ball moves (probability 1/7): A = {6 red}, B = {7 blue, 1 red}; P[win] = 0 + (1)(7/8) = 7/8.
- a red ball moves (probability 6/7): A = {1 blue, 5 red}, B = {6 blue, 2 red}; P[win] = 1/6 + (5/6)(6/8) = 19/24.
If the coin shows T (probability 1/2), a random ball moves from B to A:
- a blue ball moves (probability 6/7): A = {2 blue, 6 red}, B = {5 blue, 1 red}; P[win] = 2/8 + (6/8)(5/6) = 7/8.
- the red ball moves (probability 1/7): A = {1 blue, 7 red}, B = {6 blue}; P[win] = 1/8 + (7/8)(1) = 1.
By the law of total probability,
P[win] = (1/2)[(1/7)(7/8) + (6/7)(19/24)] + (1/2)[(6/7)(7/8) + (1/7)(1)] = 0.848
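The answer 0.848 for question 16 can be sanity-checked by Monte Carlo simulation. An illustrative sketch (the seed and trial count are arbitrary choices, not from the notes):

```python
import random

def play(rng):
    A = ['b'] + ['r'] * 6          # urn A: 1 blue, 6 red
    B = ['b'] * 6 + ['r']          # urn B: 6 blue, 1 red
    if rng.random() < 0.5:         # heads: move a random ball A -> B
        B.append(A.pop(rng.randrange(len(A))))
    else:                          # tails: move a random ball B -> A
        A.append(B.pop(rng.randrange(len(B))))
    if A[rng.randrange(len(A))] == 'b':       # draw from A: blue wins
        return True
    return B[rng.randrange(len(B))] == 'b'    # else draw from B

rng = random.Random(0)
n = 200_000
wins = sum(play(rng) for _ in range(n))
print(wins / n)    # should be close to the exact answer 0.848
```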

17.
a) P[T1 T2 T3 | b] = (0.75)³ = 0.422
b) P[T4 | b, T1 T2 T3] = 0.75
c) P[T4 | T1 T2 T3] = P[T4 | b, T1 T2 T3] P[b | T1 T2 T3] + P[T4 | f, T1 T2 T3] P[f | T1 T2 T3]
P[T1 T2 T3] = P[T1 T2 T3 | b] P[b] + P[T1 T2 T3 | f] P[f] = 0.422 × 0.5 + (0.5)³ × 0.5 = 0.2735
P[b | T1 T2 T3] = P[T1 T2 T3 | b] P[b] / P[T1 T2 T3] = (0.422 × 0.5)/0.2735 = 0.77
P[f | T1 T2 T3] = 1 - P[b | T1 T2 T3] = 0.23
P[T4 | T1 T2 T3] = 0.75 × 0.77 + 0.5 × 0.23 = 0.69
d) No: if TTT happens, the probability that the biased coin was chosen increases, so E1 and E2 are not independent.
e) Yes: given the coin, the tosses are independent.

Figure 1.4: Tree Diagram for question 16

18.
a) (7 choose 3, 2, 1, 1) = 7! / (3! 2! 1! 1!) = 420
b) For every arrangement where r appears before n, there is a counterpart where n appears before r (just interchange r and n). Thus r appears before n in half of the arrangements. The answer, therefore, is 420/2 = 210.
c) The middle letter can be t, r or n. If t, we have 6!/3! = 120 arrangements. If r (or n), we have 6!/(3! 2!) = 60 arrangements. Total = 120 + 60 + 60 = 240.
d) We can think of it as _ X _ X _ X _ X _, where X represents the other letters and _ represents a potential location for o (this way consecutive o's are avoided). There are 5 locations for o and we want to pick three of them. Since order does not matter, the total number of ways is (5 choose 3) = 10. The other 4 letters (t, r, n, t) have 4!/2! = 12 arrangements among themselves to fill the X locations. So the total is 12 × 10 = 120.
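Because "toronto" has only 7 letters, all four counts in question 18 can be verified by brute-force enumeration. An illustrative sketch:

```python
from itertools import permutations

# Distinct orderings of the letters of "toronto" (duplicates collapsed by set)
arrs = set(permutations("toronto"))
print(len(arrs))                                           # a) 420

# b) r appears before n
print(sum(1 for a in arrs if a.index('r') < a.index('n')))  # 210

# c) middle (4th) letter is a consonant
print(sum(1 for a in arrs if a[3] != 'o'))                  # 240

# d) no two consecutive o's
print(sum(1 for a in arrs if 'oo' not in ''.join(a)))       # 120
```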

19.
a) C(14,4) C(16,4) / C(30,8) = (1001 × 1820)/5852925 = 1821820/5852925 ≈ 0.31
b) P[all girls] = C(14,8) C(16,0) / C(30,8) = 3003/5852925 = 0.000513
P[all boys] = C(14,0) C(16,8) / C(30,8) = 12870/5852925 = 0.0022
Therefore, P[one gender] = 0.0022 + 0.00051 = 0.00271
c) P[g > b] = [C(14,8) C(16,0) + C(14,7) C(16,1) + C(14,6) C(16,2) + C(14,5) C(16,3)] / C(30,8)
= (3003 + 54912 + 360360 + 1121120)/5852925 = 0.263
d) C(28,6) C(2,2) / C(30,8) = 376740/5852925 = 0.064
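The binomial coefficients in questions 19 and 20 can be evaluated with math.comb. An illustrative sketch reproducing the numbers above:

```python
from math import comb

total = comb(30, 8)                            # teams of 8 from 30 students

# 19 a) 4 girls and 4 boys
print(comb(14, 4) * comb(16, 4) / total)       # about 0.31

# 19 d) both sisters on the team
print(comb(28, 6) / total)                     # about 0.064

# 20 a) lattice paths: choose the 4 upward steps among 9
print(comb(9, 4))                              # 126

# 20 b) paths through C: (A->C) * (C->B) over all A->B paths
print(comb(4, 2) * comb(5, 3) / comb(9, 4))    # about 0.476
```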

a) We can look at this question as follows: from the 9 steps, 4 need to be upward and 5 to the right. Therefore, out of 9 steps we want to pick the 4 upward ones. We get
Number of paths = C(9, 4) = 126.
b) The number of paths from A to C (as in part a) is C(4, 2), and the number of paths from C to B is C(5, 3). Thus the number of paths from A to B which pass through C is C(4, 2) C(5, 3). So the required probability is
P[C] = C(4, 2) C(5, 3) / 126 = 60/126 = 0.476.
21. a)
If X = +2: HH → Y = 0, HT or TH → Y = +1, TT → Y = +2
If X = −2: HH → Y = 0, HT or TH → Y = −1, TT → Y = −2

S = {(+2, 0), (+2, +1), (+2, +2), (−2, 0), (−2, −1), (−2, −2)}
b) E = {+1, +2}
c) {Y = 0} = {number of heads tossed was 2}
d)


The tree (X chosen first, then the two tosses) gives:
X = +2 (prob 1/2): HH (1/4) → (+2, 0), prob 1/8; HT or TH (1/2) → (+2, +1), prob 1/4; TT (1/4) → (+2, +2), prob 1/8
X = −2 (prob 1/2): HH (1/4) → (−2, 0), prob 1/8; HT or TH (1/2) → (−2, −1), prob 1/4; TT (1/4) → (−2, −2), prob 1/8

e) P[+2, 0] = 1/8, P[+2, +1] = 1/4, P[+2, +2] = 1/8, P[−2, 0] = 1/8, P[−2, −1] = 1/4, P[−2, −2] = 1/8
f) P[Y = 0] = 1/4, P[Y = +1] = 1/4, P[Y = −1] = 1/4, P[Y = +2] = 1/8, P[Y = −2] = 1/8
g) P[X = −2 | Y = 0] = P[X = −2, Y = 0] / P[Y = 0] = (1/8)/(1/4) = 1/2
Similarly, P[X = +2 | Y = +1] = 1, P[X = +2 | Y = +2] = 1, P[X = +2 | Y = −1] = P[X = +2 | Y = −2] = 0.

22.

a) P[Path 1 fails] = P[(R1 fails) ∪ (R2 fails)]
= P[R1 fails] + P[R2 fails] − P[(R1 fails) ∩ (R2 fails)]
= 0.05 + 0.05 − (0.05)(0.05) = 0.0975
P[Path 2 fails] = P[R3 fails] + P[R4 fails] − P[(R3 fails) ∩ (R4 fails)]
= 0.08 + 0.08 − (0.08)(0.08) = 0.1536
P[fail] = P[(Path 1 fails) ∩ (Path 2 fails)] = (0.0975)(0.1536) = 0.014976
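The series/parallel reliability computation of Problem 22 can be sketched as follows, assuming independent component failures with the probabilities given in the solution.

```python
# Component failure probabilities from Problem 22 (assumed independent).
p_fail = {"R1": 0.05, "R2": 0.05, "R3": 0.08, "R4": 0.08}

def series_fails(*components):
    """A series path fails if ANY of its components fails (union of failures)."""
    works = 1.0
    for c in components:
        works *= 1 - p_fail[c]
    return 1 - works

path1 = series_fails("R1", "R2")   # 0.0975
path2 = series_fails("R3", "R4")   # 0.1536
system_fails = path1 * path2       # parallel paths: both must fail
assert abs(path1 - 0.0975) < 1e-9
assert abs(path2 - 0.1536) < 1e-9
assert abs(system_fails - 0.014976) < 1e-9
```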


Basics of Probability Theory

1.12 Drill Problems

Section 1.1,1.2,1.3 and 1.4 - Set theory and Probability axioms


1. A 6-sided die is tossed once. Let the event A be defined as A = "outcome is a prime number".
a) Write down the sample space S.
b) The die is unbiased (i.e., all outcomes are equally likely). What is the probability P[A] of event A?
c) Suppose that the die was biased such that the outcomes 2 through 6 are equally likely, and the outcome 1 is twice as likely as each of the others. What would the probability P[A] of event A have been?
Ans: a) S = {1, 2, 3, 4, 5, 6} b) P[A] = 0.5 c) P[A] = 3/7

2. An unbiased 4-sided die is tossed. Let the events A and B be defined as A = "outcome is a prime number" and B = {4}.
a) Find the probabilities P[A] and P[B].
b) What is A ∩ B? Write down P[A ∩ B].
c) What does this imply about A and B?
d) Find P[(A ∩ B)ᶜ].
Ans: a) P[A] = 0.5, P[B] = 0.25 b) A ∩ B = ∅, P[A ∩ B] = 0 c) mutually exclusive d) P[(A ∩ B)ᶜ] = 1

Section 1.6 - Independence


3. An unbiased 4-sided die is tossed. Let the events A and B be defined as A = "outcome is a prime number" and B = "outcome is an even number".
a) Find the probabilities P[A] and P[B].
b) What is A ∩ B? Write down P[A ∩ B].
c) Are A and B mutually exclusive?
d) Are A and B independent?
Ans: a) P[A] = 0.5, P[B] = 0.5 b) A ∩ B = {2}, P[A ∩ B] = 0.25 c) no d) yes

4. A pair of unbiased 6-sided dice (X and Y) is tossed simultaneously. Let the events A and B be defined as
A: X yields 2
B: Y yields 2
a) Write down the sample space S. Note that each outcome is a pair (x, y), where x, y ∈ {1, 2, 3, 4, 5, 6}.
b) Write A and B as sets of outcomes. Find the corresponding probabilities P[A] and P[B].
c) What is A ∩ B? Write down P[A ∩ B].
d) What is P[A ∪ B]?
e) Are A and B mutually exclusive?
f) Are A and B independent?
Ans:
a) S = {(x, y) : x, y ∈ {1, 2, 3, 4, 5, 6}}, i.e. all 36 ordered pairs (1, 1), (1, 2), . . . , (6, 6)
b) A = {(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6)}, B = {(1, 2), (2, 2), (3, 2), (4, 2), (5, 2), (6, 2)}, P[A] = 1/6, P[B] = 1/6
c) A ∩ B = {(2, 2)}, P[A ∩ B] = 1/36
d) P[A ∪ B] = 11/36
e) no
f) yes


Section 1.5 - Conditional Probability


5. In a certain experiment, A, B, C, and D are events with probabilities P[A] = 1/4, P[B] = 1/8, P[C] = 5/8, and P[D] = 3/8. A and B are disjoint, while C and D are independent. Hint: Venn diagrams are helpful for problems like this.
a) Find P[A ∩ B], P[A ∪ B], P[A ∩ Bᶜ], and P[A ∪ Bᶜ].
b) Are A and B independent?
c) Find P[C ∩ D], P[C ∩ Dᶜ], and P[Cᶜ ∩ Dᶜ].
d) Are Cᶜ and Dᶜ independent?
e) Find P[A|B] and P[B|A].
f) Find P[C|D] and P[D|C].
g) Verify that P[Cᶜ|D] = 1 − P[C] for this problem. Can you interpret its meaning?
Ans:
a) P[A ∩ B] = 0, P[A ∪ B] = 0.375, P[A ∩ Bᶜ] = 0.25, P[A ∪ Bᶜ] = 0.875
b) no
c) P[C ∩ D] = 0.2344, P[C ∩ Dᶜ] = 0.3906, P[Cᶜ ∩ Dᶜ] = 0.2344
d) yes
e) P[A|B] = 0, P[B|A] = 0
f) P[C|D] = 0.625, P[D|C] = P[D] = 3/8
6. Let A be an arbitrary event. Events D, E and F form an event space.
P[D] = 0.35, P[A|D] = 0.4
P[E] = 0.55, P[A|E] = 0.2
P[F] = ?, P[A|F] = 0.3
a) Find P[F] and P[A].
b) Find P[A ∩ D].
c) Use Bayes' rule to compute P[D|A] and P[E|A].
d) Can you compute P[F|A] without using Bayes' rule?
e) Compute P[Aᶜ|D], P[Aᶜ|E] and P[Aᶜ|F]. What is the axiom you had to use?
f) Use Bayes' rule to compute P[D|Aᶜ] and P[E|Aᶜ].


g) What theorem(s) do you need to compute P[Aᶜ]? Is the value in agreement with P[A] computed in part a)?
Ans:
a) P[F] = 0.1, P[A] = 0.28
b) P[A ∩ D] = 0.14
c) P[D|A] = 0.5, P[E|A] = 0.3929
d) P[F|A] = 1 − P[D|A] − P[E|A]
e) P[Aᶜ|D] = 0.6, P[Aᶜ|E] = 0.8, P[Aᶜ|F] = 0.7
f) P[D|Aᶜ] = 0.2917, P[E|Aᶜ] = 0.6111

Section 1.7 - Sequential experiments and tree diagrams


7. Tabulated below is the number of different electronic components contained in boxes B1 and B2.

      capacitors   diodes
B1        3           3
B2        1           5

A box is chosen at random, then a component is selected at random from the box. The boxes are equally likely to be selected. The selection of electronic components from the chosen box is also equally likely.
a) Draw a probability tree for the experiment.
b) What is the probability that the component selected is a diode?
c) Find the probability of selecting a capacitor from B1.
d) Suppose that the component selected is a capacitor. What is the probability that it came from B1?
Ans: b) 0.6667 c) 0.25 d) 0.75

8. Consider the following scenario at the quality assurance division of a certain manufacturing plant. In each lot of 100 items produced, two items are tested, and the whole lot is rejected if either of the tested items is found to be defective. The outcome of each test is independent of the other tests. Let q be the probability of an item being defective. Suppose A denotes the event "the lot under inspection is accepted", and k denotes the event "the lot has k defective items", where k ∈ {0, . . . , 100}.
a) Compute the probability P[k] of having k defective items in a lot.
b) Find the probability P[A ∩ k] that a lot with k defective items is accepted. Note: check whether your result for P[A ∩ k] is intuitive for both k = 0 and k = 99.
c) What is the conditional probability P[A|k] of a lot being accepted given it has k defective items?
Ans:
a) P[k] = C(100, k) qᵏ (1 − q)¹⁰⁰⁻ᵏ
b) P[A ∩ k] = C(98, k) qᵏ (1 − q)¹⁰⁰⁻ᵏ for k ∈ {0, .., 98}; 0 for k ∈ {99, 100}
c) P[A|k] = (1 − k/100)(1 − k/99) for k ∈ {0, .., 98}; 0 for k ∈ {99, 100}
9. In a binary digital communication channel the transmitter sends symbols {0, 1} over a noisy channel to the receiver. Channel-introduced errors may make the received symbol different from the transmitted one. Let Si = {the symbol i is sent} and Ri = {the symbol i is received}, where i ∈ {0, 1}. Relevant symbol and error probabilities are tabulated below.

 i    P[Si]   P[R0|Si]
 0     0.6      0.9
 1     0.4      0.05

a) Draw the corresponding probability tree.
b) Find the probability that a symbol is received in error.


c) Given that a "zero" is received, what is the conditional probability that a "zero" was sent?
d) Given that a "zero" is received, what is the conditional probability that a "one" was sent?
Ans: b) 0.08 c) 0.9643 d) 0.0357
10. In a ternary digital communication channel the transmitter sends symbols {0, 1, 2} over a noisy channel to the receiver. Channel-introduced errors may make the received symbol different from the transmitted one. Let Si = {the symbol i is sent} and Ri = {the symbol i is received}, where i ∈ {0, 1, 2}. Relevant symbol and error probabilities are tabulated below.

 i    P[Si]   P[R0|Si]   P[R1|Si]
 0     0.6      0.9        0.05
 1     0.3      0.049      0.95
 2     0.1      0.1        0.1

a) Draw the corresponding probability tree.
b) Find the probability that a symbol is received in error.
c) Given that a "zero" is received, what is the conditional probability that a "zero" was sent?
d) Given that a "zero" is received, what is the conditional probability that a "one" was sent?
e) Given that a "zero" is received, what is the conditional probability that a "two" was sent?
Ans: b) 0.095 c) 0.9563 d) 0.026 e) 0.0177
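The drill-10 answers follow from the law of total probability and Bayes' rule; this check (not part of the original text) uses the channel probabilities from the table above.

```python
# Ternary channel of drill 10: priors and conditional receive probabilities.
p_s = [0.6, 0.3, 0.1]                           # P[S_i]
p_r0 = [0.9, 0.049, 0.1]                        # P[R0 | S_i]
p_r1 = [0.05, 0.95, 0.1]                        # P[R1 | S_i]
p_r2 = [1 - a - b for a, b in zip(p_r0, p_r1)]  # each row must sum to 1

# b) a symbol is received in error when it differs from the sent symbol
p_correct = p_s[0] * p_r0[0] + p_s[1] * p_r1[1] + p_s[2] * p_r2[2]
assert abs((1 - p_correct) - 0.095) < 1e-9

# c)-e) posterior P[S_i | R0] by Bayes' rule
p_R0 = sum(ps * pr for ps, pr in zip(p_s, p_r0))
posterior = [p_s[i] * p_r0[i] / p_R0 for i in range(3)]
assert abs(posterior[0] - 0.9563) < 1e-3
assert abs(posterior[1] - 0.026) < 1e-3
assert abs(posterior[2] - 0.0177) < 1e-3
```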


11. In a binary digital communication channel the transmitter sends symbols {0, 1} over a noisy channel to the receiver. Channel-introduced errors may make the received symbol different from the transmitted one. Let Si = {the symbol i is sent} and Ri = {the symbol i is received}, where i ∈ {0, 1}. A block of two symbols is sent along the channel. Channel errors on different symbol periods can be deemed independent. Relevant symbol and error probabilities are tabulated below.

 i    P[Si]   P[R0|Si]
 0     0.6      0.9
 1     0.4      0.05

a) Draw the corresponding probability tree (two-stage). Note: a stage corresponds to a single transmitted symbol.
b) Find the probability that the block is received in error.
c) Given that "00" is received, what is the conditional probability that "00" was sent?
d) Given that "00" is received, what is the conditional probability that "01" was sent?
Ans: b) 0.1536 c) 0.9298 d) 0.0344
12. A machine produces photo detectors in pairs. Tests show that the first photo detector is acceptable with probability 0.6. When the first photo detector is acceptable, the second photo detector is acceptable with probability 0.85. If the first photo detector is defective, the second photo detector is acceptable with probability 0.35. Let Ai be the event "the i-th photo detector is acceptable".
a) Draw a suitable probability tree.
b) Describe the event (A1ᶜ ∩ A2) ∪ (A1 ∩ A2ᶜ) in words. Compute the corresponding probability.
c) What is the probability P[A1ᶜ ∩ A2ᶜ] that both photo detectors in a pair are defective?
d) Compute the probability P[A1|A2].

Ans: b) P[(A1ᶜ ∩ A2) ∪ (A1 ∩ A2ᶜ)] = 0.74 c) P[A1ᶜ ∩ A2ᶜ] = 0.49 d) P[A1|A2] = 0.7846


Section 1.8 - Counting methods


13. A hospital ward contains 15 male and 20 female patients. Five patients are randomly chosen to receive a special treatment. Find the probability of choosing:
a) at least one patient of each gender
b) at least two patients of each gender
c) all patients from the same gender
d) a group in which certain two male patients (say Tim and Joe) are not chosen at the same time
Ans: a) 0.943 b) 0.635 c) 0.057 d) 0.748
14. A bridge club has 12 members (six married couples). Four members are randomly selected to form the club executive. Find the probability that the executive consists of:
a) two men and two women
b) all men or all women
c) no married couples
d) at least two men
Ans: a) 0.4545 b) 0.0606 c) 0.0303 d) 0.7273


Figure 1.5: a system that includes both series and parallel subsystems

Section 1.9 - Reliability


15. Figure 1.5 shows a system in a reliability study composed of series and parallel subsystems. The subsystems are independent. P [W1 ] = 0.91, P [W2 ] = 0.87, P [W3 ] = 0.50, and P [W4 ] = 0.75. What is the probability that the system operates successfully? Ans 0.974

Chapter 2

Discrete Random Variables

2.1 Definitions

Definition 2.1: A random variable (RV) consists of an experiment with a probability measure P[·] defined on a sample space S and a function that assigns a real number to each outcome in the sample space of the experiment.
Definition 2.2: X is a discrete RV if its range is a countable set: SX = {x1, x2, · · · }. Further, X is a finite RV if its range is a finite set: SX = {x1, x2, . . . , xn}.

2.2 Probability Mass Function

Definition 2.3: The probability mass function (PMF) of the discrete RV X is defined as PX(a) = P[X = a].
Theorem 2.1: For a discrete RV with PMF PX(x) and range SX,
1. For any x, PX(x) ≥ 0.
2. Σ_{x∈SX} PX(x) = 1.
3. For any event B ⊂ SX, P[B] = Σ_{x∈B} PX(x).

2.3 Cumulative Distribution Function (CDF)

Definition 2.4: The cumulative distribution function (CDF) of a RV X is FX(r) = P[X ≤ r], where P[X ≤ r] is the probability that RV X is no larger than r.
Theorem 2.2: For a discrete RV X with SX = {x1, x2, · · · } and x1 ≤ x2 ≤ · · · :
1. FX(−∞) = 0 and FX(∞) = 1.
2. If xj ≥ xi, then FX(xj) ≥ FX(xi).
3. For a ∈ SX, lim_{ε→0} [FX(a) − FX(a − ε)] = PX(a).
4. FX(x) = FX(xi) for all x such that xi ≤ x < xi+1.
5. For b ≥ a, FX(b) − FX(a) = P[a < X ≤ b].

2.4 Families of Discrete RVs

Definition 2.5: X is a Bernoulli(p) RV if the PMF of X has the form
PX(x) = 1 − p for x = 0; p for x = 1; 0 otherwise,
with SX = {0, 1}.
Definition 2.6: X is a Geometric(p) RV if the PMF of X has the form
PX(x) = p(1 − p)^(x−1) for x = 1, 2, . . . ; 0 otherwise.
Definition 2.7: X is a Binomial(n, p) RV if the PMF of X has the form
PX(x) = C(n, x) p^x (1 − p)^(n−x) for x = 0, 1, 2, . . . , n; 0 otherwise,
where 0 < p < 1 and n is an integer with n ≥ 1.
Definition 2.8: X is a Pascal(k, p) RV (also known as a negative binomial RV) if the PMF of X has the form
PX(x) = C(x − 1, k − 1) p^k (1 − p)^(x−k) for x = k, k + 1, k + 2, . . . ; 0 otherwise,
where 0 < p < 1 and k is an integer with k ≥ 1.
Definition 2.9: X is a Discrete Uniform(k, l) RV if the PMF of X has the form
PX(x) = 1/(l − k + 1) for x = k, k + 1, k + 2, . . . , l; 0 otherwise,
where the parameters k and l are integers such that k < l.
Definition 2.10: X is a Poisson(λ) RV if the PMF of X has the form
PX(x) = λ^x e^(−λ)/x! for x = 0, 1, 2, . . . ; 0 otherwise,
where λ > 0.
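The PMF families above can be written as plain functions; this sketch (not from the original text) uses the same parameter names as the definitions and checks that each PMF sums to 1 over its range (truncated for the infinite ranges).

```python
from math import comb, exp, factorial

def bernoulli(x, p):
    return {0: 1 - p, 1: p}.get(x, 0.0)

def geometric(x, p):
    return p * (1 - p) ** (x - 1) if x >= 1 else 0.0

def binomial(x, n, p):
    return comb(n, x) * p**x * (1 - p) ** (n - x) if 0 <= x <= n else 0.0

def pascal(x, k, p):
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k) if x >= k else 0.0

def poisson(x, lam):
    return lam**x * exp(-lam) / factorial(x) if x >= 0 else 0.0

# Normalization checks (the infinite sums are truncated far into the tail).
assert abs(bernoulli(0, 0.3) + bernoulli(1, 0.3) - 1) < 1e-12
assert abs(sum(binomial(x, 10, 0.3) for x in range(11)) - 1) < 1e-12
assert abs(sum(geometric(x, 0.5) for x in range(1, 100)) - 1) < 1e-12
assert abs(sum(pascal(x, 3, 0.5) for x in range(3, 200)) - 1) < 1e-9
assert abs(sum(poisson(x, 2.0) for x in range(60)) - 1) < 1e-12
```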

2.5 Averages

Definition 2.11: A mode of X is a number xmod satisfying PX(xmod) ≥ PX(x) for all x.
Definition 2.12: A median of X is a number xmed satisfying P[X < xmed] = P[X > xmed].
Definition 2.13: The mean (aka expected value or expectation) of X is
E[X] = μX = Σ_{x∈SX} x PX(x).
Theorem 2.3:
1. If X is Bernoulli(p), then E[X] = p.
2. If X is Geometric(p), then E[X] = 1/p.
3. If X is Poisson(λ), then E[X] = λ.
4. If X is Binomial(n, p), then E[X] = np.
5. If X is Pascal(k, p), then E[X] = k/p.
6. If X is Discrete Uniform(k, l), then E[X] = (k + l)/2.

2.6 Function of a Random Variable

Theorem 2.4: For a discrete RV X, the PMF of Y = g(X) is
PY(y) = P[Y = y] = Σ_{x: g(x)=y} PX(x),
i.e., P[Y = y] is the sum of the probabilities of all the events X = x for which g(x) = y.

2.7 Expected Value of a Function of a Random Variable

Theorem 2.5: Given X with PMF PX(x) and Y = g(X), the expected value of Y is
E[Y] = μY = E[g(X)] = Σ_{x∈SX} g(x) PX(x).

2.8 Variance and Standard Deviation

Definition 2.14: The variance of RV X is
VAR[X] = σX² = E[(X − μX)²] = Σ_{x∈SX} (x − μX)² PX(x) ≥ 0.
Equivalently, the expected value of Y = (X − μX)² is VAR[X].
Definition 2.15: The standard deviation of RV X is σX = √VAR[X].
Theorem 2.6: VAR[X] = E[X²] − (E[X])².
Theorem 2.7: For any two constants a and b, VAR[aX + b] = a² VAR[X].
Theorem 2.8:
1. If X is Bernoulli(p), then VAR[X] = p(1 − p).
2. If X is Geometric(p), then VAR[X] = (1 − p)/p².
3. If X is Binomial(n, p), then VAR[X] = np(1 − p).
4. If X is Pascal(k, p), then VAR[X] = k(1 − p)/p².
5. If X is Poisson(λ), then VAR[X] = λ.
6. If X is Discrete Uniform(k, l), then VAR[X] = (l − k)(l − k + 2)/12.
Definition 2.16: For RV X, (a) the n-th moment is E[X^n]; (b) the n-th central moment is E[(X − μX)^n].
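Theorems 2.6–2.8 can be checked numerically for one example; the sketch below (not from the original text) picks a Binomial(8, 0.3) RV and arbitrary constants a and b.

```python
from math import comb

n, p = 8, 0.3
pmf = {x: comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)}

mean = sum(x * px for x, px in pmf.items())
second_moment = sum(x * x * px for x, px in pmf.items())
var = second_moment - mean**2          # Theorem 2.6: E[X^2] - (E[X])^2

assert abs(mean - n * p) < 1e-12       # Theorem 2.3: E[X] = np
assert abs(var - n * p * (1 - p)) < 1e-12  # Theorem 2.8: VAR[X] = np(1-p)

# Theorem 2.7: VAR[aX + b] = a^2 VAR[X]
a, b = 3, 5
var_ax_b = sum((a * x + b) ** 2 * px for x, px in pmf.items()) \
           - (a * mean + b) ** 2
assert abs(var_ax_b - a * a * var) < 1e-9
```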

2.9 Conditional Probability Mass Function

Definition 2.17: Given the event B with P[B] > 0, the conditional probability mass function of X is PX|B(x) = P[X = x|B].
Theorem 2.9: For B ⊂ SX,
PX|B(x) = P[X = x, B]/P[B] = PX(x)/P[B] for x ∈ B; 0 otherwise.
Definition 2.18: The conditional expected value of RV X given condition B is
E[X|B] = μX|B = Σ_{x∈B} x PX|B(x).
Theorem 2.10: The conditional expected value of Y = g(X) given condition B is
E[Y|B] = μY|B = Σ_{x∈B} g(x) PX|B(x).

2.10 Basics of Information Theory

Definition 2.19: The information content of any event A is defined as
I(A) = −log2 P[A].
This definition is extended to a random variable X.
Definition 2.20: The information content of X is defined as
I(X) = E[−log2 PX(X)] = −Σ_x PX(x) log2 PX(x).
I(X) is measured in bits.
Suppose X produces symbols s1, s2, . . . , sn, and a binary code is used to represent the symbols. Let li be the number of bits used to represent si, for i = 1, . . . , n.
Definition 2.21: The average length of the code is E[L] = Σ_i pi li.
Definition 2.22: The efficiency of the code is defined as I(X)/E[L] × 100%.
Theorem 2.11 (Huffman's Algorithm):
1. Write the symbols in decreasing order with their probabilities.
2. Merge in pairs from the bottom and reorder.
3. Repeat until one symbol is left.
4. Code each branch with "1" or "0".
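The merging steps of Huffman's algorithm can be sketched with a heap; the four-symbol source and its probabilities below are a made-up example, not from the text.

```python
import heapq
from itertools import count

def huffman(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> codeword."""
    tick = count()  # tie-breaker so heapq never compares dicts
    heap = [(p, next(tick), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # merge the two least probable...
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}   # ...prefixing their
        merged.update({s: "1" + w for s, w in c1.items()})  # codewords
        heapq.heappush(heap, (p0 + p1, next(tick), merged))
    return heap[0][2]

code = huffman({"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125})
lengths = {s: len(w) for s, w in code.items()}
# For these dyadic probabilities E[L] equals I(X) = 1.75 bits (100% efficiency).
assert lengths == {"s1": 1, "s2": 2, "s3": 3, "s4": 3}
```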


2.11 Illustrated Problems

1. Two transmitters send messages through bursts of radio signals to an antenna. During each time slot each transmitter sends a message with probability 1/2. Simultaneous transmissions result in loss of the messages. Let X be the number of time slots until the first message gets through. Let Ai be the event that a message is transmitted successfully during the i-th time slot.
a) Describe the underlying sample space S of this random experiment (in terms of Ai and Aiᶜ) and specify the probabilities of its outcomes.
b) Show the mapping from S to SX, the range of X.
c) Find the probability mass function of X.
d) Find the cumulative distribution function of X.
2. An experiment consists of tossing a fair coin until either three heads or two tails have appeared (not necessarily in a row). Let X be the number of tosses required.
a) Describe the underlying sample space S of this random experiment using a tree diagram and specify the probabilities of its outcomes.
b) Show the mapping from S to SX, the range of X.
c) Find the probability mass function of X.
d) Find the cumulative distribution function of X.
3. Ten balls numbered from 1 to 10 are in an urn. Four balls are to be chosen at random (equally likely) and without replacement. We define a random variable X which is the maximum of the four drawn balls (e.g., if the drawn balls are numbered 3, 2, 8 and 6, then X = 8).
a) What is the range of X, SX?
b) Find the PMF of X and plot it.
c) Find the probability that X is greater than or equal to 7.
4. The Oilers and Sharks play a best-of-7 playoff series. The series ends as soon as one of the teams has won 4 games. Assume that the Sharks (Oilers)


are likely to win any game with a probability of 0.45 (0.55), independently of any other game played. For n = 4, 5, 6, 7 define the events On = {Oilers win the series in n games} and Sn = {Sharks win the series in n games}.
a) Suppose the total number of games played in the series is N. Describe the event {N = n} in terms of On and Sn and find the PMF of N.
b) Let W be the number of Oilers wins in the series. Now, express the events {W = n} for n = 0, 1, . . . , 4 in terms of On and Sn and find the PMF of W.
5. The random variable X has PMF
PX(x) = k(cx + 1)/(x² + 1) for x = −2, −1, 0, 1, 2; 0 otherwise.
a) Find the value of k and the range of c for which this is a valid PMF.
b) For c = 0 and the k found in part a, compute and plot the CDF of X.
c) Compute the mean and the variance of X.
6. The CDF of a random variable is as follows:
FX(a) = r for a < 1; 0.3 for 1 ≤ a < 3; s for 3 ≤ a < 4; 0.9 for 4 ≤ a < 6; t for 6 ≤ a.
a) What are the values of r and t and the valid range of s?
b) What is P[2 < X ≤ 5]?
c) Knowing that P[X = 3] = P[X = 4], find s and plot the PMF of X.
7. Studies show that 20% of people are left handed. Also, it is known that 15% of people are allergic to dust.
a) What is the probability that in a class of 40 students, exactly 8 students are left handed?
b) Assuming that being left handed is independent of being allergic to dust, what is the probability that in a class of 30 students more than 2 students are both left handed and allergic to dust?
c) To study a new allergy medicine, the goal is to select a group of 10 people that are allergic to dust. Randomly selected people are tested to check whether or not they are allergic to dust. What is the probability that after testing exactly 75 people, the needed group of 10 is found?

8. A game is played with probability of a win P[W] = 0.4. If the player wins 10 times (not necessarily consecutive) before failing 3 times (not necessarily consecutive), a $100 award is given. What is the probability that the award is won? Hint: Identify all award-winning cases (10W, 10W + 1F, 10W + 2F) and notice that all award-winning cases finish with a W.
9. Phone calls received on a cell phone are totally random in time. Therefore (as we proved in class), the number of telephone calls received in a 1 hour period is a Poisson random variable. If the average number of calls received during 1 hour is 2 (meaning that λ = 2), answer the following questions:
a) What is the probability that exactly 2 calls are received during this one hour period?
b) The cell phone is turned off for 15 minutes; what is the probability that no call is missed?
c) What is the probability that exactly 2 calls are received during this one hour period and both calls are received in the first 30 minutes?
d) Find the standard deviation of the number of calls received in 15 minutes.
10. A stop-and-wait protocol is a simple network data transmission protocol in which both the sender and receiver participate. In its simplest form, this protocol is based on one sender and one receiver. The sender establishes the connection and sends data in packets. Each data packet is acknowledged by the receiver with an acknowledgement packet. If a negative acknowledgement arrives (i.e., the received packet contains errors), the sender retransmits the packet. Now consider the use of this protocol on a network with packet error rate 1/70 (acknowledgement packets are assumed to be received perfectly). Let X be the number of transmissions necessary to send one packet successfully.
a) Find the probability mass function of X.
b) Find the mean and variance of X.
c) If successful transmission does not take place in 12 attempts, the sender declares a transmission failure. Find the probability of a transmission failure.
d) Assume that 100 packets are to be transmitted. Let Y be the number of transmissions necessary to send all 100 packets. Find the probability mass function of Y.
e) Find the mean and variance of Y.
11. Find the n-th moment and the n-th central moment of X ~ Bernoulli(p).

12. The random variable X has PMF
PX(x) = c/(1 + x²) for x = −3, −2, . . . , 3; 0 otherwise.
a) Compute FX(x).
b) Compute E[X] and VAR[X].
c) Consider the function Y = 2X². Find PY(y).
d) Compute E[Y] and VAR[Y].
13. Consider a source sending messages through a noisy binary symmetric channel (BSC); for example, a CD player reading from a scratched music CD, or a wireless cellphone capturing a weak signal from a relay tower that is too far away. For simplicity, assume that the message being sent is a sequence of 0s and 1s. The BSC parameter is p. That is, when a 0 is sent, the probability that a 0 is (correctly) received is p and the probability that a 1 is (incorrectly) received is 1 − p. Likewise, when a 1 is sent, the probability that a 1 is (correctly) received is p and the probability that a 0 is (incorrectly) received is 1 − p. Let p = 0.97 for the BSC. Suppose the all-zero byte (i.e. 8 zeros) is transmitted over this channel. Let X be the number of 1s in the received byte.
a) Find the probability mass function of X.
b) Compute E[X] and VAR[X].
c) Suppose that in all transmitted bytes, the eighth bit is reserved for parity (even parity is set for the whole byte), so that the receiver can perform error detection. Let E be the event of an undetectable error. Describe E in terms of X. Find P[E].
14. The random variable X has PMF
PX(x) = c/(1 + x²) for x = −3, −2, . . . , 3; 0 otherwise.
a) Define the event B = {X ≥ 0}. Compute PX|B(x).
b) Compute FX|B(x).
c) Compute E[X|B] and VAR[X|B].
15. Let X be a Binomial(8, 0.3) random variable.
a) Find the standard deviation of X.
b) Define B = {X is odd}. Find PX|B(x).

c) Find E[X|B].
d) Find VAR[X|B].

2.12 Solutions for the Illustrated Problems

1.


a) Ai: exactly one of the two transmitters sends a message in the i-th time slot, P[Ai] = 1/4 + 1/4 = 1/2.
Aiᶜ: both or neither of them sends a message in the i-th time slot, P[Aiᶜ] = 1/2.
S = {A1, A1ᶜA2, A1ᶜA2ᶜA3, . . . , A1ᶜA2ᶜ · · · A(n−1)ᶜAn, . . .}
b) The outcome A1ᶜ · · · A(t−1)ᶜAt maps to X = t, so SX = {1, 2, 3, . . .}.
c) PX(t) = (1/2)^t for t ∈ {1, 2, . . .}; 0 otherwise.
d) FX(t) = 0 for t < 1, and for n − 1 ≤ t < n (n = 2, 3, . . .)
FX(t) = 1/2 + 1/4 + · · · + 1/2^(n−1) = 1 − 1/2^(n−1),
or equivalently FX(t) = FX(t − 1) + (1/2)^(n−1) for n − 1 ≤ t < n.

2. a) Using a tree diagram (each toss is H or T with probability 1/2, stopping once three heads or two tails have appeared), the outcome probabilities are:
P[HHH] = 1/8, P[HHTH] = P[HTHH] = P[HTHT] = 1/16, P[HTT] = 1/8,
P[THHH] = P[THHT] = 1/16, P[THT] = 1/8, P[TT] = 1/4, P[HHTT] = 1/16.
b) TT maps to X = 2; HHH, HTT, THT map to X = 3; HHTH, HTHH, HTHT, HHTT, THHH, THHT map to X = 4. So SX = {2, 3, 4}.
c) PX(t) = 1/4 for t = 2; 3/8 for t = 3; 3/8 for t = 4; 0 otherwise.
d) FX(t) = 0 for t < 2; 1/4 for 2 ≤ t < 3; 5/8 for 3 ≤ t < 4; 1 for t ≥ 4.
3. a) SX = {4, 5, 6, 7, 8, 9, 10}


b) The event X = n means one of the four drawn balls is n and the other three are chosen from the n − 1 balls with numbers less than n. Hence
P[X = n] = C(n − 1, 3)/C(10, 4) = (n − 1)(n − 2)(n − 3)/1260.
P[X = 4] = (3·2·1)/1260 = 0.0048, P[X = 5] = (4·3·2)/1260 = 0.019,
P[X = 6] = (5·4·3)/1260 = 0.048, P[X = 7] = (6·5·4)/1260 = 0.095,
P[X = 8] = (7·6·5)/1260 = 0.167, P[X = 9] = (8·7·6)/1260 = 0.267,
P[X = 10] = (9·8·7)/1260 = 0.4.
c) P[X ≥ 7] = 0.095 + 0.167 + 0.267 + 0.4 = 0.929
4. a) {N = n} is the event that the series ends in n games. This means either Sn or On occurs: {in the first n − 1 games, Sharks win 3 times (and Oilers win n − 4 times) and in the n-th game, Sharks win} or {in the first n − 1 games, Oilers win 3 times (and Sharks win n − 4 times) and in the n-th game, Oilers win}.
{N = n} = On ∪ Sn
P[N = n] = C(n − 1, 3)(0.45)⁴(0.55)^(n−4) + C(n − 1, 3)(0.45)^(n−4)(0.55)⁴
b) {W = 0} = S4, {W = 1} = S5, {W = 2} = S6, {W = 3} = S7, {W = 4} = O4 ∪ O5 ∪ O6 ∪ O7
P[W = 0] = (0.45)⁴ = 0.041
P[W = 1] = C(4, 3)(0.45)⁴(0.55) = 0.09
P[W = 2] = C(5, 3)(0.45)⁴(0.55)² = 0.124
P[W = 3] = C(6, 3)(0.45)⁴(0.55)³ = 0.136
P[W = 4] = 0.608
5. a)

Summing the PMF over its range:
Σ PX(x) = k[(1 − 2c)/5 + (1 − c)/2 + 1 + (1 + c)/2 + (1 + 2c)/5] = k · 12/5 = 1 ⟹ k = 5/12.
We also need PX(2) ≥ 0 ⟹ 1 + 2c ≥ 0 ⟹ c ≥ −0.5, and PX(−2) ≥ 0 ⟹ 1 − 2c ≥ 0 ⟹ c ≤ 0.5; thus −0.5 ≤ c ≤ 0.5.
b) With c = 0 and k = 5/12 we have
PX(x) = 1/12 for x = ±2; 5/24 for x = ±1; 5/12 for x = 0; 0 otherwise,
so
FX(x) = 0 for x < −2; 1/12 for −2 ≤ x < −1; 7/24 for −1 ≤ x < 0; 17/24 for 0 ≤ x < 1; 22/24 for 1 ≤ x < 2; 1 for x ≥ 2.
c) E[X] = (1/12)(−2) + (5/24)(−1) + (5/12)(0) + (5/24)(+1) + (1/12)(+2) = 0 and
VAR[X] = E[(X − 0)²] = E[X²] = (1/12)(−2)² + (5/24)(−1)² + (5/12)(0) + (5/24)(+1)² + (1/12)(+2)² = 13/12.
6. a) r = 0, t = 1, 0.3 ≤ s ≤ 0.9. Recall that FX(−∞) = 0, FX(∞) = 1 and that FX(a) is non-decreasing.
b) P[a < X ≤ b] = FX(b) − FX(a), so P[2 < X ≤ 5] = FX(5) − FX(2) = 0.9 − 0.3 = 0.6.
c) P[X = 3] = lim_{ε→0} [FX(3) − FX(3 − ε)] = s − 0.3
P[X = 4] = lim_{ε→0} [FX(4) − FX(4 − ε)] = 0.9 − s
s − 0.3 = 0.9 − s ⟹ s = 0.6
FX(a) = 0 for a < 1; 0.3 for 1 ≤ a < 3; 0.6 for 3 ≤ a < 4; 0.9 for 4 ≤ a < 6; 1 for 6 ≤ a.
PX(t) = 0.3 for t ∈ {1, 3, 4}; 0.1 for t = 6; 0 otherwise.
Notice that Σ_t PX(t) = 1.
7. a) X is Binomial(40, 0.2):
P[X = 8] = C(40, 8)(0.2)⁸(0.8)³²
b) P[both] = 0.2 × 0.15 = 0.03, so Y is Binomial(30, 0.03):
P[Y > 2] = 1 − P[Y = 0] − P[Y = 1],
where P[Y = 0] = C(30, 0)(0.97)³⁰(0.03)⁰ and P[Y = 1] = C(30, 1)(0.97)²⁹(0.03)¹.
c) The last person tested is allergic (since the group is then formed and no more tests are needed), so Z is Pascal(10, 0.15):
P[Z = 75] = C(74, 9)(0.15)¹⁰(1 − 0.15)⁶⁵
8. Award-winning cases all end with W and thus can be modeled with Pascal(10, 0.4).

X = 10 (10W): a = C(9, 9)(0.4)¹⁰(0.6)⁰
X = 11 (10W, 1F): b = C(10, 9)(0.4)¹⁰(0.6)¹
X = 12 (10W, 2F): c = C(11, 9)(0.4)¹⁰(0.6)²
P[$100] = a + b + c

9. a) P[X = 2] = e⁻² 2²/2! = 0.27
b) λ = 2/60 per minute, so for 15 minutes λ = 15 × 2/60 = 0.5 and
P[X = 0] = e⁻⁰·⁵ (0.5)⁰/0! = 0.6.
c) P[2 in first 30 & 0 in second 30] = P[2 in first 30] P[0 in second 30]
= [e⁻¹ 1²/2!][e⁻¹ 1⁰/0!] = 0.068.
Notice that for 30 minutes, λ = 30 × 2/60 = 1.
d) For 15 minutes we saw that λ = 0.5. We also know that for a Poisson RV, VAR = λ. Thus std = √VAR[X] = √0.5 = 0.71.
10. a) The probability that X = n is the probability that the first n − 1 transmissions were unsuccessful and the n-th transmission is successful [Geometric RV with probability of success p = 69/70]:
P[X = n] = (1/70)^(n−1) (69/70)
b) X is a Geometric RV: E[X] = 1/p = 70/69 = 1.014 and VAR[X] = (1 − p)/p² = 0.0147.
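The Poisson numbers of Problem 9 can be reproduced directly; the sketch below (not from the original text) rescales λ with the interval length as in the solution.

```python
from math import exp, factorial, sqrt

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

# a) exactly 2 calls in one hour (lambda = 2)
assert abs(poisson_pmf(2, 2.0) - 0.27) < 0.005

# b) no calls in 15 minutes: lambda scales to 2/4 = 0.5
assert abs(poisson_pmf(0, 0.5) - 0.6065) < 1e-3

# c) 2 calls in the first 30 minutes AND 0 in the second 30 (lambda = 1 each)
p = poisson_pmf(2, 1.0) * poisson_pmf(0, 1.0)
assert abs(p - 0.068) < 1e-3

# d) standard deviation over 15 minutes: sqrt(lambda)
assert abs(sqrt(0.5) - 0.71) < 0.005
```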
c) P[failure] = P[X > 12] = 1 − P[X ≤ 12] = 1 − Σ_{n=1}^{12} (69/70)(1/70)^(n−1)
= 1 − (69/70) · [1 − (1/70)¹²]/[1 − 1/70] = (1/70)¹² = 7.2 × 10⁻²³

Alternative solution:
P[failure] = P[X > 12] = Σ_{n=13}^{∞} (69/70)(1/70)^(n−1) = Σ_{m=0}^{∞} (69/70)(1/70)^(m+12)
= (69/70)(1/70)¹² Σ_{m=0}^{∞} (1/70)^m = (69/70)(1/70)¹² · 1/(1 − 1/70) = (1/70)¹²
Without detailed derivation, it could easily be argued that the solution is (1/70)¹². How?
d) The probability that Y = n (n ≥ 100) is the probability that in the first n − 1 transmissions exactly 99 were successful and the n-th transmission is also successful [in other words, Y is a Pascal(100, 69/70) RV]. Therefore:
P[Y = n] = C(n − 1, 99)(69/70)¹⁰⁰(1/70)^(n−100)

e) Y is a Pascal random variable. Thus E[Y] = k/p = 100 × (70/69) = 101.45 and VAR[Y] = k(1 − p)/p² = 1.47.
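The stop-and-wait numbers of Problem 10 can be checked with the geometric and Pascal formulas, assuming per-transmission success probability p = 69/70 as in the solution.

```python
p = 69 / 70  # probability a single transmission succeeds

# b) Geometric mean and variance
assert abs(1 / p - 1.0145) < 1e-3
assert abs((1 - p) / p**2 - 0.0147) < 1e-3

# c) transmission failure = no success in 12 attempts = (1/70)^12
p_failure = (1 - p) ** 12
assert p_failure < 1e-21   # about 7.2e-23

# e) Pascal(100, p): E[Y] = k/p, VAR[Y] = k(1-p)/p^2
k = 100
assert abs(k / p - 101.45) < 0.01
assert abs(k * (1 - p) / p**2 - 1.47) < 0.01
```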

11. E[Xⁿ] = 1ⁿ · p + 0ⁿ · q = p
E[(X − μ)ⁿ] = E[(X − p)ⁿ] = (1 − p)ⁿ p + (−p)ⁿ q


12. a) Σ_{x=−3}^{3} c/(1 + x²) = 1 ⟹ c(1/10 + 1/5 + 1/2 + 1 + 1/2 + 1/5 + 1/10) = 1 ⟹ c = 5/13, so
PX(x) = 1/26 for x = ±3; 2/26 for x = ±2; 5/26 for x = ±1; 10/26 for x = 0; 0 otherwise.
FX(x) = 0 for x < −3; 1/26 for −3 ≤ x < −2; 3/26 for −2 ≤ x < −1; 8/26 for −1 ≤ x < 0; 18/26 for 0 ≤ x < 1; 23/26 for 1 ≤ x < 2; 25/26 for 2 ≤ x < 3; 1 for x ≥ 3.
b) E[X] = Σ_{x=−3}^{3} x · (5/13)/(1 + x²) = 0 by symmetry.
VAR[X] = E[X²] − 0 = Σ_{x=−3}^{3} x² · (5/13)/(1 + x²) = 22/13 = 1.6923
c) PY(y) = 1/13 for y = 18; 2/13 for y = 8; 5/13 for y ∈ {0, 2}; 0 otherwise.
d) E[Y] = 18/13 + 16/13 + 10/13 = 44/13 = 3.3846
E[Y²] = 18²/13 + 8² · 2/13 + 2² · 5/13 = 472/13
VAR[Y] = E[Y²] − (E[Y])² = 24.852

13. a) PX(x) = C(8, x)(0.03)^x (0.97)^(8−x) for x = 0, 1, . . . , 8; 0 otherwise.
It is a Binomial distribution with n = 8, p = 0.03.
b) E[X] = Σ_{x=0}^{8} x PX(x) = np = 8 × 0.03 = 0.24
VAR[X] = npq = 8 × 0.03 × 0.97 = 0.2328
c) E = {X is even and X ≠ 0} = {undetectable error}
P[E] = PX(2) + PX(4) + PX(6) + PX(8) = 0.02104
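Problem 13 c) says an error pattern goes undetected exactly when an even, nonzero number of bits flip; this check (not part of the original text) sums the binomial PMF over those values.

```python
from math import comb

n, q = 8, 0.03   # byte length, bit-flip probability
pmf = [comb(n, x) * q**x * (1 - q) ** (n - x) for x in range(n + 1)]

# Undetectable error: X = 2, 4, 6 or 8 flipped bits
p_undetected = sum(pmf[x] for x in range(2, n + 1, 2))
assert abs(p_undetected - 0.02104) < 1e-4
```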

14. a) P[B] = Σ_{x=0}^{3} PX(x) = (10 + 5 + 2 + 1)/26 = 9/13
PX|B(x) = 5/9 for x = 0; 5/18 for x = 1; 1/9 for x = 2; 1/18 for x = 3; 0 otherwise.
b) FX|B(x) = 0 for x < 0; 5/9 for 0 ≤ x < 1; 5/6 for 1 ≤ x < 2; 17/18 for 2 ≤ x < 3; 1 for x ≥ 3.
c) E[X|B] = 5/18 + 2/9 + 3/18 = 2/3
E[X²|B] = 5/18 + 4/9 + 9/18 = 11/9
VAR[X|B] = 11/9 − (2/3)² = 7/9

15. a) E[X] = np = 8 × 0.3 = 2.4 (recall E[Binomial(n, p)] = np)
VAR[X] = np(1 − p) = 8 × 0.3 × 0.7 = 1.68 (recall VAR[Binomial(n, p)] = np(1 − p))
σX = √VAR[X] = 1.3
b) P[B] = PX(1) + PX(3) + PX(5) + PX(7) = 0.198 + 0.254 + 0.047 + 0.001 = 0.5, and P[X = k|B] = PX(k)/P[B] for odd k (0 for even k):
P[X = 1|B] = 0.198/0.5 = 0.396, P[X = 3|B] = 0.508, P[X = 5|B] = 0.094, P[X = 7|B] = 0.002,
and P[X = k|B] = 0 for k = 0, 2, 4, 6, 8.
c) E[X|B] = Σ k P[X = k|B] = 1 × 0.396 + 3 × 0.508 + 5 × 0.094 + 7 × 0.002 = 2.4
d) E[X²|B] = Σ k² P[X = k|B] = 1² × 0.396 + 3² × 0.508 + 5² × 0.094 + 7² × 0.002 = 7.42
VAR[X|B] = E[X²|B] − (E[X|B])² = 7.42 − 2.4² = 1.64
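The conditioning in Problem 15 can be reproduced exactly from the Binomial(8, 0.3) PMF; this sketch (not from the original text) builds PX|B by renormalizing over the odd values.

```python
from math import comb

n, p = 8, 0.3
pmf = {x: comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)}

# Condition on B = {X is odd}: zero out even values, renormalize by P[B].
p_B = sum(px for x, px in pmf.items() if x % 2 == 1)
cond = {x: (px / p_B if x % 2 == 1 else 0.0) for x, px in pmf.items()}

mean_B = sum(x * px for x, px in cond.items())
var_B = sum(x * x * px for x, px in cond.items()) - mean_B**2
assert abs(mean_B - 2.4) < 0.01
assert abs(var_B - 1.64) < 0.01
```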

2.13 Drill Problems

Section 2.1,2.2 and 2.3 - PMFs and CDFs


1. The discrete random variable K has the following PMF:
PK(k) = b for k = 0; 2b for k = 1; 3b for k = 2; 0 otherwise.
a) What is the value of b?
b) Determine the values of (i) P[K < 2] (ii) P[K ≤ 2] (iii) P[0 < K < 2].
c) Determine the CDF of K.
Ans: a) 1/6 b) (i) 1/2 (ii) 1 (iii) 1/3
c) FK(k) = 0 for k < 0; 1/6 for 0 ≤ k < 1; 1/2 for 1 ≤ k < 2; 1 for k ≥ 2.

2. The random variable N has PMF
PN(n) = c/2ⁿ for n = 0, 1, 2; 0 otherwise.
a) What is the value of the constant c?
b) What is P[N ≤ 1]?
c) Find P[N ≤ 1|N ≤ 2].
d) Compute the CDF.
Ans: a) 4/7 b) 6/7 c) 6/7
d) FN(n) = 0 for n < 0; 4/7 for 0 ≤ n < 1; 6/7 for 1 ≤ n < 2; 1 for n ≥ 2.

3. The discrete random variable X has PMF,

   PX(x) = c/x for x = 2, 4, 8; 0 otherwise.

a) What is the value of the constant c?
b) What is P[X = 4]?
c) What is P[X < 4]?
d) What is P[3 ≤ X ≤ 9]?
e) Compute the CDF of X.
f) Compute the mean E[X] and the variance VAR[X] of X.

Ans a) 8/7  b) 2/7  c) 4/7  d) 3/7
e) FX(x) = 0 for x < 2; 4/7 for 2 ≤ x < 4; 6/7 for 4 ≤ x < 8; 1 for x ≥ 8
f) E[X] = 24/7, VAR[X] = 208/49
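The normalization step in part a) generalizes to any finite PMF of the form c·g(x); a sketch with exact arithmetic for PX(x) = c/x on {2, 4, 8}:

```python
from fractions import Fraction as F

support = [2, 4, 8]
c = 1 / sum(F(1, x) for x in support)   # solve c*(1/2 + 1/4 + 1/8) = 1
pmf = {x: c / x for x in support}

mean = sum(x * p for x, p in pmf.items())
var = sum(x**2 * p for x, p in pmf.items()) - mean**2
```

Using Fraction keeps the answers in the same exact form as the answer key (8/7, 24/7, 208/49).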

Section 2.4 and 2.5 - Families of Discrete RVs and Averages


4. A student got a summer job at a bank, and his assignment was to model the number of customers who arrive at the bank. The student observed that the number of customers K that arrive over a given hour had the PMF,

   PK(k) = λ^k e^{−λ}/k! for k ∈ {0, 1, ...}; 0 otherwise.

a) Show that PK(k) is a proper PMF. What is the name of this RV?
b) What is P[K > 1]?
c) What is P[2 ≤ K ≤ 4]?
d) Compute E[K] and VAR[K] of K.

Ans a) Poisson(λ)  b) 1 − e^{−λ} − λe^{−λ}  c) e^{−λ}(λ²/2 + λ³/6 + λ⁴/24)  d) E[K] = VAR[K] = λ

5. Let X be the random variable that denotes the number of times we roll a fair die until the first time the number 5 appears.

a) Derive the PMF of X. Identify this random variable.
b) Obtain the CDF of X.
c) Compute the mean E[X] and the variance VAR[X].

Ans
a) PX(x) = 5^{x−1}/6^x for x = 1, 2, ...; 0 otherwise — Geometric(1/6)
b) FX(x) = 1 − (5/6)^x for x ≥ 1; 0 otherwise
c) E[X] = 6, VAR[X] = 30

6. Let X be the random variable that denotes the number of times we roll a fair die until the first time the number 3 or 5 appears.

a) Derive the PMF of X. Identify this random variable.
b) Obtain the CDF of X.
c) Compute the mean E[X] and the variance VAR[X].

Ans
a) PX(x) = 2^{x−1}/3^x for x = 1, 2, ...; 0 otherwise — Geometric(1/3)
b) FX(x) = 1 − (2/3)^x for x ≥ 1; 0 otherwise
c) E[X] = 3, VAR[X] = 6
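Problems 5 and 6 can be sanity-checked by simulation; a sketch for problem 6, where a "success" is rolling a 3 or a 5 (p = 1/3):

```python
import random

random.seed(1)

def rolls_until_3_or_5():
    # count die rolls until the first 3 or 5
    n = 0
    while True:
        n += 1
        if random.randint(1, 6) in (3, 5):
            return n

samples = [rolls_until_3_or_5() for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum(x**2 for x in samples) / len(samples) - mean**2
# Geometric(1/3): E[X] = 1/p = 3, VAR[X] = (1-p)/p^2 = 6
```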

7. A random variable K has the PMF

   PK(k) = C(5, k)(0.1)^k(0.9)^{5−k}, k ∈ {0, 1, 2, 3, 4, 5}.

Obtain the values of: (i) P[K = 1] (ii) P[K ≥ 1] (iii) P[K ≥ 4 | K ≥ 2].

Ans (i) 0.32805 (ii) 0.40951 (iii) 5.647 × 10⁻³

8. The number N of calls arriving at a switchboard during a period of one hour is Poisson with λ = 10. In other words,
   PN(n) = 10^n e^{−10}/n! for n ∈ {0, 1, ...}; 0 otherwise.

a) What is the probability that at least two calls arrive within one hour?
b) What is the probability that at most three calls arrive within one hour?
c) What is the probability that the number of calls that arrive within one hour is greater than three but less than or equal to six?

Ans a) 0.9995  b) 0.0103  c) 0.1198

9. Prove that the function P(x) is a legitimate PMF of a discrete random variable, where P(x) is defined by
   P(x) = (2/3)(1/3)^x for x ∈ {0, 1, ...}; 0 otherwise.

Calculate the mode, expected value and the variance of this random variable.

Ans mode = 0, expected value = 1/2, variance = 3/4
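The switchboard answers in problem 8 follow from partial sums of the Poisson(10) PMF; a short check:

```python
from math import exp, factorial

lam = 10.0

def pmf(n):
    # Poisson(10) PMF
    return lam**n * exp(-lam) / factorial(n)

p_at_least_2 = 1 - pmf(0) - pmf(1)
p_at_most_3 = sum(pmf(n) for n in range(4))
p_4_to_6 = sum(pmf(n) for n in (4, 5, 6))   # P[3 < N <= 6]
```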

10. A recruiter needs to hire 10 chefs. He visits NAIT first and interviews only 10 students; because of high demand he can't get more students to sign up for an interview. He knows that the probability of hiring any given NAIT chef is 0.4. He then goes to SAIT and keeps interviewing until his quota is filled. At SAIT the probability of success on any given interview is 0.8, and plenty of students are looking for jobs. Let X be the number of chefs hired at NAIT, Y the number hired at SAIT, and N the number of interviews required to fill his quota.

a) Find PX(x)  b) Find E[X]  c) Find E[Y]  d) Find E[N]

Ans a) the Binomial(10, 0.4) PMF  b) 4.0  c) 6.0  d) 17.5

Section 2.6 - Function of a RV


11. The discrete random variable X has the following PMF.

   PX(k) = b for k = 0; 2b for k = 1; 3b for k = 2; 0 otherwise

a) What is the value of b?
b) Let Y = X². Determine the PMF of Y. Determine the CDF of Y.
c) Let Z = sin(πX/2). Determine the PMF of Z. Determine the CDF of Z.

Ans a) 1/6
b) PY(k) = 1/6 for k = 0; 1/3 for k = 1; 1/2 for k = 4; 0 otherwise
   FY(k) = 0 for k < 0; 1/6 for 0 ≤ k < 1; 1/2 for 1 ≤ k < 4; 1 for k ≥ 4
c) PZ(k) = 2/3 for k = 0; 1/3 for k = 1; 0 otherwise
   FZ(k) = 0 for k < 0; 2/3 for 0 ≤ k < 1; 1 for k ≥ 1

Section 2.7 and 2.8 - Expected value and Standard deviation of a function of RVs

12. Consider discrete random variable K defined in Problem 1.

a) Compute the mean E[K] and the variance VAR[K].
b) Suppose another discrete random variable N is defined as N = K − 1. Compute its PMF and CDF. What are E[N] and VAR[N]? Compute E[N³] and E[N⁴].
c) Suppose N is redefined as N = (K − 1)². Repeat the computations of part b).

Ans a) E[K] = 4/3, VAR[K] = 5/9
b) PN(n) = 1/6 for n = −1; 1/3 for n = 0; 1/2 for n = 1; 0 otherwise
   FN(n) = 0 for n < −1; 1/6 for −1 ≤ n < 0; 1/2 for 0 ≤ n < 1; 1 for n ≥ 1
   E[N] = 1/3, VAR[N] = 5/9, E[N³] = 1/3, E[N⁴] = 2/3
c) PN(n) = 1/3 for n = 0; 2/3 for n = 1; 0 otherwise
   FN(n) = 0 for n < 0; 1/3 for 0 ≤ n < 1; 1 for n ≥ 1
   E[N] = 2/3, VAR[N] = 2/9, E[N³] = 2/3, E[N⁴] = 2/3

13. Consider discrete random variable N defined in Problem 2.

a) Compute the mean E[N] and the variance VAR[N].


b) Suppose another discrete random variable K is defined as K = N² + 3N. Compute E[K].
c) Suppose M = K − N. Find E[M].

Ans a) E[N] = 4/7, VAR[N] = 26/49  b) E[K] = 18/7  c) E[M] = 2
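Part b) uses E[g(N)] = Σ g(n)PN(n) directly, with Problem 2's PMF PN(n) = (4/7)/2^n; a sketch in exact arithmetic:

```python
from fractions import Fraction as F

pmf = {n: F(4, 7) / 2**n for n in (0, 1, 2)}   # PN(n) from Problem 2

e_n = sum(n * p for n, p in pmf.items())                 # E[N]
e_k = sum((n**2 + 3 * n) * p for n, p in pmf.items())    # E[K], K = N^2 + 3N
e_m = e_k - e_n                                          # E[M] = E[K] - E[N] by linearity
```

Note that no PMF for K or M is ever needed: the expectation of a function of N is taken against PN directly.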

Section 2.9 - Conditional PMFs


14. The discrete random variable K has the following PMF.

   PK(k) = b for k = 0; 2b for k = 1; 3b for k = 2; 0 otherwise

a) What is the value of b?
b) Let B = {K < 2}. Determine the value of P[B].
c) Determine the conditional PMF PK|B(k).
d) Determine the conditional mean and variance of K given B.

Ans a) 1/6  b) 1/2
c) PK|B(k) = 1/3 for k = 0; 2/3 for k = 1; 0 otherwise
d) E[K|B] = 2/3, VAR[K|B] = 2/9


15. Let X be a Geometric(0.5) RV.

a) Find E[X | X > 3]
b) Find VAR[X | X > 3]

Ans a) 5  b) 2 (by the memoryless property, X − 3 given X > 3 is again Geometric(0.5), so the conditional variance equals VAR[X] = (1 − p)/p² = 2)

16. An exam has five problems in it, each worth 20 points. Let N be the number of problems a student answers correctly (no partial credit). The PMF of N is PN(0) = 0.05, PN(1) = 0.10, PN(2) = 0.35, PN(3) = 0.25, PN(4) = 0.15, PN(5) = 0.10, and zero otherwise.

a) Express the total mark G as a function of N.
b) Find the PMF of G.
c) What is the expected value of G, given that the student answered at least one question correctly?
d) What is the variance of G, given that the student answered at least one question correctly?
e) What is the probability that the student's mark is greater than the mean plus or minus half the standard deviation, all with the condition that the student answered at least one question correctly?

Ans a) G = 20N  b) PG(20x) = PN(x) for x = 0, 1, 2, 3, 4, 5  c) 55.8  d) 530
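Parts c) and d) condition on the event {N ≥ 1}; a numeric sketch of that computation:

```python
pmf_n = {0: 0.05, 1: 0.10, 2: 0.35, 3: 0.25, 4: 0.15, 5: 0.10}

p_b = sum(p for n, p in pmf_n.items() if n >= 1)               # P[N >= 1]
cond = {20 * n: p / p_b for n, p in pmf_n.items() if n >= 1}   # PMF of G = 20N given B

mean_g = sum(g * p for g, p in cond.items())                   # E[G | B]
var_g = sum(g**2 * p for g, p in cond.items()) - mean_g**2     # VAR[G | B]
```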

17. You rent a car from the Fly-by-night car rental company. Let M represent the distance in miles beyond 100 miles that you will be able to drive before the car breaks down. If the car has a good engine, denoted as event G, then M is Geometric(0.03). Otherwise it is Geometric(0.1). Assume further that P[G] = 0.6.

a) What is the PMF of M, given that the engine is bad? What are E[M] and VAR[M] in this case?
b) What is the PMF of M generally?
c) What is the probability of the successful completion of a trip of 120 miles without engine failure?
d) What is your expected distance to travel before engine failure?

Ans a) PM(m) = 0.1(0.9)^{m−1}, E[M] = 10, VAR[M] = 90
b) PM(m) = 0.018(0.97)^{m−1} + 0.04(0.9)^{m−1}
c) 0.3904  d) 124 miles


Section 2.10 - Basics of Information Theory


18. A source outputs independent symbols A, B and C with probabilities 16/20, 3/20 and 1/20 respectively; 100 such symbols are output per second. Consider a noiseless binary channel with a capacity of 100 bits per second. Design a Huffman code and find the probabilities of the binary digits produced. Find the efficiency of the code.

19. Construct a Huffman code for five symbols with probabilities 1/2, 1/4, 1/8, 1/16, 1/16. Show that the average length is equal to the source information.

Ans 1.875

20. The types and numbers of vehicles passing a point in a road are to be recorded. A binary code is to be assigned to each type of vehicle and the appropriate code recorded on the passage of that type. The average numbers of vehicles per hour are as follows:

   Cars: 500  Vans: 100  Lorries: 200  Motorcycles: 50  Cycles: 50  Mopeds: 50  Buses: 25  Others: 25

Design a Huffman code. Find its efficiency and compare it with that of a simple equal-length binary code. Comment on the feasibility and usefulness of this system.
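A Huffman code for problem 19 can be built with a small heap-based sketch; for the dyadic probabilities 1/2, 1/4, 1/8, 1/16, 1/16 the average codeword length should equal the source entropy, 1.875 bits:

```python
import heapq

def huffman_lengths(probs):
    """Return the codeword length of each symbol under Huffman's algorithm."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, ids1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:            # each merge adds one bit to every member
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

probs = [1/2, 1/4, 1/8, 1/16, 1/16]
lengths = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
```

The same function applies to problems 18 and 20 after normalizing the symbol counts into probabilities.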

Chapter 3 Continuous Random Variables

3.1 Cumulative Distribution Function

Definition 3.1: The cumulative distribution function (CDF) of random variable (RV) X is FX(x) = P[X ≤ x]. Note: if there is no confusion, the subscript can be dropped - F(x).

Theorem 3.1: For any RV X, the CDF satisfies the following:
1. FX(−∞) = 0
2. FX(∞) = 1
3. if a < b, FX(a) ≤ FX(b)
4. P[a < X ≤ b] = FX(b) − FX(a)

Definition 3.2: X is a continuous RV if the CDF is a continuous function. Then P[X = x] = 0 for any x.


3.2 Probability Density Function


The probability density function (PDF) of a continuous RV fX (x) =

Denition 3.3: X is

d FX (x) dx Note: if there is no confusion, the subscript can be dropped - f (x).

Theorem 3.2: For a continuous RV X with PDF fX(x):
1. fX(x) ≥ 0 for all x
2. ∫_{−∞}^{∞} fX(x) dx = 1
3. FX(x) = ∫_{−∞}^{x} fX(t) dt
4. P[a < X ≤ b] = ∫_{a}^{b} fX(x) dx

The range of X is defined as SX = {x | fX(x) > 0}.

3.3 Expected Values

Expected values of X and g(X) are important.

Definition 3.4: The expected value of a random variable is defined as

   μX = E[X] = ∫_{−∞}^{∞} x fX(x) dx.

Theorem 3.3: The expected value of Y = g(X) is

   E[Y] = E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx.

You can use this theorem to calculate the moments and the variance.

Definition 3.5: If Y = g(X) = (X − μX)², then E[Y] measures the spread of X around μX:

   VAR[X] = σX² = E[(X − μX)²] = ∫_{−∞}^{∞} (x − E[X])² fX(x) dx.

Theorem 3.4: The variance can alternatively be calculated as

   VAR[X] = E[X²] − E²[X] = ∫_{−∞}^{∞} x² fX(x) dx − μX².

Theorem 3.5: For any two constants a and b, VAR[aX + b] = a² VAR[X].

3.4 Families of Continuous Random Variables

There are several important practical RVs.

Definition 3.6: X is a Uniform(a, b) RV, where b > a, if the PDF of X is

   fX(x) = 1/(b − a) for a ≤ x < b; 0 otherwise.

Theorem 3.6: If X is a Uniform(a, b) RV, then

1. The CDF is FX(x) = 0 for x ≤ a; (x − a)/(b − a) for a < x ≤ b; 1 for x > b.
2. E[X] = (a + b)/2.
3. VAR[X] = (b − a)²/12.

Definition 3.7: X is an Exponential(λ) RV, where λ > 0, if the PDF of X is

   fX(x) = λe^{−λx} for x ≥ 0; 0 otherwise.


Theorem 3.7: If X is an Exponential(λ) RV, then

1. FX(x) = 1 − e^{−λx} for x ≥ 0; 0 otherwise.
2. E[X] = 1/λ and VAR[X] = 1/λ².

Definition 3.8: X is an Erlang(n, λ) RV if its PDF is given by

   fX(x) = λ^n x^{n−1} e^{−λx}/(n − 1)! for x ≥ 0; 0 for x < 0.

The Erlang(n, λ) RV can be viewed as the sum of n independent Exponential(λ) RVs.

Theorem 3.8: If X is an Erlang(n, λ) RV,

1. FX(x) = 1 − e^{−λx} Σ_{j=0}^{n−1} (λx)^j/j! for x ≥ 0; 0 otherwise.
2. E[X] = n/λ and VAR[X] = n/λ².

3.5 Gaussian Random Variables

Definition 3.9: X is a Gaussian(μ, σ²) or N(μ, σ²) random variable if the PDF of X is

   fX(x) = (1/√(2πσ²)) e^{−(x−μ)²/(2σ²)}

where the parameter μ can be any real number and σ > 0.

Theorem 3.9: If X is a N(μ, σ²) RV, E[X] = μ and VAR[X] = σ².

Definition 3.10: The standard normal random variable Z is the N(0, 1) RV. The CDF is

   FZ(z) = P[Z ≤ z] = (1/√(2π)) ∫_{−∞}^{z} e^{−u²/2} du = Φ(z).

Theorem 3.10: If X is a N(μ, σ²) RV, then Y = aX + b is N(aμ + b, a²σ²).

Theorem 3.11: If X is a N(μ, σ²) RV, the CDF of X is

   FX(x) = Φ((x − μ)/σ).

The probability that X is in an interval is

   P[a < X ≤ b] = Φ((b − μ)/σ) − Φ((a − μ)/σ).

The tabled values of Φ(z) are used to find FX(x). Note that Φ(−z) = 1 − Φ(z).
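When a table is not at hand, Φ(z) can be evaluated from the error function; a sketch (the height figures echo illustrated problem 7, with X ~ N(175, 100)):

```python
from math import erf, sqrt

def phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 175.0, 10.0
p_at_least_165 = 1 - phi((165 - mu) / sigma)                 # = Phi(1)
p_outside = 1 - (phi((195 - mu) / sigma) - phi((160 - mu) / sigma))
```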

3.6 Functions of Random Variables

General Idea: If Y = g(X), then find the PDF of Y from the PDF of X.

The CDF method involves two steps:
1. The CDF of Y is obtained by using FY(y) = P[Y ≤ y] = P[g(X) ≤ y].
2. The PDF of Y is given by fY(y) = dFY(y)/dy.

The PDF method is as follows:
1. The PDF of Y is obtained by using

   fY(y) = Σ fX(x)/|g'(x)|  evaluated at x = g^{−1}(y),

   where g'(x) is the derivative of g(x).
2. If x = g^{−1}(y) is multi-valued, then the sum is over all solutions of y = g(x).

Theorem 3.12: If Y = aX + b, then the PDF of Y is

   fY(y) = (1/|a|) fX((y − b)/a).

Theorem 3.13: Let X be Uniform(0, 1) and let F(x) denote a CDF with an inverse F^{−1}(u) defined for 0 < u < 1. The RV Y = F^{−1}(X) has CDF F(y).

3.7 Conditioning a Continuous RV

Definition 3.11: For X with PDF fX(x) and event B ⊂ SX with P[B] > 0, the conditional PDF of X given B is

   fX|B(x) = fX(x)/P[B] for x ∈ B; 0 otherwise.

Theorem 3.14: The conditional expected value of X is

   μX|B = E[X|B] = ∫_{−∞}^{∞} x fX|B(x) dx.

The conditional expected value of g(X) is

   E[g(X)|B] = ∫_{−∞}^{∞} g(x) fX|B(x) dx.

The conditional variance is

   VAR[X|B] = E[(X − μX|B)²|B] = E[X²|B] − μX|B².

We use this theorem to calculate the conditional moments and the variance.

3.8 Illustrated Problems

1. X is a continuous random variable. Test if each of the following functions could be a valid CDF. Provide brief details on how you tested each.

a) F(x) = 0 for x < −1; x² for |x| ≤ 1; 1 for 1 < x
b) F(x) = 0 for x < 0; x/2 for 0 ≤ x ≤ 1; 1 for 1 < x
c) F(x) = 0 for x < 0; sin(x) for 0 ≤ x ≤ π/2; 1 for π/2 < x
d) F(x) = 0 for x ≤ −4; 1 − exp(−a(x + 4)) for −4 < x

For the valid CDFs, derive their PDFs.

2. The CDF of a random variable Y is
   FY(y) = 0 for y < −2; (y + 2)/3 for −2 ≤ y ≤ 1; 1 for y > 1

a) Find the PDF of this random variable.
b) Find P[Y < 0].
c) Find the expected value of Y.
d) Find the variance of Y.
e) Find the median of Y (i.e., find a such that P[Y < a] = P[Y > a]).

3. Consider the following PDF.

   fX(x) = 3 − 4x for 0 < x < a; 0 otherwise

a) Find a to have a valid PDF.
b) Find E[X³].

4. The random variable X has PDF

   fX(x) = 1/4 for 0 < x ≤ 2; cx + 1 for 2 < x ≤ 4; 0 otherwise

a) Find c to have a valid PDF.
b) Find the CDF of X and plot it.
c) Find P[1 < X < 3] (it is good practice to find this value from both the PDF and CDF to compare both methods).

5. The waiting time at a bank teller is modeled as an exponential random variable and the average waiting time is measured to be 2 minutes (in other words E[T] = 2).

a) What is the value of λ?


b) What is the probability of waiting more than 5 minutes?
c) What is the probability of waiting more than 8 minutes (in total) given that you have already waited 5 minutes?
d) Find a such that in 90% of cases the waiting time is less than a. (Hint: in other words, find a such that P[T < a] = 0.9.)
e) Given that the waiting time is less than 5 minutes, what is the probability that it is also less than 3 minutes?

6. Jim has started an internet file sharing website with 3 file servers A, B and C. The file requests are handled by these 3 servers in the order A, B, C. Thus server A will service requests number 1, 4, 7, ...; server B will service requests number 2, 5, 8, ...; and server C will service requests 3, 6, 9, .... Servicing each request takes 10 ms. The time interval T between requests that are handled by any given server is an Erlang(3, λ) random variable, where 1/λ = 2 ms.

a) Write down and expand the CDF of this Erlang random variable.
b) Find the probability of T > 10 ms (meaning that a server is not busy when it receives its next request).
c) Find the expected value and the variance of T.
d) For maintenance, server C is taken out of service and the traffic is handled by servers A and B. Find the probability of T > 10 ms in this case. [Hint: since only two servers work, T is an Erlang(2, λ) RV with λ as before.]

7. A study shows that the heights of Canadian men are independent Gaussian random variables with mean 175 cm and standard deviation 10 cm. Use the table on page 123 of the textbook to answer the following questions:

a) What percentage of men are at least 165 cm?
b) What is the probability that a randomly chosen man is shorter than 160 cm or taller than 195 cm?
c) A similar study shows that the heights of Canadian women are also independent Gaussian random variables with mean 165 cm. It also shows that 91% of the women are shorter than 177 cm. What is the variance of the heights of Canadian women?
d) What percentage of women are at least 165 cm? Compare with part a).
e) [Just read this part] A random man and a random woman are selected. What is the probability that the woman is taller than the man? [Think about this question, and try to realize why we cannot solve it yet. We learn how to approach such questions in Chapter 4.]


8. The life time in months, X, of light bulbs produced by two manufacturing plants A and B are exponential with λ = 1/4 and λ = 1/2, respectively. Plant B produces 3 times as many bulbs as plant A. The bulbs are mixed together and sold.

a) What is the probability that a light bulb purchased at random will last at least five months?
b) Given that a light bulb has lasted more than five months, what is the probability that it was manufactured by plant A?

9. The input voltage X to an analog to digital converter (A/D) is a Gaussian(6, 16) random variable. The input/output relation of the A/D is given by

   Y = 0 for X ≤ 0; 1 for 0 < X ≤ 4; 2 for 4 < X ≤ 8; 3 for 8 < X ≤ 12; 4 for X > 12.

Find and plot the PMF of Y.

10. A study shows that the height of a randomly selected Canadian man is a Gaussian random variable with mean 175 cm and standard deviation 10 cm. A random Canadian man is selected. Given that his height is at least 165 cm, answer the following:

a) What is the probability that his height is at least 175 cm?
b) What is the probability that his height is at most 185 cm?

11. A Laplace(λ) random variable is one with the following PDF:

   fX(x) = (λ/2) e^{−λ|x|}, −∞ < x < ∞

a) For λ = 1, find the conditional PDF of X given that |X| > 2.
b) For λ = 1, find E[X | |X| > 2].

12. X is an Exponential(0.1) random variable. Let Z = X².

a) Find the PDF of Y = 3X + 1.
b) Find the PDF of Z.
c) Find E[Z] and VAR[Z].


13. Let X be a uniform RV on [0, 2]. Compute the mean and variance of Y = g(X), where

   g(x) = 0 for x < 0; 2x for 0 ≤ x < 1/2; 2 − 2x for 1/2 ≤ x < 1; 0 for x ≥ 1.

Repeat the above if X is exponential with a mean of 0.5.

14. The random variable W has the PDF

   fW(w) = 1 − w for 0 < w < 1; w − 1 for 1 ≤ w < 2; 0 otherwise

Let A = {W > 1} and B = {0.5 < W < 1.5}.

a) Derive fW|A(w) and sketch it.
b) Find FW|A(w) and sketch it.
c) Derive fW|B(w) and sketch it.
d) Find FW|B(w) and sketch it.

3.9 Solutions for the Illustrated Problems

1.
a) Not valid (not monotonically increasing on [−1, 0]).
b) Not valid (not continuous at x = 1; the CDF of a continuous RV must be continuous).
c) A valid CDF, with f(x) = cos(x) for 0 ≤ x ≤ π/2; 0 otherwise.
d) A valid CDF, with f(x) = a exp(−a(x + 4)) for −4 < x; 0 otherwise.

2.
a) fY(y) = dFY(y)/dy = 1/3 for −2 ≤ y ≤ 1; 0 otherwise
b) P[Y < 0] = FY(0) = 2/3 = 0.67
c) E[Y] = ∫ y fY(y) dy = ∫_{−2}^{1} (y/3) dy = −0.5
d) E[Y²] = ∫ y² fY(y) dy = ∫_{−2}^{1} (y²/3) dy = (8 + 1)/9 = 1
   VAR[Y] = E[Y²] − E²[Y] = 0.75
e) FY(a) = 1 − FY(a) ⟹ FY(a) = 0.5 ⟹ a = −0.5

3.
a) Solving ∫₀^a (3 − 4x) dx = 1 ⟹ 3a − 2a² = 1, we get a ∈ {0.5, 1}. The PDF fX(x) is non-negative on (0, a) only if 3 − 4a ≥ 0, which rules out a = 1. Therefore a = 0.5.
b) E[X³] = ∫₀^a x³(3 − 4x) dx = 0.75a⁴ − 0.8a⁵ = 0.0219

4.
a) ∫_{−∞}^{∞} fX(x) dx = 1 ⟹ ∫₂⁴ (cx + 1) dx = 0.5 ⟹ 6c + 2 = 0.5 ⟹ c = −0.25
b) FX(x) = ∫_{−∞}^{x} fX(t) dt = 0 for x ≤ 0; x/4 for 0 < x ≤ 2; −x²/8 + x − 1 for 2 < x ≤ 4; 1 for x > 4
c) P[1 < X < 3] = P[1 < X ≤ 3] = FX(3) − FX(1) = 0.875 − 0.25 = 0.625

5. The PDF of an Exponential(λ) random variable X is given by fX(x) = λe^{−λx}, x ≥ 0, λ > 0.
a) Waiting time T is given to be an Exponential(λ) random variable whose mean E[T] = 1/λ is 2. Thus we get λ = 0.5.
b) Therefore the PDF of T is fT(t) = 0.5e^{−0.5t}, t ≥ 0.
   P[T > 5] = ∫₅^∞ 0.5e^{−0.5t} dt = e^{−2.5} = 0.0821


c) We know that the exponential distribution is memoryless. Therefore

   P[T > 8 | T > 5] = P[T > 3] = ∫₃^∞ 0.5e^{−0.5t} dt = e^{−1.5} = 0.2231

   Note: you can derive the same result using conditional probability.
d) P[T < a] = 1 − e^{−0.5a} = 0.9 ⟹ a = −2 ln(0.1) = 2 × 2.3026 = 4.6052 min
e) P[T < 3 | T < 5] = P[T < 5 | T < 3]P[T < 3]/P[T < 5] = (1 − 0.2231)/(1 − 0.0821) = 0.8464
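The memoryless step in part c) can be confirmed numerically; a sketch comparing P[T > 8 | T > 5] with P[T > 3] for λ = 0.5:

```python
from math import exp

lam = 0.5

def surv(t):
    # P[T > t] for an Exponential(lam) RV
    return exp(-lam * t)

p_cond = surv(8) / surv(5)   # P[T > 8 | T > 5] = P[T > 8] / P[T > 5]
p_fresh = surv(3)            # P[T > 3]
```

The two quantities coincide exactly, which is the memoryless property e^{−λ(s+t)}/e^{−λs} = e^{−λt}.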

6. The CDF of an Erlang(k, λ) random variable X is given by

   FX(x) = 1 − Σ_{n=0}^{k−1} e^{−λx} (λx)^n/n!

a) Therefore, the CDF of the time interval T ~ Erlang(3, λ), with λ = 0.5 per ms, is

   FT(t) = 1 − Σ_{n=0}^{2} e^{−0.5t}(0.5t)^n/n! = 1 − e^{−0.5t}(1 + t/2 + t²/8)

   Note: all time values are measured in milliseconds.
b) P[T > 10] = 1 − FT(10) = e^{−5}(1 + 5 + 12.5) = 0.1247
c) E[T] = k/λ = 3/0.5 = 6 ms, VAR[T] = k/λ² = 12 ms²
d) The server outage makes T an Erlang(2, λ) RV, so FT(t) = 1 − Σ_{n=0}^{1} e^{−0.5t}(0.5t)^n/n!. Thus

   P[T > 10] = 1 − FT(10) = Σ_{n=0}^{1} e^{−5} 5^n/n! = 0.0404
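The Erlang tail probabilities in parts b) and d) come from the partial exponential series; a short check:

```python
from math import exp, factorial

def erlang_sf(t, k, lam):
    # P[T > t] for T ~ Erlang(k, lam): first k terms of the Poisson series
    return sum(exp(-lam * t) * (lam * t)**n / factorial(n) for n in range(k))

lam = 0.5                     # per millisecond
p3 = erlang_sf(10, 3, lam)    # three servers: Erlang(3, 0.5)
p2 = erlang_sf(10, 2, lam)    # two servers: Erlang(2, 0.5)
```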

7. Let X be a random variable denoting the height of a Canadian man (in cm). Then X ~ N(175, 100).
a) P[X > 165] = P[(X − 175)/10 > −1] = 1 − Φ(−1) = Φ(1) = 0.84134 = 84.134%
b) P[{X < 160} ∪ {X > 195}] = 1 − P[160 < X < 195] = 1 − P[−1.5 < (X − 175)/10 < 2]
   = 1 − (Φ(2) − Φ(−1.5)) = 1 − (0.97725 − 0.06681) = 0.08956
c) Let Y be a random variable denoting the height of a Canadian woman (in cm). Then Y ~ N(165, σ²), whose variance σ² is to be determined. Given that 91% of women are shorter than 177 cm,
   P[Y < 177] = P[(Y − 165)/σ < 12/σ] = Φ(12/σ) = 0.91
   Therefore σ = 12/Φ⁻¹(0.91) = 12/1.34097 = 8.9487, so VAR[Y] = σ² = 80.08.
d) P[Y > 165] = 0.5 = 50%. Therefore, when compared to Canadian men, a Canadian woman is less likely to be taller than 165 cm.

8. The conditional PDFs fX|A(x) = 0.25e^{−0.25x}, x ≥ 0 and fX|B(x) = 0.5e^{−0.5x}, x ≥ 0 of the life time X of a bulb are known, given the plant which manufactured it. We also know that P[B] = 3P[A].
a) Solving P[A] + P[B] = 1 (since any bulb is manufactured in one of the plants A and B) and P[B] = 3P[A], we get P[A] = 0.25, P[B] = 0.75. Applying the law of total probability,
   P[X > 5] = P[X > 5|A]P[A] + P[X > 5|B]P[B] = e^{−5/4}(0.25) + e^{−5/2}(0.75) = 0.1332
b) Using the Bayes rule,
   P[A|X > 5] = P[X > 5|A]P[A]/P[X > 5] = e^{−5/4}(0.25)/0.1332 = 0.0716/0.1332 = 0.5377


9. Y can only take values in {0, 1, 2, 3, 4}. With X ~ N(6, 16), i.e. σ = 4:
   P[Y = 0] = P[X ≤ 0] = Φ((0 − 6)/4) = Φ(−1.5) = 0.067
   P[Y = 1] = P[0 < X ≤ 4] = Φ(−0.5) − Φ(−1.5) = 0.241
   P[Y = 2] = P[4 < X ≤ 8] = Φ(0.5) − Φ(−0.5) = 0.383
   P[Y = 3] = P[8 < X ≤ 12] = Φ(1.5) − Φ(0.5) = 0.241
   P[Y = 4] = P[X > 12] = 1 − Φ(1.5) = 0.067

10.
a) P[X > 175 | X > 165] = P[X > 175]/P[X > 165] = 0.5/Φ(1) = 0.5/0.8413 = 0.5943
b) P[X < 185 | X > 165] = P[165 < X < 185]/P[X > 165] = (2Φ(1) − 1)/Φ(1) = 0.6826/0.8413 = 0.8114

11.
a) FX(x | |X| > 2) = P[|X| > 2, X ≤ x]/P[|X| > 2], which equals
   F(x)/P[|X| > 2] for x < −2; F(−2)/P[|X| > 2] for −2 ≤ x ≤ 2; (F(x) − P[|X| < 2])/P[|X| > 2] for x > 2.
   With λ = 1, P[|X| > 2] = e^{−2}. Differentiating, the conditional PDF is
   fX(x | |X| > 2) = e^{2−|x|}/2 for |x| > 2; 0 for −2 ≤ x ≤ 2.
b) E[X | |X| > 2] = ∫ x fX(x | |X| > 2) dx = 0 (due to the symmetry of the conditional PDF)

12.
a) fX(x) = 0.1e^{−x/10}, x > 0. For Y = 3X + 1,
   fY(y) = (1/3) fX((y − 1)/3) = (1/30) e^{−(y−1)/30} for y > 1; 0 otherwise
b) For Z = X² (so x = √z and |dz/dx| = 2√z),
   fZ(z) = fX(√z)/(2√z) = e^{−√z/10}/(20√z) for z > 0; 0 otherwise
c) E[Z] = E[X²] = (1/10) ∫₀^∞ x² e^{−x/10} dx = 100 Γ(3) = 200
   E[Z²] = E[X⁴] = (1/10) ∫₀^∞ x⁴ e^{−x/10} dx = 10⁴ Γ(5) = 240000
   VAR[Z] = 240000 − 200² = 200000

13. Given X ~ Uniform(0, 2):

   fX(x) = 1/2 on [0, 2], so
   E[Y] = E[g(X)] = ∫ g(x) fX(x) dx = ∫₀^{1/2} 2x(1/2) dx + ∫_{1/2}^{1} (2 − 2x)(1/2) dx = 1/4
   E[Y²] = E[g²(X)] = ∫₀^{1/2} (2x)²(1/2) dx + ∫_{1/2}^{1} (2 − 2x)²(1/2) dx = 1/6
   VAR[Y] = E[Y²] − (E[Y])² = 1/6 − (1/4)² = 0.1042

For the exponential case: since E[X] = 1/λ = 0.5, we find λ = 2 and fX(x) = 2e^{−2x}, x ≥ 0.
   E[Y] = ∫₀^{1/2} 2x · 2e^{−2x} dx + ∫_{1/2}^{1} (2 − 2x) · 2e^{−2x} dx = 0.3996
   E[Y²] = ∫₀^{1/2} (2x)² · 2e^{−2x} dx + ∫_{1/2}^{1} (2 − 2x)² · 2e^{−2x} dx = 0.2578
   VAR[Y] = E[Y²] − (E[Y])² = 0.09815
14. P[A] = P[W > 1] = ∫₁² fW(w) dw = ∫₁² (w − 1) dw = 1/2
    P[B] = P[0.5 < W < 1.5] = ∫_{0.5}^{1} (1 − w) dw + ∫₁^{1.5} (w − 1) dw = 1/4
a) Conditioning on A we get fW|A(w) = 2(w − 1) for 1 ≤ w ≤ 2; 0 otherwise
b) FW|A(w) = ∫ fW|A(t) dt = 0 for w < 1; (w − 1)² for 1 ≤ w < 2; 1 for w ≥ 2
c) Conditioning on B we get fW|B(w) = 4(1 − w) for 0.5 < w < 1; 4(w − 1) for 1 < w < 1.5; 0 otherwise
d) FW|B(w) = ∫ fW|B(t) dt = 0 for w < 0.5; (w − 1/2)(3 − 2w) for 0.5 ≤ w ≤ 1; 0.5 + 2(w − 1)² for 1 < w < 1.5; 1 for w ≥ 1.5
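The conditional PDFs in solution 14 can be checked numerically with a simple Riemann sum; a sketch:

```python
def f_w(w):
    # PDF of W from problem 14
    if 0 < w < 1:
        return 1 - w
    if 1 <= w < 2:
        return w - 1
    return 0.0

dw = 1e-4
grid = [i * dw for i in range(int(2 / dw))]

p_a = sum(f_w(w) * dw for w in grid if w > 1)           # P[A], should be 1/2
p_b = sum(f_w(w) * dw for w in grid if 0.5 < w < 1.5)   # P[B], should be 1/4

def f_cond_b(w):
    # conditional PDF of W given B
    return f_w(w) / p_b if 0.5 < w < 1.5 else 0.0
```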

3.10 Drill Problems

Section 3.1 and Section 3.2 - PDF and CDF


1. Test if the following functions are valid PDFs. Provide brief details on how you tested each.

a) f(x) = 0.75x(2 − x) for 0 ≤ x ≤ 2; 0 otherwise
b) f(x) = 0.5e^{−x} for 0 < x < ∞; 0 otherwise
c) f(x) = 2x − 1 for 0 ≤ x ≤ 0.5(1 + √5); 0 otherwise
d) f(x) = 0.5(x + 1) for |x| ≤ 1; 0 otherwise

For the valid PDFs, derive the corresponding CDF.

Ans
a) Yes; F(x) = 0 for x < 0; x²(3 − x)/4 for 0 ≤ x ≤ 2; 1 for x > 2
b) Not a PDF, since ∫ f(x) dx = 1/2 ≠ 1
c) Not a PDF, since f(x) < 0 for some x
d) Yes; F(x) = 0 for x < −1; (x + 1)²/4 for −1 ≤ x ≤ 1; 1 for x > 1

2. The cumulative distribution function of random variable X is

   FX(x) = 0 for x < −1; (x + 1)/2 for −1 ≤ x < 1; 1 for x ≥ 1.

a) What is P[X > 1/2]?
b) What is P[−1/2 < X ≤ 3/4]?
c) What is P[|X| ≤ 1/2]?
d) What is the value of a such that P[X ≤ a] = 0.8?
e) Find the PDF fX(x) of X.

Ans a) 1/4  b) 5/8  c) 1/2  d) 0.6
e) fX(x) = 1/2 for −1 ≤ x ≤ 1; 0 otherwise

3. The random variable X has probability density function

   fX(x) = cx for 0 ≤ x ≤ 2; 0 otherwise.

Use the PDF to find a) the constant c, b) P[0 ≤ X ≤ 1], c) P[−1/2 ≤ X ≤ 1/2], d) the CDF FX(x).

Ans a) 1/2  b) 1/4  c) 1/16
d) FX(x) = 0 for x < 0; x²/4 for 0 ≤ x ≤ 2; 1 for x > 2

Section 3.3 - Expected Values


4. Continuous random variable X has PDF

   fX(x) = 1/4 for −1 ≤ x ≤ 3; 0 otherwise.

Define the random variable Y by Y = h(X) = X².

a) Find E[X] and VAR[X].
b) Find h(E[X]) and E[h(X)].
c) Find E[Y] and VAR[Y].

Ans a) E[X] = 1, VAR[X] = 4/3
b) h(E[X]) = 1, E[h(X)] = 7/3
c) E[Y] = 7/3, VAR[Y] = 304/45

5. The cumulative distribution function of random variable U is
   FU(u) = 0 for u < −5; (u + 5)/8 for −5 ≤ u < −3; 1/4 for −3 ≤ u < 3; 1/4 + 3(u − 3)/8 for 3 ≤ u < 5; 1 for u ≥ 5.

a) What is E[U]?
b) What is VAR[U]?
c) What is E[2^U]?

Ans a) 2  b) 37/3  c) 13.001

Section 3.4 - Families of RVs


6. You take a bus to the university from your home. The time X for this trip is uniformly distributed between 40 and 55 minutes. Find (a) E[X] and VAR[X], (b) the probability that it takes more than 50 minutes, (c) the probability that it takes less than 45 minutes.

Ans a) E[X] = 47.5 min, VAR[X] = 18.75 min²  b) 1/3  c) 1/3

7. Let X be a uniform RV on [0, 2]. Compute the mean and variance of Y = g(X) where

   g(x) = 0 for x < 0; 2x for 0 ≤ x < 1/2; 2 − 2x for 1/2 ≤ x < 1; 0 for x ≥ 1.

Ans E[Y] = 1/4, VAR[Y] = 5/48

8. The random variable X, which represents the life time in years of a communication satellite, has the following PDF:

   fX(v) = 0.4e^{−0.4v} for v ≥ 0; 0 otherwise

What family of random variables is this? Find the expected value, variance and CDF of X. What is the probability that the satellite will last longer than 3 years? If the satellite is 3 years old, what is the probability that it will last for another 3 years or more?

Ans Exponential; E[X] = 2.5, VAR[X] = 6.25
   FX(v) = 0 for v < 0; 1 − e^{−0.4v} for v ≥ 0
   P[X > 3] = 0.3012; by the memoryless property, P[X > 6 | X > 3] = P[X > 3] = 0.3012

9. The life time in months, X, of light bulbs (identical specifications) produced by two manufacturing plants A and B are exponential with λ = 1/5 and λ = 1/2, respectively. Plant B produces four times as many bulbs as plant A. The bulbs are mixed together and sold. What is the probability that a light bulb purchased at random will last at least (a) two months; (b) five months; (c) seven months.

Ans a) 0.4284  b) 0.1392  c) 0.0735

10. X is an Erlang(3, 0.2) RV. Calculate the value of P[1.3 < X < 4.6].

Ans 0.0638

Section 3.5 - Gaussian RVs


11. A Gaussian random variable, X, has a mean of 10 and a variance of 12.

a) Find the probability that X is less than 13.
b) Find P[−1 < X < 1].
c) If Y = 2X + 3, find the mean and variance of Y.
d) Find P[0 < Y ≤ 80].

Ans a) 0.8068  b) 0.00394  c) 23, 48  d) 0.9995

12. A Gaussian random variable, X, has an unknown mean but a standard deviation of 4.


a) The random variable is positive on 32% of the trials. What is the mean value?
b) This random variable is changed to another Gaussian random variable through the linear transformation Y = X/2 + 1. Find the expected value of Y.
c) Find the variance of Y.
d) Find the mean of the square of Y.

Ans a) −1.871  b) 0.0646  c) 4  d) 4.0042

Section 3.6 - Function of a RV


13. X is a Uniform(0, 4) RV. The RV Y is obtained by Y = (X − 2)².

a) Derive the PDF and CDF of Y.
b) Find E[Y].
c) Find VAR[Y].

Ans a) fY(y) = 1/(4√y) for 0 < y ≤ 4; 0 otherwise
   FY(y) = 0 for y < 0; √y/2 for 0 ≤ y ≤ 4; 1 for y > 4
b) E[Y] = 4/3  c) VAR[Y] = 64/45

14. The RV X is N(0, 1). Let Y =


1 for X ≤ 0, and 2 for X > 0.

a) Find the PMF and CDF of Y.
b) Find E[Y].
c) Find VAR[Y].

Ans a) PY(y) = 0.5 for y ∈ {1, 2}; 0 otherwise
   FY(y) = 0 for y < 1; 0.5 for 1 ≤ y < 2; 1 for y ≥ 2
b) E[Y] = 3/2  c) VAR[Y] = 1/4

Section 3.7 - Conditioning


15. The random variable X is uniformly distributed between 0 and 5. The event B is B = {X > 3.7}. What are fX|B(x), μX|B, and σ²X|B?

Ans fX|B(x) = 1/1.3 for 3.7 ≤ x ≤ 5; 0 otherwise
   μX|B = 4.35, σ²X|B = 0.1408

16. Let X be an exponential random variable with

   FX(x) = 1 − e^{−x/3} for x ≥ 0; 0 for x < 0,

and let B be the event B = {X > 2}. What are fX|B(x), μX|B, and σ²X|B?

Ans fX|B(x) = (1/3)e^{−(x−2)/3} for x > 2; 0 otherwise
   μX|B = 5, σ²X|B = 9 (by the memoryless property, X − 2 given B is again exponential with mean 3)

17. X is Gaussian with a mean of 997 and a standard deviation of 31. What is the probability of B where B = {X > 1000}? And what is the PDF of X conditioned on B?

Ans P[X > 1000] = 0.4615
   fX|B(x) = fX(x)/P[X > 1000] for x > 1000; 0 otherwise

Chapter 4 Pairs of Random Variables


4.1 Joint Probability Mass Function

Definition 4.1: The joint PMF of two discrete RVs X and Y is PX,Y(a, b) = P[X = a, Y = b].

Theorem 4.1: The joint PMF PX,Y(x, y) has the following properties:
1. 0 ≤ PX,Y(x, y) ≤ 1 for all (x, y) ∈ SX,Y.
2. Σ_{(x,y)∈SX,Y} PX,Y(x, y) = 1.
3. For event B, P[B] = Σ_{(x,y)∈B} PX,Y(x, y).

4.2 Marginal PMFs

Theorem 4.2: For discrete RVs X and Y with joint PMF PX,Y(x, y), the marginals are

   PX(x) = Σ_{y∈SY} PX,Y(x, y)  and  PY(y) = Σ_{x∈SX} PX,Y(x, y).
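Theorem 4.2 is a row/column sum over the joint table; a sketch on an illustrative (made-up) joint PMF:

```python
# illustrative joint PMF PX,Y on {0,1} x {0,1}
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

p_x, p_y = {}, {}
for (x, y), p in pmf.items():
    p_x[x] = p_x.get(x, 0.0) + p   # PX(x): sum over y
    p_y[y] = p_y.get(y, 0.0) + p   # PY(y): sum over x
```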


4.3 Joint Probability Density Function

Definition 4.2: The joint PDF of the continuous RVs (X, Y) is defined (indirectly) as

   FX,Y(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} fX,Y(u, v) du dv.

Theorem 4.3: fX,Y(x, y) = ∂²FX,Y(x, y)/∂x∂y.

Theorem 4.4: A joint PDF fX,Y(x, y) satisfies the following two properties:
1. fX,Y(x, y) ≥ 0 for all (x, y).
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(x, y) dx dy = 1.

Theorem 4.5: The probability that the continuous random variables (X, Y) are in B is

   P[B] = ∫∫_{(x,y)∈B} fX,Y(x, y) dx dy.

4.4 Marginal PDFs

Theorem 4.6: For an (X, Y) pair with joint PDF fX,Y(x, y), the marginal PDFs are

   fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy  and  fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx.

4.5 Functions of Two Random Variables

Theorem 4.7: For discrete RVs X and Y, the derived random variable W = g(X, Y) has PMF

   PW(w) = Σ_{(x,y): g(x,y)=w} PX,Y(x, y),

i.e., a sum of all PX,Y(x, y) over x and y subject to g(x, y) = w. We use this theorem to calculate probabilities of the event {W = w}.

Theorem 4.8: For continuous RVs X and Y, the CDF of W = g(X, Y) is

   FW(w) = P[W ≤ w] = ∫∫_{g(x,y)≤w} fX,Y(x, y) dx dy.

It is useful to draw a picture in the plane to calculate the double integral.

Theorem 4.9: For continuous RVs X and Y, the CDF of W = max(X, Y) is

   FW(w) = FX,Y(w, w) = ∫_{−∞}^{w} ∫_{−∞}^{w} fX,Y(x, y) dx dy.

4.6 Expected Values

Theorem 4.10: For RVs X and Y, the expected value of W = g(X, Y) is

   E[W] = Σ_{x∈SX} Σ_{y∈SY} g(x, y) PX,Y(x, y)   (discrete)
   E[W] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fX,Y(x, y) dx dy   (continuous)

Note: the expected value of W can be computed without its PMF or PDF.

Theorem 4.11: E[g1(X, Y) + ... + gn(X, Y)] = E[g1(X, Y)] + ... + E[gn(X, Y)].

Theorem 4.12: The variance of the sum of two RVs is

   VAR[X + Y] = VAR[X] + VAR[Y] + 2E[(X − μX)(Y − μY)].

Definition 4.3: The covariance of two RVs X and Y is defined as

   Cov[X, Y] = E[(X − μX)(Y − μY)] = E[XY] − μXμY.

Theorem 4.13:
1. VAR[X + Y] = VAR[X] + VAR[Y] + 2Cov[X, Y].
2. If X = Y, Cov[X, Y] = VAR[X] = VAR[Y].

Definition 4.4: If Cov[X, Y] = 0, RVs X and Y are said to be uncorrelated.

Definition 4.5: The correlation coefficient of RVs X and Y is ρX,Y = Cov[X, Y]/(σXσY).

Theorem 4.14: −1 ≤ ρX,Y ≤ 1.
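The covariance definition and the variance-of-a-sum identity of Theorem 4.13 can be checked on any small joint PMF; a sketch with an illustrative (made-up) PX,Y:

```python
# illustrative joint PMF on {0,1} x {0,1} with positively correlated coordinates
pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex = sum(x * p for (x, y), p in pmf.items())
ey = sum(y * p for (x, y), p in pmf.items())
cov = sum(x * y * p for (x, y), p in pmf.items()) - ex * ey   # E[XY] - muX*muY

var_x = sum(x**2 * p for (x, y), p in pmf.items()) - ex**2
var_y = sum(y**2 * p for (x, y), p in pmf.items()) - ey**2
var_sum = sum((x + y)**2 * p for (x, y), p in pmf.items()) - (ex + ey)**2
# Theorem 4.13: var_sum == var_x + var_y + 2*cov
```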


4.7 Conditioning by an Event

Theorem 4.15: For event B, a region in the (X, Y) plane with P[B] > 0,

   PX,Y|B(x, y) = PX,Y(x, y)/P[B] for (x, y) ∈ B; 0 otherwise.

We use this to calculate the conditional joint PMF, conditioned on an event.

Theorem 4.16: For continuous RVs X and Y and event B with P[B] > 0, the conditional joint PDF of X and Y given B is

   fX,Y|B(x, y) = fX,Y(x, y)/P[B] for (x, y) ∈ B; 0 otherwise.

Theorem 4.17: For RVs X and Y and an event B with P[B] > 0, the conditional expected value of W = g(X, Y) given B is

   E[W|B] = Σ_{x∈SX} Σ_{y∈SY} g(x, y) PX,Y|B(x, y)   (discrete)
   E[W|B] = ∫∫ g(x, y) fX,Y|B(x, y) dx dy   (continuous)

4.8 Conditioning by an RV

Definition 4.6: For an event {Y = y} with non-zero probability, the conditional PMF of X is

PX|Y(x|y) = P[X = x | Y = y] = P[X = x, Y = y]/P[Y = y] = PX,Y(x,y)/PY(y).

Theorem 4.18: For discrete RVs X and Y with joint PMF PX,Y(x,y), and x and y such that PX(x) > 0 and PY(y) > 0,

PX,Y(x,y) = PX|Y(x|y) PY(y) = PY|X(y|x) PX(x).

This allows us to derive the joint PMF from a conditional PMF and a marginal PMF.

Definition 4.7: The conditional PDF of X given {Y = y} is

fX|Y(x|y) = fX,Y(x,y)/fY(y), where fY(y) > 0. Similarly, fY|X(y|x) = fX,Y(x,y)/fX(x).


Theorem 4.19: X and Y are discrete RVs. For any y ∈ SY, the conditional expected value of g(X,Y) given Y = y is

E[g(X,Y) | Y = y] = Σ_{x∈SX} g(x,y) PX|Y(x|y).

Definition 4.8: For continuous RVs X and Y, and any y such that fY(y) > 0, the conditional expected value of g(X,Y) given Y = y is

E[g(X,Y) | Y = y] = ∫_{−∞}^{∞} g(x,y) fX|Y(x|y) dx.

To calculate conditional moments, we first need the conditional PMF or PDF.

Theorem 4.20 (Iterated Expectation): E[E[X|Y]] = E[X].
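Iterated expectation (Theorem 4.20) can be illustrated with a small discrete example. The sketch below (added here; the joint PMF is a hypothetical illustration, not one from the notes) averages the conditional means E[X|Y = y] over the distribution of Y and recovers E[X]:

```python
# Sketch of Theorem 4.20 (iterated expectation) for a discrete pair,
# using a small hypothetical joint PMF.

pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Marginal PMF of Y.
py = {}
for (x, y), p in pmf.items():
    py[y] = py.get(y, 0.0) + p

def cond_mean_x(y):
    """E[X | Y = y] = sum_x x * P(x, y) / P(y)."""
    return sum(x * p for (x, yy), p in pmf.items() if yy == y) / py[y]

# E[E[X|Y]]: average the conditional means over the distribution of Y.
iterated = sum(cond_mean_x(y) * p for y, p in py.items())
direct = sum(x * p for (x, _), p in pmf.items())
print(iterated, direct)  # equal, as Theorem 4.20 predicts
```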

4.9 Independent Random Variables

Definition 4.9: RVs X and Y are independent if and only if
Discrete: PX,Y(x,y) = PX(x) PY(y) for all values of x and y.
Continuous: fX,Y(x,y) = fX(x) fY(y) for all values of x and y.

Theorem 4.21: For independent RVs X and Y,
1. E[g(X)h(Y)] = E[g(X)] E[h(Y)]; in particular E[XY] = E[X]E[Y], and thus Cov[X,Y] = 0.
2. VAR[X + Y] = VAR[X] + VAR[Y].
3. E[X | Y = y] = E[X] and E[Y | X = x] = E[Y].


4.10 Bivariate Gaussian Random Variables

Definition 4.10: X and Y are bivariate Gaussian with parameters μ1, σ1, μ2, σ2, ρ if the joint PDF is

fX,Y(x,y) = 1/(2π σ1 σ2 √(1−ρ²)) · exp{ −[ ((x−μ1)/σ1)² − 2ρ(x−μ1)(y−μ2)/(σ1σ2) + ((y−μ2)/σ2)² ] / (2(1−ρ²)) },

where μ1, μ2 can be any real numbers, σ1 > 0, σ2 > 0 and −1 < ρ < 1.

Theorem 4.22: If X and Y are bivariate Gaussian RVs, then X is N(μ1, σ1²) and Y is N(μ2, σ2²). That is,

fX(x) = (1/(σ1√(2π))) e^{−(x−μ1)²/(2σ1²)},    fY(y) = (1/(σ2√(2π))) e^{−(y−μ2)²/(2σ2²)}.

Theorem 4.23: Bivariate Gaussian RVs X and Y have the correlation coefficient ρX,Y = ρ.

Theorem 4.24: Bivariate Gaussian RVs X and Y are uncorrelated if and only if they are independent, i.e., ρ = 0 implies that X and Y are independent.
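A bivariate Gaussian pair can be simulated from two independent standard normals, which gives a quick empirical check of Theorem 4.23. The Python sketch below (added; the parameter values are arbitrary illustrations) builds X = μ1 + σ1·Z1 and Y = μ2 + σ2(ρZ1 + √(1−ρ²)Z2) and verifies that the sample correlation is close to ρ:

```python
import random

# Sampling sketch for Definition 4.10: build a bivariate Gaussian pair from
# two independent standard normals Z1, Z2:
#   X = mu1 + s1*Z1,   Y = mu2 + s2*(rho*Z1 + sqrt(1-rho^2)*Z2).
# The sample correlation should then be close to rho (Theorem 4.23).

random.seed(1)
mu1, s1, mu2, s2, rho = 1.0, 2.0, -1.0, 0.5, 0.6  # illustrative parameters
n = 20000
xs, ys = [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(mu1 + s1 * z1)
    ys.append(mu2 + s2 * (rho * z1 + (1 - rho**2) ** 0.5 * z2))

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n
print(cov / (vx * vy) ** 0.5)  # close to rho = 0.6
```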

4.11 Illustrated Problems

1. The joint PMF of two random variables is given in Table 4.1.
a) Find P[X < Y]. b) Find E[Y]. c) Find E[X | Y = 2].

Table 4.1: for Question 1 (entries are PX,Y(x,y))
        Y=1    Y=2    Y=3    Y=4
X=0    0.03   0.05   0.02   0.05
X=1    0.10   0.07   0.20   0.23
X=2    0.02   0.02   0.02   0.04
X=3    0.02   0.05   0.05   0.03

2. The joint PDF of X and Y is given as
fX,Y(x,y) = c(3x² + 2y) for 0 < x < 1, 0 < y < 1; 0 otherwise.
a) Find the value of c to ensure a valid PDF.
b) Find the PDF of X.

3. The joint PDF of X and Y is
fX,Y(x,y) = cx²y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2; 0 otherwise.
a) Find the value of the constant c.
b) Find P[Y < 1].
c) Find P[Y < X].
d) Find P[Y > X²].
e) Find the PDF of V = min{X, Y}.
f) Find the PDF of U = X/Y.

4. The joint PDF of X and Y is
fX,Y(x,y) = 2.5x² for −1 ≤ x ≤ 1, 0 ≤ y ≤ x²; 0 otherwise.
a) Find the marginal PDF of X.
b) Find the marginal PDF of Y.
c) Find Var[X] and Var[Y].
d) Find Cov[X,Y] and ρX,Y (the correlation coefficient of X and Y).

5. Let X and Y be jointly distributed with
fX,Y(x,y) = k e^{−3x−2y} for x, y ≥ 0; 0 otherwise.
a) Find k and the PDFs of X and Y.
b) Find the means and variances of X and Y.
c) Are X and Y independent? Find ρX,Y.

6. X and Y are jointly distributed with
fX,Y(x,y) = k e^{−3x−2y} for 0 ≤ y ≤ x; 0 otherwise.
a) Find k and the marginal PDFs and CDFs of X and Y.
b) Find the means and variances of X and Y.
c) Are X and Y independent? Find ρX,Y.

7. Let Z = X + Y, where X and Y are jointly distributed with
fX,Y(x,y) = c for x ≥ 0, y ≥ 0, x + y ≤ 1; 0 otherwise.
a) Find the PDF and CDF of Z.
b) Find the expected value and variance of Z.

8. Two random variables X and Y are jointly distributed with
fX,Y(x,y) = (x + y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2; 0 otherwise.
Let event A be defined as A = {Y ≤ 0.5}.
a) Find P[A].
b) Find the conditional PDF fX,Y|A(x,y).
c) Find the conditional PDF fX|A(x).
d) Find the conditional PDF fY|A(y).

9. In Question 4, define B = {X > 0}.
a) Find fX,Y|B(x,y).
b) Find VAR[X|B].
c) Find E[XY|B].
d) Find fY|X(y|x).
e) Find E[Y | X = x].
f) Find E[E[Y|X]].

10. Let X and Y be jointly distributed with
fX,Y(x,y) = 2 for 0 ≤ y ≤ x ≤ 1; 0 otherwise.
a) Find the PDF fY(y).
b) Find the conditional PDF fX|Y(x|y).
c) Find the conditional expected value E[X | Y = y].

11. Let X and Y be jointly distributed with
fX,Y(x,y) = (4x + 2y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.
a) Find the PDFs fY(y) and fX(x).
b) Find the conditional PDF fX|Y(x|y).
c) Find the conditional PDF fY|X(y|x).

12. The joint PDF of two RVs is given by
fX,Y(x,y) = |xy|/8 for x² + y² ≤ 1; 0 otherwise.
Determine if X and Y are independent.

13. X and Y are independent, with X Gaussian(5, 15) and Y Uniform(4, 6).
a) Find E[XY].
b) Find E[X²Y²].

4.12 Solutions for the Illustrated Problems

1. a) P[X < Y] = 0.03 + 0.05 + 0.02 + 0.05 + 0.07 + 0.20 + 0.23 + 0.02 + 0.04 + 0.03 = 0.74
(the entries of Table 4.1 with x < y).
b) The marginal PMF of Y is PY(1) = 0.17, PY(2) = 0.19, PY(3) = 0.29, PY(4) = 0.35, so
E[Y] = 1(0.17) + 2(0.19) + 3(0.29) + 4(0.35) = 2.82.
c) Using the Y = 2 column of Table 4.1,
E[X | Y = 2] = [0(0.05) + 1(0.07) + 2(0.02) + 3(0.05)]/PY(2) = 0.26/0.19 = 1.3684.

2. a) 1 = ∫∫ fX,Y(x,y) dx dy = c ∫₀¹∫₀¹ (3x² + 2y) dx dy = 2c, so c = 1/2.
b) fX(x) = ∫ fX,Y(x,y) dy = (1/2)∫₀¹ (3x² + 2y) dy = (1 + 3x²)/2 for 0 < x < 1; 0 otherwise.

3. a) ∫∫ fX,Y(x,y) dx dy = c ∫₀¹ x² dx ∫₀² y dy = 2c/3 = 1, so c = 3/2.
b) P[Y < 1] = ∫₀¹ 1.5x² dx ∫₀¹ y dy = (1/2)(1/2) = 1/4.
c) P[Y < X] = ∫₀¹ ∫₀^x 1.5x²y dy dx = ∫₀¹ 1.5x²(x²/2) dx = (3/4)∫₀¹ x⁴ dx = 3/20.
d) P[Y > X²] = 1 − P[Y ≤ X²] = 1 − ∫₀¹ ∫₀^{x²} 1.5x²y dy dx = 1 − 3/28 = 25/28.
e) Case v < 0: both X and Y are always greater than v, so FV(v) = 0.
Case v > 1: X is always smaller than v, so FV(v) = 1.
Otherwise:
FV(v) = P[V ≤ v] = P[min(X,Y) ≤ v] = 1 − P[X > v, Y > v]
      = 1 − ∫_v^1 ∫_v^2 1.5x²y dy dx = 0.25v² + v³ − 0.25v⁵.
Thus we have
FV(v) = 0 for v < 0; 0.25v² + v³ − 0.25v⁵ for 0 ≤ v ≤ 1; 1 for v > 1.
As a result,
fV(v) = 0.5v + 3v² − 1.25v⁴ for 0 < v < 1; 0 otherwise.
f) FU(u) = P[U ≤ u] = P[X/Y ≤ u] = P[X ≤ uY]. For u < 0.5,
FU(u) = ∫_{y=0}^{2} ∫_{x=0}^{uy} 1.5x²y dx dy = 3.2u³;
for u > 0.5,
FU(u) = 1 − ∫_{x=0}^{1} ∫_{y=0}^{x/u} 1.5x²y dy dx = 1 − 3/(20u²).
As a result,
fU(u) = 9.6u² for 0 < u < 0.5; 0.3/u³ for u > 0.5; 0 otherwise.
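The probabilities in parts (b)-(d) can be cross-checked numerically (a sketch added here, not part of the original solution) by integrating the joint PDF 1.5x²y over the relevant regions with a midpoint-rule double sum:

```python
# Numerical cross-check of parts (b)-(d) of Illustrated Problem 3:
# f(x,y) = 1.5*x^2*y on 0 <= x <= 1, 0 <= y <= 2.

def prob(event, n=400):
    """Approximate P[event] by integrating the joint PDF over the region."""
    dx, dy = 1.0 / n, 2.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        for j in range(n):
            y = (j + 0.5) * dy
            if event(x, y):
                total += 1.5 * x * x * y * dx * dy
    return total

print(prob(lambda x, y: y < 1))      # close to 1/4
print(prob(lambda x, y: y < x))      # close to 3/20
print(prob(lambda x, y: y > x * x))  # close to 25/28
```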

4. a) fX(x) = ∫ fX,Y(x,y) dy = ∫₀^{x²} 2.5x² dy = 2.5x⁴ for −1 ≤ x ≤ 1; 0 otherwise.
b) For 0 ≤ y ≤ 1,
fY(y) = ∫ fX,Y(x,y) dx = ∫_{−1}^{−√y} 2.5x² dx + ∫_{√y}^{1} 2.5x² dx = (5/3)(1 − y^{3/2});
fY(y) = 0 otherwise.
c) E[X] = ∫ x fX(x) dx = 2.5 ∫_{−1}^{1} x⁵ dx = 0
E[X²] = 2.5 ∫_{−1}^{1} x⁶ dx = 5/7
E[Y] = (5/3) ∫₀¹ y(1 − y^{3/2}) dy = 5/14
E[Y²] = (5/3) ∫₀¹ y²(1 − y^{3/2}) dy = 5/27
VAR[X] = E[X²] − E²[X] = 5/7
VAR[Y] = E[Y²] − E²[Y] = 5/27 − (5/14)² = 0.05763
d) E[XY] = ∫∫ xy fX,Y(x,y) dx dy = ∫_{−1}^{1} ∫₀^{x²} xy(2.5x²) dy dx = 2.5 ∫_{−1}^{1} x³(x⁴/2) dx = 0
(odd integrand), so COV[X,Y] = E[XY] − E[X]E[Y] = 0 and ρX,Y = COV[X,Y]/√(VAR[X]VAR[Y]) = 0.

5. a) ∫∫ fX,Y(x,y) dx dy = k ∫₀^∞ ∫₀^∞ e^{−3x−2y} dx dy = k/6 = 1, so k = 6.
fX(x) = ∫₀^∞ 6e^{−3x−2y} dy = 3e^{−3x} for x ≥ 0; 0 otherwise.
fY(y) = ∫₀^∞ 6e^{−3x−2y} dx = 2e^{−2y} for y ≥ 0; 0 otherwise.
b) E[X] = 3∫₀^∞ x e^{−3x} dx = 1/3, E[X²] = 3∫₀^∞ x² e^{−3x} dx = 2/9, VAR[X] = 2/9 − (1/3)² = 1/9.
E[Y] = 2∫₀^∞ y e^{−2y} dy = 1/2, E[Y²] = 2∫₀^∞ y² e^{−2y} dy = 2/4, VAR[Y] = 2/4 − (1/2)² = 1/4.
c) Yes, because fX,Y(x,y) = fX(x)fY(y) for every x and y. ρX,Y = 0 (because X and Y are uncorrelated).

6. a) ∫∫ fX,Y(x,y) dx dy = k ∫₀^∞ ∫₀^x e^{−3x−2y} dy dx = k/15 = 1, so k = 15.
fX(x) = ∫₀^x 15 e^{−3x−2y} dy = 7.5(e^{−3x} − e^{−5x}) for x ≥ 0; 0 otherwise.
fY(y) = ∫_y^∞ 15 e^{−3x−2y} dx = 5e^{−5y} for y ≥ 0; 0 otherwise.
Integrating the marginal PDFs gives the CDFs:
FX(x) = 1 − 2.5e^{−3x} + 1.5e^{−5x} for x ≥ 0; 0 otherwise.
FY(y) = 1 − e^{−5y} for y ≥ 0; 0 otherwise.
b) E[X] = 7.5∫₀^∞ x(e^{−3x} − e^{−5x}) dx = 8/15
E[X²] = 7.5∫₀^∞ x²(e^{−3x} − e^{−5x}) dx = 98/225
VAR[X] = 98/225 − (8/15)² = 34/225
E[Y] = 5∫₀^∞ y e^{−5y} dy = 1/5, E[Y²] = 5∫₀^∞ y² e^{−5y} dy = 2/25, VAR[Y] = 2/25 − (1/5)² = 1/25.
c) No, because fX,Y(x,y) ≠ fX(x)fY(y) for some x and y.
E[XY] = ∫∫ xy fX,Y(x,y) dx dy = 15 ∫₀^∞ ∫₀^x xy e^{−3x−2y} dy dx
      = 15 ∫₀^∞ x e^{−3x} ( ∫₀^x y e^{−2y} dy ) dx
      = 15 ∫₀^∞ x e^{−3x} · [1 − (1+2x)e^{−2x}]/4 dx = 11/75.
By definition,
ρX,Y = (E[XY] − E[X]E[Y])/√(VAR[X]VAR[Y]) = (11/75 − (8/15)(1/5))/√((34/225)(1/25)) = (1/25)/(√34/75) = 3/√34 = 0.5145.

7. a) ∫∫ fX,Y(x,y) dx dy = c ∫_{y=0}^{1} ∫_{x=0}^{1−y} dx dy = c/2 = 1, so c = 2.
Consider the CDF of Z:
FZ(z) = P[Z ≤ z] = 0 for z < 0; ∫_{x=0}^{z} ∫_{y=0}^{z−x} 2 dy dx = z² for 0 ≤ z ≤ 1; 1 for z > 1.
fZ(z) = dFZ(z)/dz = 2z for 0 ≤ z ≤ 1; 0 otherwise.
b) E[Z] = ∫₀¹ 2z² dz = 2/3, E[Z²] = ∫₀¹ 2z³ dz = 1/2, VAR[Z] = 1/2 − (2/3)² = 1/18.

8. a) P[A] = ∫_{y=0}^{0.5} ∫_{x=0}^{1} (x+y)/3 dx dy = 1/8.
b) fX,Y|A(x,y) = fX,Y(x,y)/P[A] when A is true; 0 when A is false
= 8(x+y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 0.5; 0 otherwise.
c) fX|A(x) = ∫₀^{0.5} fX,Y|A(x,y) dy = (4x+1)/3 for 0 ≤ x ≤ 1; 0 otherwise.
d) fY|A(y) = ∫₀¹ fX,Y|A(x,y) dx = 4(1+2y)/3 for 0 ≤ y ≤ 0.5; 0 otherwise.

9. a) Because of the symmetry of fX,Y around the y axis, it is easy to see that P[B] = 0.5. Thus
fX,Y|B(x,y) = 2 fX,Y(x,y) = 5x² for 0 < x ≤ 1, 0 ≤ y ≤ x²; 0 otherwise.
b) fX|B(x) = ∫₀^{x²} 5x² dy = 5x⁴ for 0 < x ≤ 1; 0 otherwise. As a result,
E[X|B] = ∫₀¹ x · 5x⁴ dx = 5/6
E[X²|B] = ∫₀¹ x² · 5x⁴ dx = 5/7
VAR[X|B] = 5/7 − (5/6)² = 5/252.
c) E[XY|B] = ∫₀¹ ∫₀^{x²} xy(5x²) dy dx = 2.5 ∫₀¹ x⁷ dx = 5/16.
d) fY|X(y|x) = fX,Y(x,y)/fX(x) = 2.5x²/(2.5x⁴) = 1/x² for −1 ≤ x ≤ 1, 0 < y ≤ x²; 0 otherwise.
e) E[Y | X = x] = ∫₀^{x²} y/x² dy = 0.5x² for −1 ≤ x ≤ 1; 0 otherwise.
f) E[E[Y|X]] = ∫_{−1}^{1} 0.5x² fX(x) dx = ∫_{−1}^{1} 0.5x²(2.5x⁴) dx = 5/14
(compare with E[Y] found in Question 4).

10. a) fY(y) = ∫ fX,Y(x,y) dx = ∫_y^1 2 dx = 2(1 − y) for 0 ≤ y ≤ 1; 0 otherwise.
b) fX|Y(x|y) = fX,Y(x,y)/fY(y) = 1/(1 − y) for 0 ≤ y ≤ x ≤ 1; 0 otherwise.
c) E[X | Y = y] = ∫_y^1 x/(1 − y) dx = (y + 1)/2 for 0 ≤ y < 1; 0 otherwise.

11. a) fY(y) = ∫₀¹ (4x + 2y)/3 dx = 2(y + 1)/3 for 0 ≤ y ≤ 1; 0 otherwise.
fX(x) = ∫₀¹ (4x + 2y)/3 dy = (4x + 1)/3 for 0 ≤ x ≤ 1; 0 otherwise.
b) fX|Y(x|y) = fX,Y(x,y)/fY(y) = (2x + y)/(1 + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.
c) fY|X(y|x) = fX,Y(x,y)/fX(x) = (4x + 2y)/(4x + 1) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.

12. No. For example, if we know that X equals 1, then Y can only be zero; in other words, information about X can change the probability of Y. The other way to show this is to find the marginal distributions and see that the product of the marginal PDFs is not equal to the joint PDF.

13. From the given data: E[X] = 5, VAR[X] = 15, E[Y] = (4 + 6)/2 = 5, VAR[Y] = (6 − 4)²/12 = 1/3.
a) X and Y are independent. Therefore E[XY] = E[X]E[Y] = 5 · 5 = 25.
b) Similarly, E[X²Y²] = E[X²]E[Y²] = (VAR[X] + E²[X])(VAR[Y] + E²[Y]) = (15 + 25)(1/3 + 25) = 1013.33.

4.13 Drill Problems

Section 4.1, 4.2, 4.3, 4.4 and 4.5 - Joint and marginal PDF/PMFs


1. The joint PDF fX,Y(x,y) = c for 0 < x < 3 and 0 < y < 4, and is 0 otherwise.
a) What is the value of the constant c?
b) Find the marginal PDFs fX(x) and fY(y).
c) Determine if X and Y are independent.
Ans: a) c = 1/12. b) fX(x) = 1/3 for 0 ≤ x ≤ 3 (0 otherwise); fY(y) = 1/4 for 0 ≤ y ≤ 4 (0 otherwise). c) Yes.

2. The joint PMF of discrete random variables X and Y is given by
PX,Y(x,y) = 0.2 for (x,y) = (0,0); 0.3 for (1,0); 0.3 for (0,1); c for (1,1).
a) What is the value of the constant c?
b) Find the marginal PMFs PX(x) and PY(y). Are X and Y independent?
Ans: a) c = 0.2. b) PX(x) = 0.5 for x ∈ {0,1} and zero otherwise; PY(y) = 0.5 for y ∈ {0,1} and zero otherwise. No.

Figure 4.1: Figure for drill problem 3.

3. Fig. 4.1 shows a region in the x-y plane where the bivariate PDF fX,Y(x,y) = cx². Elsewhere, the PDF is 0.
a) Compute the value of c.
b) Find the marginal PDFs fX(x) and fY(y).
Ans: a) c = 3/32. b) fX(x) = (3/32)x²(2 − x) for −2 ≤ x ≤ 2 (0 otherwise); fY(y) = (y³ + 8)/32 for −2 ≤ y ≤ 2 (0 otherwise).


Section 4.6 - Functions of 2 RVs


4. Let X ~ Uniform(0,3) and Y ~ Uniform(−2,2). They are independent. Find the PDF of W = X + Y.
Ans: fW(w) = (w + 2)/12 for −2 ≤ w < 1; 1/4 for 1 ≤ w < 2; (5 − w)/12 for 2 ≤ w ≤ 5; 0 otherwise.

Section 4.7,4.8,4.9 and 4.10 - Expected values, conditioning and independence


5. The joint PMF of discrete random variables X and Y is given by
PX,Y(x,y) = 0.3 for (x,y) = (0,0); a for (1,0); 0.3 for (0,1); b for (1,1),
where a and b are constants. X and Y are known to be independent.
a) Find a and b.
b) Find the conditional PMF PX,Y|A(x,y), where the event A is defined as {(x,y) | x ≤ y}.
c) Compute PX|A(x) and PY|A(y). Are X and Y still independent (even when conditioned on A)? Can you explain why?
Ans: a) a = 0.2; b = 0.2.
b) PX,Y|A(0,0) = PX,Y|A(0,1) = 3/8, PX,Y|A(1,1) = 2/8; 0 otherwise.
c) PX|A(0) = 6/8, PX|A(1) = 2/8; PY|A(0) = 3/8, PY|A(1) = 5/8.
No. Conditioning creates a dependency.

6. Reconsider problem 3. Suppose an event A is defined as {(x,y) | x ≤ 0, y ≥ 0}.
a) Compute the conditional joint PDF fX,Y|A(x,y).
b) Find the marginal PDFs fX|A(x) and fY|A(y).
c) Are X and Y independent?
Ans: a) fX,Y|A(x,y) = 3x²/16 for −2 ≤ x ≤ 0, 0 ≤ y ≤ 2; 0 otherwise.
b) fX|A(x) = 3x²/8 for −2 ≤ x ≤ 0 (0 otherwise); fY|A(y) = 1/2 for 0 ≤ y ≤ 2 (0 otherwise).
c) Yes.

7. E[X] = 2 and VAR[X] = 3. E[Y] = 3 and VAR[Y] = 5. The covariance is Cov[X,Y] = 0.8. What are the correlation coefficient ρX,Y and the correlation E[XY]?
Ans: ρX,Y = 0.2066 and E[XY] = 6.8.

8. X is a random variable with μX = 4 and σX = 5. Y is a random variable with μY = 6 and σY = 7. The correlation coefficient is 0.2. If U = 3X + 2Y, what are VAR[U], Cov[U,X] and Cov[U,Y]?
Ans: with Cov[X,Y] = ρσXσY = 7, VAR[U] = 9(25) + 4(49) + 12(7) = 505; Cov[U,X] = 3(25) + 2(7) = 89; Cov[U,Y] = 3(7) + 2(49) = 119.

Section 4.11 - Bivariate Gaussian RVs


9. Given the joint PDF fX,Y(x,y) of random variables X and Y:
fX,Y(x,y) = (1/(1.42829π)) exp( −(x² − 1.4xy + y²)/1.02 ), −∞ < x, y < ∞.
a) Find the means (μX, μY) and the variances (σX², σY²) of X and Y.
b) What is the correlation coefficient ρ? Are X and Y independent?
c) What are the marginal PDFs for X and Y?
Ans: a) μX = μY = 0; σX² = σY² = 1. b) ρ = 0.7; No.
c) fX(x) = fY(x) = (1/√(2π)) exp(−x²/2).

Chapter 5 Sums of Random Variables

5.1 Summary

5.1.1 PDF of sum of two RVs

Continuous case.
Theorem 5.1: The PDF of W = X + Y is

fW(w) = ∫_{−∞}^{∞} fX,Y(x, w−x) dx = ∫_{−∞}^{∞} fX,Y(w−y, y) dy.

Theorem 5.2: When X and Y are independent RVs, the PDF of W = X + Y is

fW(w) = ∫_{−∞}^{∞} fX(x) fY(w−x) dx = ∫_{−∞}^{∞} fX(w−y) fY(y) dy.

It is actually a convolution of fX(x) and fY(y).

Discrete case.
Theorem 5.3: The PMF of W = X + Y is

PW(w) = Σ_x PX,Y(x, w−x).

5.1.2 Expected values of sums

Theorem 5.4: For any set of RVs X1, X2, ..., XN, the expected value of SN = X1 + ... + XN is
E[SN] = E[X1] + ... + E[XN].

Theorem 5.5: For independent RVs X1, X2, ..., XN, the variance of SN = X1 + ... + XN is
VAR[SN] = Σ_{i=1}^{N} VAR[Xi].


5.1.3 Moment Generating Function (MGF)

Definition 5.1: For a RV X, the moment generating function (MGF) of X is

φX(s) = E[e^{sX}] = ∫ e^{sx} fX(x) dx (continuous RV), or Σ_{xi∈SX} e^{s xi} PX(xi) (discrete RV).

Therefore, the MGF is a Laplace transform of the PDF for a continuous RV and a Z transform of the PMF for a discrete RV.

Theorem 5.6: A RV X with MGF φ(s) has nth moment E[X^n] = dⁿφ(s)/dsⁿ evaluated at s = 0.

Theorem 5.7: For a set of independent RVs X1, X2, ..., Xn, the moment generating function of Sn = X1 + X2 + ... + Xn is
φSn(s) = φX1(s) ··· φXn(s).
We use this theorem to calculate the PMF or PDF of a sum of independent RVs.

Theorem 5.8 (central limit theorem, CLT): Let Sn = X1 + X2 + ... + Xn be a sum of n i.i.d. RVs with E[Xi] = μ and VAR[Xi] = σ². Then E[Sn] = nμ and VAR[Sn] = nσ², and the following holds:

(Sn − nμ)/√(nσ²) → N(0,1) as n → ∞.

We use this theorem to approximate the PMF or PDF of Sn when the PMFs or PDFs of the Xi are unknown but their (identical) means and variances are known.
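The CLT can be seen empirically with a short simulation (a sketch added here, not part of the original notes): standardize the sum of n i.i.d. Uniform(0,1) variables and compare a Monte Carlo tail probability with the standard normal CDF Φ, computed via the error function:

```python
import math
import random

# Sketch of Theorem 5.8: the standardized sum of n i.i.d. Uniform(0,1)
# variables is approximately N(0,1).

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

random.seed(2)
n, trials = 30, 20000
mu, var = 0.5, 1.0 / 12.0  # mean and variance of Uniform(0,1)
hits = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    z = (s - n * mu) / math.sqrt(n * var)
    if z <= 1.0:
        hits += 1

print(hits / trials, phi(1.0))  # both close to 0.8413
```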

5.2 Illustrated Problems

1. Random variables X and Y have joint PDF
fX,Y(x,y) = e^{−(x+y)} for 0 ≤ x, y < ∞; 0 otherwise.
What is the PDF of W = X + Y?

2. The joint PDF of two random variables X and Y is
fX,Y(x,y) = 1 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.
a) Find the marginal distributions of X and Y.
b) Show that X and Y are independent.
c) Find E[X + Y].
d) Find VAR[X + Y].
e) Find the PDF of W = X + Y.

3. Consider X ~ N(0,2).
a) Find E[X⁴]. (Hint: use a MGF table to answer this question.)
b) Find E[X⁵]. (Hint: while you can use the MGF, it is better to use symmetry here.)

4. The MGF of W = X + Y + Z is equal to e^{s²/2}(e^s − e^{−s})/(2s(1−s)). X is N(0,1), Y is Exponential(1), and X, Y and Z are independent. Find the MGF of Z. What is the PDF of Z?

5. Random variable Y has MGF φY(s) = 1/(1−s). X has MGF φX(s) = 1/(1−2s)². X and Y are independent. Let W = X + Y.
a) Find E[Y], E[Y²], E[X] and E[X²].
b) Find the variance of W.

6. Telephone calls handled by a certain phone company can be either voice (V) or data (D). The company estimates that P[V] = 0.8 and P[D] = 0.2. All telephone calls are independent of one another. Let X be the number of voice calls in a collection of 100 telephone calls.
a) What is E[X]?
b) What is VAR[X]?
c) Use the CLT to estimate P[X ≥ 18].
d) Use the CLT to estimate P[16 ≤ X ≤ 24].

7. The duration of a cellular telephone call is an exponential random variable with average length of 4 minutes. A subscriber is charged $30/month for the first 300 minutes of airtime and $0.25 for each extra minute. A subscriber has received 90 calls during the past month. Use the central limit theorem to answer the following questions:
a) What is the probability that this month's bill is $30? (meaning that the total airtime is less than or equal to 300 minutes)
b) What is the probability that this month's bill is more than $35?

8. A random walk in two dimensions is the following process: flip a fair coin and move one unit in the +x direction if heads and one unit in the −x direction if tails; flip another fair coin and move one unit in the +y direction if heads and one unit in the −y direction if tails. This is one cycle. Repeat the cycle 200 times. Let X and Y be the final position. Let Xk ∈ {+1, −1} be the movement along the x axis during the k-th cycle. Let Yk ∈ {+1, −1} be the movement along the y axis during the k-th cycle.
a) Give the exact PMF of X and find the exact probability that X exceeds 10 at the end of the process. Find the same probability using the CLT.
b) Find the exact probability that both X and Y exceed 10 at the end of the process. Find the same probability using the CLT.
c) Find the probability that the final position lies outside the circle centered on the origin that goes through (+10, +10).

5.3 Solutions for the Illustrated Problems

1. Given fX,Y(x,y) = e^{−(x+y)} for 0 ≤ x, y < ∞ (0 otherwise), we get
fX,Y(x, w−x) = e^{−w} for 0 ≤ x ≤ w; 0 otherwise.
By definition fW(w) = ∫ fX,Y(x, w−x) dx, so
fW(w) = ∫₀^w e^{−w} dx = w e^{−w} for 0 ≤ w < ∞; 0 otherwise.

2. a) fX(x) = ∫ fX,Y(x,y) dy = 1 for 0 ≤ x ≤ 1; 0 otherwise.
fY(y) = ∫ fX,Y(x,y) dx = 1 for 0 ≤ y ≤ 1; 0 otherwise.
b) fX(x)fY(y) = 1 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 (0 otherwise) = fX,Y(x,y). Therefore X and Y are independent.
c) Since X, Y ~ Uniform(0,1), we know that E[X] = E[Y] = 0.5, so
E[X + Y] = E[X] + E[Y] = 0.5 + 0.5 = 1.
d) Since X, Y ~ Uniform(0,1), we know that VAR[X] = VAR[Y] = 1/12. Moreover, X and Y are independent (hence uncorrelated). Thus
VAR[X + Y] = VAR[X] + VAR[Y] = 1/12 + 1/12 = 1/6.
e) X and Y are independent random variables. Therefore, the PDF of W = X + Y is the convolution of their PDFs:
fW(w) = fX(w) * fY(w) = w for 0 ≤ w < 1; 2 − w for 1 ≤ w < 2; 0 otherwise.

3. Given X ~ N(0,2):
a) The moment generating function of X is φX(s) = exp(μs + s²σ²/2). Differentiating it 4 times with respect to s, we get
d⁴φX(s)/ds⁴ = exp(μs + s²σ²/2)[ 3σ⁴ + 6(μ + σ²s)²σ² + (μ + σ²s)⁴ ].
With μ = 0 and σ² = 2, E[X⁴] = d⁴φX(s)/ds⁴ at s = 0, which equals 3σ⁴ = 12.
b) By definition, E[X⁵] = ∫ x⁵ fX(x) dx. Since fX(x) is an even function of x, x⁵fX(x) is an odd function of x, which makes the integral evaluate to zero. Therefore E[X⁵] = 0.

4. Since X, Y and Z are independent, the MGF of W = X + Y + Z is given by
φW(s) = φX(s) φY(s) φZ(s).      (5.1)
Substituting φW(s) = e^{s²/2}(e^s − e^{−s})/(2s(1−s)), φX(s) = e^{s²/2} and φY(s) = 1/(1−s) into Eqn. (5.1), we get
φZ(s) = (e^s − e^{−s})/(2s).
This MGF can be recognized as that of Z ~ Uniform(−1, 1).

5. Given φX(s) = 1/(1−2s)² and φY(s) = 1/(1−s):
a) Using the series expansions
φX(s) = 1 + 4s + 12s² + ... = 1 + E[X]s + (1/2)E[X²]s² + ...
φY(s) = 1 + s + s² + ... = 1 + E[Y]s + (1/2)E[Y²]s² + ...
and equating the coefficients, we get E[X] = 4, E[X²] = 24, E[Y] = 1, E[Y²] = 2.
b) Since X and Y are independent,
VAR[W] = VAR[X + Y] = VAR[X] + VAR[Y] = (E[X²] − E²[X]) + (E[Y²] − E²[Y]) = (24 − 16) + (2 − 1) = 9.

6. Let X = Σ_{k=1}^{100} Xk, where Xk is an indicator random variable which is 1 whenever the k-th call is a voice call and 0 whenever the k-th call is a data call. The PMF of each Xk, k ∈ {1,...,100}, is PXk(1) = 0.8, PXk(0) = 0.2. Thus E[Xk] = E[Xk²] = 0.8, and VAR[Xk] = E[Xk²] − E[Xk]² = 0.8 − 0.8² = 0.16.
a) Since the Xk are independent, E[X] = Σ_{k=1}^{100} E[Xk] = 80.
b) Likewise, VAR[X] = Σ_{k=1}^{100} VAR[Xk] = 16.
c) Therefore, the CLT can be used to approximate the distribution of X with N(80, 16):
P[X ≥ 18] = 1 − P[X ≤ 18] ≈ 1 − Φ((18−80)/4) = Φ(15.5) ≈ 1.
d) Similarly, P[16 ≤ X ≤ 24] ≈ Φ((24−80)/4) − Φ((16−80)/4) = Φ(−14) − Φ(−16) ≈ 0.

7. Since the duration (airtime) Xk, k ∈ {1,...,90}, of each call has the distribution Xk ~ Exponential(λ) with E[Xk] = 1/λ = 4 (λ = 0.25), we know that σ_{Xk} = 1/λ = 4. Let X = Σ_{k=1}^{90} Xk denote the total airtime (in minutes). Then E[X] = 90(4) = 360 minutes. Assuming the durations Xk of the calls to be independent,
σX = √(90 · 16) = √1440 = 37.95 minutes.
a) The distribution of X can be approximated (using the CLT) with Gaussian(360, 1440):
P[X ≤ 300] ≈ Φ((300−360)/37.95) = 1 − Φ(1.58) = 0.0571.
b) A bill of more than $35 means more than 20 extra minutes, i.e., X > 320:
P[X > 320] ≈ 1 − Φ((320−360)/37.95) = Φ(40/37.95) = Φ(1.054) = 0.8531.

8. a) Consider the movement in the x direction. For Xk, the displacement that takes place in the k-th cycle, we get
E[Xk] = (+1)(1/2) + (−1)(1/2) = 0, E[Xk²] = (+1)²(1/2) + (−1)²(1/2) = 1, so VAR[Xk] = 1.
Using the exact distribution of X: let X'k = (Xk + 1)/2, which is Bernoulli(0.5). Then X' = Σ_{k=1}^{200} X'k = X/2 + 100 ~ Binomial(200, 0.5). Therefore
P[X > 10] = P[X' > 105] = Σ_{k=106}^{200} C(200,k) 0.5²⁰⁰ = 0.21838.
Hint: you may use the following MATLAB code to compute this value.
p = 0;
for k = 106:200
p = p + nchoosek(200,k);
end
p = (0.5^200) * p
Using the CLT approximation: since the initial displacement is zero, X = Σ_{k=1}^{200} Xk and the Xk are independent, so E[X] = 0 and VAR[X] = 200. Thus the distribution of X can be approximated using the CLT as Gaussian(0, 200). Therefore
P[X > 10] ≈ 1 − Φ(11/√200) = 1 − Φ(0.77782) = 0.21834.
Note: 11 is used in this approximation because the final position must be an even number, and 11 is halfway between 10 and 12.
b) Since X and Y are independent and identically distributed,
P[X > 10, Y > 10] = (P[X > 10])² = 0.04768 (exact), or 0.04767 (using the CLT).
c) Using the exact distributions: since X and Y are independent, their joint PMF is given by
PX,Y(j,k) = C(200,j) C(200,k) 0.5⁴⁰⁰,
where X = 2j − 200 and Y = 2k − 200. We are required to find P[√(X² + Y²) > 10√2], or equivalently P[(2X' − 200)² + (2Y' − 200)² > 200]. This is equivalent to finding the sum of the probability masses lying outside the circle. Therefore the desired probability is
Σ_{(j,k)∈Ω} PX,Y(j,k), where Ω = {(j,k) | j,k ∈ {0,...,200}, (2j−200)² + (2k−200)² > 200}.
Hint: you may use the following MATLAB code to compute this value to be 0.59978.
p = 0;
for j = 0:200
for k = 0:200
if (2*j-200)^2 + (2*k-200)^2 > 200
p = p + nchoosek(200,k) * nchoosek(200,j);
end
end
end
p = p * (0.5^400)
Using the CLT approximation: let R = √(X² + Y²). Since X, Y ~ Gaussian(0, 200) and X and Y are independent, R is Rayleigh with parameter σ = 10√2. Therefore
P[R > 10√2] = e^{−(10√2/σ)²/2} = e^{−0.5} = 0.60653.
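The exact binomial tail in Problem 8(a) can also be computed in Python (a sketch added here as an alternative to the MATLAB hint):

```python
import math

# Python cross-check of the MATLAB hint in Problem 8(a): the exact tail
# probability P[X > 10] = P[Binomial(200, 0.5) >= 106].

p = sum(math.comb(200, k) for k in range(106, 201)) * 0.5**200
print(p)  # close to the 0.2184 quoted in the solution
```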

5.4 Drill Problems

Section 5.1.1 - PDF of the sum of 2 RVs

1. X ~ Uniform(0, 1) and Y ~ Uniform(0, 2) are independent RVs. Compute the PDF fW(w) of W = X + Y.
Ans: fW(w) = 0.5w for 0 ≤ w < 1; 0.5 for 1 ≤ w < 2; 0.5(3 − w) for 2 ≤ w < 3; 0 otherwise.

2. X ~ Uniform(0, 1) and Y ~ Uniform(0, 2) are independent RVs. Compute the PDF fW(w) of W = X − Y.
Ans: fW(w) = 0.5(w + 2) for −2 ≤ w < −1; 0.5 for −1 ≤ w < 0; 0.5(1 − w) for 0 ≤ w < 1; 0 otherwise.

3. X, Y ~ Exponential(λ) are independent RVs. Compute the PDF fW(w) of W = X + Y.
Ans: fW(w) = λ²w e^{−λw} for w ≥ 0; 0 otherwise.
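Drill 1 above can be checked numerically (a sketch added here, not part of the original notes) by evaluating the convolution integral fW(w) = ∫ fX(x) fY(w − x) dx directly:

```python
# Numerical convolution check for Drill Problem 1: W = X + Y with
# X ~ Uniform(0,3)-style densities replaced by the drill's X ~ Uniform(0,1)
# and Y ~ Uniform(0,2), independent.

def f_x(x):
    return 1.0 if 0 <= x <= 1 else 0.0      # Uniform(0,1) density

def f_y(y):
    return 0.5 if 0 <= y <= 2 else 0.0      # Uniform(0,2) density

def f_w(w, n=20000):
    """Midpoint-rule approximation of the convolution integral over [0, 1]."""
    dx = 1.0 / n  # fX is zero outside [0, 1]
    return sum(f_x(x) * f_y(w - x) for x in ((k + 0.5) * dx for k in range(n))) * dx

print(f_w(0.5))  # close to 0.25 (rising part, 0.5w)
print(f_w(1.5))  # close to 0.5  (flat part)
print(f_w(2.5))  # close to 0.25 (falling part, 0.5(3 - w))
```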

Figure 5.1: figure used in Question 7

Section 5.1.2 - Expected values of sums

4. The random variable U has a mean of 0.3 and a variance of 1.5. Ui, i ∈ {1,...,53}, are independent realizations of U.
a) Find the mean and variance of Y if Y = (1/53) Σ_{i=1}^{53} Ui.
b) Find the mean and variance of Z if Z = Σ_{i=1}^{53} Ui.
Ans: a) E[Y] = 0.3; VAR[Y] = 1.5/53 = 0.0283. b) E[Z] = 0.3(53) = 15.9; VAR[Z] = 1.5(53) = 79.5.

Section 5.1.3 - MGF

5. Compute the moment generating function φX(s) = E[e^{sX}] of a random variable X exponentially distributed with parameter λ. Let W = X + Y, where X and Y are Exponential(λ) and independent. Compute the moment generating function φW(s) of W using those of X and Y.
Ans: φX(s) = (1 − s/λ)⁻¹; φW(s) = (1 − s/λ)⁻².

6. Compute the MGF of an Erlang(n, λ) RV. Calculate the mean and variance.
Ans: φ(s) = (1 − s/λ)⁻ⁿ; mean = n/λ; variance = n/λ².

7. Find the MGF φX(s) of a uniform random variable X ~ Uniform(a, b).
a) Derive the mean and variance of X.
b) Find E[X³] and E[X⁵].
Ans: φX(s) = (e^{bs} − e^{as})/(s(b − a)).
a) E[X] = (a + b)/2; VAR[X] = (b − a)²/12.
b) E[X³] = (a³ + a²b + ab² + b³)/4; E[X⁵] = (a⁵ + a⁴b + a³b² + a²b³ + ab⁴ + b⁵)/6.
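The uniform-moment pattern in Drill 7(b), E[Xⁿ] = (aⁿ + aⁿ⁻¹b + ... + bⁿ)/(n + 1), can be checked against direct numerical integration (a sketch added here; the endpoint values are arbitrary illustrations):

```python
# Numerical check of the Uniform(a, b) moment formula in Drill Problem 7:
# E[X^n] = (a^n + a^(n-1) b + ... + b^n) / (n + 1).

def moment(a, b, n, steps=50000):
    """Midpoint-rule approximation of E[X^n] for X ~ Uniform(a, b)."""
    dx = (b - a) / steps
    return sum(((a + (k + 0.5) * dx) ** n) * dx for k in range(steps)) / (b - a)

def formula(a, b, n):
    return sum(a**k * b**(n - k) for k in range(n + 1)) / (n + 1)

a, b = -1.0, 2.0  # illustrative endpoints
print(moment(a, b, 3), formula(a, b, 3))  # agree: E[X^3]
print(moment(a, b, 5), formula(a, b, 5))  # agree: E[X^5]
```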

Appendix A 2009 Quizzes


A.1 Quiz Number 1

1. True or False; justify your answer.
a) If A ⊂ B, B ⊂ C, and C ⊂ A, then A = B = C.
b) A − (B − C) = (A − B) − C.

2. Consider the experiment of flipping a coin four times and recording the T and H sequence. For example, THTT is a possible outcome.
a) How many elements does the event A = {at least one heads} have?
b) Let B = {even number of tails}. All outcomes are equally likely. Find P[A or B].

A.2 Quiz Number 2

1. a) Using the axioms of probability prove that P[Aᶜ] = 1 − P[A].
b) It is known that P[A ∪ B] = 0.24, P[A] = 0.15, P[B] = 0.18. Find P[A|B].

2. Events D, E and F form an event space. Calculate P[F|A], given
P[D] = 0.35, P[A|D] = 0.4; P[E] = 0.55, P[A|E] = 0.2; P[F] = 0.10, P[A|F] = 0.3.
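Quiz 2's event-space question is a direct application of the law of total probability and Bayes' rule, which a few lines of Python make concrete (a sketch added here, not part of the original quiz):

```python
# Numerical restatement of Quiz 2, Question 2: total probability and Bayes' rule.

priors = {"D": 0.35, "E": 0.55, "F": 0.10}
likelihood = {"D": 0.4, "E": 0.2, "F": 0.3}  # P[A | hypothesis]

p_a = sum(priors[h] * likelihood[h] for h in priors)   # total probability: P[A]
p_f_given_a = priors["F"] * likelihood["F"] / p_a      # Bayes' rule: P[F|A]
print(p_a, p_f_given_a)  # 0.28 and about 0.107
```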

A.3 Quiz Number 3

1. From a group of five women and seven men, two women and three men are randomly selected for a committee.
a) How many ways can the committee be selected?
b) Suppose that two of the men (say, John and Tim) cannot serve in the committee together. What is the probability that a randomly selected committee meets this requirement?

2. In a lot of 100 used computers, 18 have faulty hard drives and 12 have faulty monitors. Assume that these two problems are independent. If a computer is chosen at random, find the probability that (a) it has a hard disc problem, (b) it does not have a faulty monitor, (c) it has a hard disc problem only.

A.4 Quiz Number 4

1. A fair die is rolled twice, and the two scores are recorded. The random variable X is 1 if both scores are equal. If not, X is the minimum of the two scores. For example, if the first score is 3 and the second one is 5, then X = 3, but if both are 3, then X = 1.
(a) Write SX, the range, and PX(x), the PMF of X. Be sure to write the value of PX(x) for all x from −∞ to ∞.
(b) Find the probability of X > 3.

2. (a) Two percent of the resistors manufactured by a company are defective. You need 23 good resistors for a project. Suppose you have a big box of the resistors and you keep on picking resistors until you have 23 good ones. Let X be the total number of resistors that you pick. Write down the PMF of X.
(b) A student takes a multiple choice test with 20 questions. Each question has 5 answers (only one of which is correct). The student blindly guesses. Let X be the number of correct answers. Find the PMF of X.

A.5 Quiz Number 5

1. The CDF of a random variable is given as
FX(x) = 0 for x ≤ 0; x⁴/16 for 0 < x < 2; 1 for x ≥ 2.
a) Find P[1 < X < 2].
b) Find the PDF of X, fX(x).

2. The PDF of a random variable is given as
fX(x) = x + 1/2 if 0 < x < 1; 0 otherwise.
a) Find FX(0.5).
b) A PDF is given by
fY(y) = cy² for 0 < y < 1; 0 otherwise.
Find the value of the constant c.

A.6 Quiz Number 6

1. X is a continuous Uniform(−2, +2) random variable.
a) Find E[X³].
b) Find E[e^X].

2. The number of sales a retailer has is modelled as X ~ Gaussian(50, 100). Considering the overhead costs and the cost of products, the net profit of the retailer is Y = X/5 − 5.
a) Find E[Y] and Var[Y].
b) Find the probability of a positive net profit, i.e., P[Y ≥ 0]. Leave your answer in the form of the Φ(x) function.

3. Telephone calls arrive at a switchboard at the average rate of 2 per hour. You can assume that the time between two calls is an exponential random variable. Find the probability that it will be at least 3 hours between two calls.

A.7 Quiz Number 7

X is a Uniform(2,6) random variable and event B = {X < 3}.
(a) Find fX|B(x), E[X|B] and VAR[X|B].
(b) Suppose Y = g(X) = 1/X. Find the CDF of Y, FY(a). Be sure to consider all cases for −∞ < a < ∞.

A.8 Quiz Number 8

1. Random variables X and Y have joint PDF
fX,Y(x,y) = 2 if 0 ≤ y ≤ x ≤ 1; 0 otherwise.
Let W = 2X/Y.
(a) What is the range of W?
(b) Find FW(a). Be sure to consider all cases for −∞ < a < ∞.
(c) Find P[X ≤ 2Y].

Appendix B 2009 Quizzes: Solutions


B.1 Quiz Number 1

Quiz # 1, EE 387

1. True or False, Justify your answer. a) If A B, and B C, and C A, then A = B = C. Solution: True. Since A B and B C, one can conclude that A C. Moreover, one has C A. Consequently, A = C. A similar discussion holds for B. C A and A B, therefore C B, and since B C, one has B = C = A. b) A (B C) = (A B) C. Solution: False. It can be shown easily using a counterexample. If A = {1, 2, 3, 4}, B = {1, 2, 3}, and C = {1}, A (B C) = A {2, 3} = {1, 4} while (A B) C = {4} C = {4}.

2. Consider the experiment of ipping a coin four times and recording the T and H sequence. For example, THTT is a possible outcome. a) How many elements does the event A = {at least one heads} have?.

112

2009 Quizzes: Solutions

Solution: A={at least one H} is equivalent to the set of all outcomes except for the TTTT. Since there are 2 2 2 2 = 16 outcomes in S, A has 15 elements. b) Let B = {even number of tails}. All outcomes are equally likely. Find P [AorB].

Solution: B has 8 elements (0 is an even number), i.e. B = {HHT T, HT HT, HT T H, T HHT, T HT H, T T HH, T T T T, HHHH}. A includes all the elements of B except for TTTT and therefore A B = {HHT T, HT HT, HT T H, T HHT, T HT H, T T HH, T T T T, HHHH} A B = S has 16 elements.

B.2 Quiz Number 2

1. a) Using the axioms of probability, prove that P[Ac] = 1 − P[A].

Solution: Since A ∩ Ac = ∅, Axiom 3 gives P[A ∪ Ac] = P[A] + P[Ac]. Moreover, A ∪ Ac = S, so by Axiom 2, P[A ∪ Ac] = P[S] = 1. Therefore 1 = P[A] + P[Ac], i.e., P[Ac] = 1 − P[A].

b) It is known that P[A ∪ B] = 0.24, P[A] = 0.15, P[B] = 0.18. Find P[A|B].

Solution:

P[A ∩ B] = P[A] + P[B] − P[A ∪ B] = 0.15 + 0.18 − 0.24 = 0.09
P[A|B] = P[A ∩ B]/P[B] = 0.09/0.18 = 0.5

2. Events D, E and F form an event space. Calculate P[F|A], given

P[D] = 0.35, P[A|D] = 0.4; P[E] = 0.55, P[A|E] = 0.2; P[F] = 0.10, P[A|F] = 0.3.

Solution:

P[A] = P[D]P[A|D] + P[E]P[A|E] + P[F]P[A|F] = 0.14 + 0.11 + 0.03 = 0.28
P[F|A] = P[A|F]P[F]/P[A] = 0.03/0.28 ≈ 0.107
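The total-probability and Bayes steps above can be checked numerically; this sketch is illustrative and the dictionary names are not from the text:

```python
# Total probability over the event space {D, E, F}, then Bayes' rule for P[F|A].
priors = {"D": 0.35, "E": 0.55, "F": 0.10}
likelihoods = {"D": 0.4, "E": 0.2, "F": 0.3}   # P[A | .]

p_a = sum(priors[k] * likelihoods[k] for k in priors)   # total probability
p_f_given_a = likelihoods["F"] * priors["F"] / p_a      # Bayes' rule

print(round(p_a, 2))           # 0.28
print(round(p_f_given_a, 3))   # 0.107
```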


B.3 Quiz Number 3

1. From a group of five women and seven men, two women and three men are randomly selected for a committee.

a) How many ways can the committee be selected?

Solution: Number of ways = C(5, 2) · C(7, 3) = 350

b) Suppose that two of the men (say, John and Tim) cannot serve on the committee together. What is the probability that a randomly selected committee meets this requirement?

Solution: The number of committees that violate the requirement equals the number of committees containing both John and Tim; then only one other man must be selected from the remaining 5. Calling A the event that the committee meets the requirement,

P[A] = 1 − C(5, 2)C(5, 1)/350 = 300/350

One can also find the same result by counting directly the committees that meet the requirement. Divide the men into two groups: one containing only John and Tim, and one containing the other 5 men. The committee meets the requirement if exactly one man comes from the first group and two from the second, or if all three men come from the second group:

P[A] = [C(5, 2)C(2, 1)C(5, 2) + C(5, 2)C(2, 0)C(5, 3)]/350 = 300/350
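Both counting routes can be reproduced with Python's math.comb; this check is an addition, not part of the original solution:

```python
from math import comb

# Committee problem: 2 of 5 women and 3 of 7 men, with John and Tim
# forbidden from serving together.
total = comb(5, 2) * comb(7, 3)          # all committees: 350
violating = comb(5, 2) * comb(5, 1)      # both John and Tim serve
direct = comb(5, 2) * (comb(2, 1) * comb(5, 2) + comb(2, 0) * comb(5, 3))

print(total, total - violating, direct)  # 350 300 300
```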

2. In a lot of 100 used computers, 18 have faulty hard drives (event A) and 12 have faulty monitors (event B). Assume that these two problems are independent. If a computer is chosen at random, find the probability that (a) it has a hard-drive problem, (b) it does not have a faulty monitor, (c) it has a hard-drive problem only.

Solution:

(a) P[A] = 0.18
(b) P[Bc] = 1 − 0.12 = 0.88
(c) P[A ∩ Bc] = P[A]P[Bc] = 0.18 · 0.88 = 0.1584


B.4 Quiz Number 4

1. A fair die is rolled twice, and the two scores are recorded. The random variable X is 1 if both scores are equal; otherwise, X is the minimum of the two scores. For example, if the first score is 3 and the second is 5, then X = 3, but if both are 3, then X = 1.

(a) Write SX, the range, and PX(x), the PMF of X. Be sure to give the value of PX(x) for all x from −∞ to ∞.

Solution: SX = {1, 2, 3, 4, 5}. In the following, {nm} means {die 1 = n & die 2 = m}.

PX(1) = P{11, 22, 33, 44, 55, 66, 12, 21, 13, 31, 14, 41, 15, 51, 16, 61} = 16/36
PX(2) = P{23, 32, 24, 42, 25, 52, 26, 62} = 8/36
PX(3) = P{34, 43, 35, 53, 36, 63} = 6/36
PX(4) = P{45, 54, 46, 64} = 4/36
PX(5) = P{56, 65} = 2/36

Note that PX(x) = 0 if x is not a member of SX = {1, 2, 3, 4, 5}.

(b) Find the probability of X > 3.

Solution: P[X > 3] = PX(4) + PX(5) = (4 + 2)/36 = 6/36

2. (a) Two percent of the resistors manufactured by a company are defective. You need 23 good resistors for a project. Suppose you have a big box of the resistors and you keep picking resistors until you have 23 good ones. Let X be the total number of resistors that you pick. Write down the PMF of X.

Solution: The xth resistor picked must be good, and exactly 22 of the first x − 1 resistors must be good as well. Therefore, for x ≥ 23,

PX(x) = C(x − 1, 22) (0.98)^23 (0.02)^(x − 23)

You could also argue that X has a Pascal(23, 0.98) distribution (with success probability 0.98 of picking a good resistor) and get the same result immediately.

(b) A student takes a multiple-choice test with 20 questions. Each question has 5 answers (only one of which is correct). The student blindly guesses. Let X be the number of correct answers. Find the PMF of X.

Solution: X has a binomial distribution:

PX(x) = C(20, x) (0.2)^x (0.8)^(20 − x), x = 0, 1, . . . , 20
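As an added sanity check (not part of the original solution), both PMFs should sum to 1 over their supports; the Pascal sum is truncated where its tail is negligible:

```python
from math import comb

def pascal_pmf(x, k=23, p=0.98):
    # P[X = x]: the x-th pick is the k-th good resistor.
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)

def binomial_pmf(x, n=20, p=0.2):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

binom_total = sum(binomial_pmf(x) for x in range(21))
pascal_total = sum(pascal_pmf(x) for x in range(23, 200))  # tail past 200 is negligible

print(abs(binom_total - 1.0) < 1e-12)   # True
print(abs(pascal_total - 1.0) < 1e-6)   # True
```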


B.5 Quiz Number 5

1. The CDF of a random variable is given as

FX(x) = { 0, if x ≤ 0; x^4/16, if 0 < x < 2; 1, if 2 ≤ x }

a) Find P[1 < X < 2].

Solution: P[1 < X < 2] = FX(2) − FX(1) = 1 − 1/16 = 15/16

b) Find the PDF of X, fX(x).

Solution: fX(x) = dFX(x)/dx = { x^3/4, 0 < x < 2; 0, otherwise }
2. The PDF of a random variable is given as

fX(x) = { x + 1/2, if 0 < x < 1; 0, otherwise }

a) Find FX(0.5).

Solution: FX(0.5) = P[X ≤ 0.5] = ∫ from 0 to 0.5 of (x + 1/2) dx = [x²/2 + x/2] from 0 to 0.5 = 3/8

b) A PDF is given by fY(y) = { c/√y, 0 < y < 1; 0, otherwise }. Find the value of the constant c.

Solution: Using the fact that ∫ fY(y) dy = 1,

∫ from 0 to 1 of c y^(−1/2) dy = 2c√y evaluated from 0 to 1 = 2c = 1, so c = 1/2.


B.6 Quiz Number 6
1. X is a continuous Uniform(−2, +2) random variable.

a) Find E[X³].

Solution: X ~ Uniform(−2, +2), so fX(x) = { 1/4, −2 ≤ x ≤ 2; 0, otherwise }.

E[X³] = ∫ from −2 to 2 of x³/4 dx = 0 (from odd symmetry of the integrand)

b) Find E[e^X].

Solution: E[e^X] = ∫ from −2 to 2 of e^x/4 dx = (e² − e^(−2))/4 = 1.81343

2. The number of sales a retailer makes is modeled as X ~ Gaussian(50, 100). Accounting for overhead and product costs, the net profit of the retailer is Y = X/5 − 5.

a) Find E[Y] and Var[Y].

Solution: Given E[X] = 50 and Var[X] = 100,

E[Y] = E[X]/5 − 5 = 5
Var[Y] = Var[X]/5² = 4

b) Find the probability of a positive net profit, i.e., P[Y ≥ 0]. Leave your answer in the form of the Φ(·) function.

Solution: P[Y ≥ 0] = P[(Y − 5)/2 ≥ −2.5] = 1 − Φ(−2.5) = Φ(2.5) = 0.99379


3. Telephone calls arrive at a switchboard at an average rate of 2 per hour. You can assume that the time between two calls is an exponential random variable. Find the probability that at least 3 hours elapse between two calls.

Solution: Given an average arrival rate λ = 2 per hour (i.e., an average inter-arrival time of 0.5 hours), the inter-arrival time is T ~ Exponential(2).

P[T > 3] = e^(−λ·3) = e^(−6) = 2.4787 × 10^(−3)
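The numeric value can be confirmed in one line; this snippet is an illustrative addition:

```python
import math

# Exponential inter-arrival time with rate lam = 2 per hour: P[T > 3] = e^{-6}.
lam = 2.0
p = math.exp(-lam * 3.0)

print(round(p, 7))  # 0.0024788
```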


B.7 Quiz Number 7

X is a Uniform(2,6) random variable and event B = {X < 3}.

(a) Find fX|B(x), E[X|B] and Var[X|B].

Solution: By definition,

fX|B(x) = { fX(x)/P[B], when x ∈ B; 0, otherwise }

There is no need to compute P[B] separately: the equation above shows that X|B ~ Uniform(2, 3), so

fX|B(x) = { 1, 2 ≤ x < 3; 0, otherwise }
E[X|B] = (2 + 3)/2 = 2.5
Var[X|B] = (3 − 2)²/12 = 1/12 ≈ 0.0833


(b) Suppose Y = g(X) = 1/X. Find the CDF of Y, FY(a). Be sure to consider all cases for −∞ < a < ∞.

Solution: By definition of the CDF, for −∞ < a < ∞,

FY(a) = P[Y ≤ a] = P[1/X ≤ a] = { P[X ≥ 1/a] = 1 − FX(1/a), a > 0; P[X ≤ 1/a] = FX(1/a), a < 0 }

But we know that

FX(x) = { 0, x < 2; (x − 2)/4, 2 ≤ x ≤ 6; 1, x > 6 }

Hence, we get

FY(a) = { 0, a < 1/6; (6a − 1)/(4a), 1/6 ≤ a ≤ 1/2; 1, a > 1/2 }
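A Monte Carlo sketch (not from the original text) agrees with the derived CDF; at a = 1/4 the formula gives FY(1/4) = 0.5:

```python
import random

# Y = 1/X with X ~ Uniform(2, 6); compare the empirical CDF at a = 0.25
# with the analytic value (6a - 1)/(4a).
random.seed(0)
n = 200_000
ys = [1.0 / random.uniform(2.0, 6.0) for _ in range(n)]

a = 0.25
empirical = sum(y <= a for y in ys) / n
analytic = (6 * a - 1) / (4 * a)   # 0.5 at a = 0.25

print(abs(empirical - analytic) < 0.01)  # True
```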

B.8 Quiz Number 8
1. Random variables X and Y have joint PDF

fX,Y(x, y) = { 2, if 0 ≤ y ≤ x ≤ 1; 0, otherwise }

Let W = 2X/Y.

(a) What is the range of W?

Solution: Consider w = 2x/y, the value W takes for particular instantiations x, y of X and Y. Since y ≤ x, w is minimized when x = y, so min(w) = 2. The value of w grows without bound as y → 0 with x ≠ 0, so there is no finite maximum. Since w is a smooth function of x and y, it takes every intermediate value; thus the range of W is [2, ∞).

(b) Find FW(a). Be sure to consider all cases for −∞ < a < ∞.

Solution:
[Figure B.1: region of integration (for the case a ≥ 2) — the part of the triangle 0 ≤ y ≤ x ≤ 1, from (0, 0) to (1, 1), lying between the lines y = 2x/a and y = x.]

Since W ≥ 2 always, FW(a) = 0 whenever a < 2. For a ≥ 2, by definition,

FW(a) = P[W ≤ a] = P[2X/Y ≤ a] = P[Y ≥ 2X/a]
      = ∫ from x = 0 to 1 of ∫ from y = 2x/a to x of 2 dy dx
      = ∫ from 0 to 1 of 2x(1 − 2/a) dx
      = 1 − 2/a

Therefore,

FW(a) = { 0, a < 2; 1 − 2/a, a ≥ 2 }

(c) Find P[X ≤ 2Y].

Solution: P[X ≤ 2Y] = P[X/Y ≤ 2] = P[W ≤ 4] = FW(4) = 1 − 2/4 = 1/2
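A Monte Carlo sketch (an addition, not part of the quiz) confirms the value. Since the joint density is the constant 2 on the triangle 0 ≤ y ≤ x ≤ 1, (X, Y) is uniform on that triangle and can be rejection-sampled from the unit square:

```python
import random

# Estimate P[X <= 2Y] for (X, Y) uniform on the triangle 0 <= y <= x <= 1.
random.seed(1)
accepted, hits = 0, 0
while accepted < 100_000:
    x, y = random.random(), random.random()
    if y <= x:                  # keep only points inside the triangle
        accepted += 1
        hits += (x <= 2 * y)

print(abs(hits / accepted - 0.5) < 0.01)  # True
```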

Appendix C 2010 Quizzes


C.1 Quiz Number 1

1. [2 marks] Use the algebra of sets to prove (A − B) − C = A − (B ∪ C).

2. A fair die is rolled twice and the sum is recorded. a) [1 mark] Give the sample space (S) of this experiment.

b) [1 mark] Are the outcomes of this sample space equally likely? Explain.

c) [1 mark] Let B={sum is less than or equal to 2}. Find P [B].

3. In a company, 40% of the employees are female. Also, 15% of the male (M) employees and 10% of the female (F) employees hold managerial positions. a) [2 marks] Let A be the event that a randomly selected employee of this company holds a managerial position. Find P[A]. b) [1 mark] In part (a), what is the probability that the employee does not have a managerial position?

c) [2 marks] A randomly selected employee is found to have a managerial position. What is the probability that this person is female?


C.2 Quiz Number 2

1. [5 marks] Events A and B are independent and events A and C are disjoint (mutually exclusive). Let P[A] = 0.2, P[B] = 0.4, and P[C] = 0.1. Please answer the following parts:

a) [1 mark] Find P[A ∪ B].

b) [1 mark] Find P[A|B].

c) [1 mark] Find P[Ac ∪ B].

d) [1 mark] Find P[A|C].

e) [1 mark] Find P[A ∩ B ∩ C].

2. For this question, you may leave your answers as ratios of C(n, k) terms.

From a class of 20 boys and 10 girls a team of 5 is selected.

a) [1 mark] Find the probability that the team consists of 2 boys and 3 girls.

b) [2 marks] Find the probability that the majority of the team members are girls.

c) [2 marks] There are 6 students in this class who do not want to be on the team. Find the probability that the randomly chosen team includes none of these 6 students.

C.3 Quiz Number 3

1. A discrete random variable X has the following probability mass function (PMF):

PX(x) = { A, x = −4; A, x = −1; 0.3, x = 0; 0.3, x = 4; 0, otherwise }

(a) (1 mark) Find A.

(b) (3 marks) Sketch the PMF. Find FX(0.5), where FX(·) represents the CDF of X.

(c) (1 mark) Find P[0.5 < X ≤ 3].

(d) (1 mark) Find P[X > 2].

2. (4 marks) A biased coin with P[T] = 0.2 and P[H] = 0.8 is tossed repeatedly. Identify the type of the random variable (for example, X ~ Binomial(10, 0.1)) in each of the following cases.

a) X is the number of tosses before the first H (inclusive).

b) X is the number of tosses before the third T.

c) X is the number of heads (H) in 5 tosses.

d) After the occurrence of the first H, X is the number of extra tosses before the second H (inclusive).


C.4 Quiz Number 4

1. The CDF of a random variable is given as

FX(x) = { 0, if x ≤ 0; x²/9, if 0 < x < 3; 1, if 3 ≤ x }

a) [2 marks] Find P[1 < X < 2].

b) [2 marks] Find P[X > 1].

c) [3 marks] Find the pdf of X.

d) [3 marks] Find E[X].

C.5 Quiz Number 5

1. The lifetime of a transistor in years is modeled as Exponential(0.2). Answer the following:

(a) [1 mark] Find the average lifetime.

(b) [2 marks] Find the probability that it lasts longer than 3 years.

(c) [2 marks] Given that it has lasted 5 years, what is the probability that it lasts for another 3 years?

(d) [1 mark] An EE has designed a circuit using two of the above transistors such that one is active and the second is a spare (i.e., the second becomes active when the first dies). The circuit lasts until both transistors are dead. What random variable can model the lifetime of this circuit? Give the parameters of its PDF.

2. For a Uniform(1,3) random variable:

(a) [2 marks] Find E[1/X²].

(b) [2 marks] Find E[4X − 5].

C.6 Quiz Number 6

1. X is a Gaussian random variable with μ = 8, σ² = 16. Answer the following (leave your answers in Φ(·) form with positive arguments).

(a) [1 mark] Find P[12 < X < 16].

(b) [2 marks] Find P[X > 0].

(c) [2 marks] Define Y = X/4 + 6. Find the average and variance of Y.

(d) [1 mark] Define W = AX + B. Find A and B such that W is Gaussian with mean zero and variance 4.

2. Consider X ~ Exponential(2) and Y = X³.

(a) [1 mark] Find the range of Y.

(b) [3 marks] Find the PDF of Y.

C.7 Quiz Number 7

1. The joint pdf of two random variables is given as

fX,Y(x, y) = { cx, 0 ≤ y/2 ≤ x ≤ 1; 0, elsewhere }

a) [2 marks] Find c.

b) [2 marks] Find the marginal PDF of X, fX(x). Be sure to consider the whole range −∞ < x < ∞.

c) [2 marks] Find the marginal PDF of Y, fY(y). Be sure to consider the whole range −∞ < y < ∞.

d) [4 marks] Find the PDF of Z = Y/X, fZ(a). Be sure to consider the whole range −∞ < a < ∞.

Appendix D 2010 Quizzes: Solutions


D.1 Quiz Number 1

1. [2 marks] Use the algebra of sets to prove (A − B) − C = A − (B ∪ C).

Solution: (A − B) − C = (A ∩ Bc) ∩ Cc = A ∩ (Bc ∩ Cc) = A ∩ (B ∪ C)c = A − (B ∪ C)

2. A fair die is rolled twice and the sum is recorded.

a) [1 mark] Give the sample space (S) of this experiment.

Solution: S = {2, 3, . . . , 12}

b) [1 mark] Are the outcomes of this sample space equally likely? Explain.

Solution: No. Some outcomes are more likely than others. For example, a sum of 2 occurs only when both die rolls result in 1 (i.e., (x1, x2) = (1, 1)), while 7 occurs for any of the pairs (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), or (6, 1).

c) [1 mark] Let B = {sum is less than or equal to 2}. Find P[B].

Solution: A sum less than or equal to 2 means both die rolls resulted in 1. Since the die rolls are independent, P[B] = P[(x1, x2) = (1, 1)] = P(x1 = 1)P(x2 = 1) = (1/6)(1/6) = 1/36.


3. In a company, 40% of the employees are female. Also, 15% of the male (M) employees and 10% of the female (F) employees hold managerial positions.

a) [2 marks] Let A be the event that a randomly selected employee of this company holds a managerial position. Find P[A].

Solution: P[A] = P[A|F]P[F] + P[A|Fc]P[Fc] = 0.1 · 0.4 + 0.15 · 0.6 = 0.13

b) [1 mark] In part (a), what is the probability that the employee does not have a managerial position?

Solution: P[Ac] = 1 − P[A] = 1 − 0.13 = 0.87

c) [2 marks] A randomly selected employee is found to have a managerial position. What is the probability that this person is female?

Solution: P[F|A] = P[A|F]P[F]/P[A] = (0.1 · 0.4)/0.13 = 4/13 ≈ 0.3077

D.2 Quiz Number 2

1. [5 marks] Events A and B are independent and events A and C are disjoint (mutually exclusive). Let P[A] = 0.2, P[B] = 0.4, and P[C] = 0.1.

a) [1 mark] Find P[A ∪ B].

Solution: P[A ∪ B] = P[A] + P[B] − P[A ∩ B] = P[A] + P[B] − P[A]P[B] = 0.2 + 0.4 − 0.08 = 0.52

b) [1 mark] Find P[A|B].

Solution: P[A|B] = P[A ∩ B]/P[B] = P[A]P[B]/P[B] = 0.08/0.4 = 0.2

c) [1 mark] Find P[Ac ∪ B].

Solution: P[Ac ∪ B] = P[Ac] + P[B] − P[Ac ∩ B] = (1 − P[A]) + P[B] − (1 − P[A])P[B] = 0.8 + 0.4 − 0.8 · 0.4 = 0.88

d) [1 mark] Find P[A|C].

Solution: P[A|C] = P[A ∩ C]/P[C] = P[∅]/P[C] = 0

e) [1 mark] Find P[A ∩ B ∩ C].

Solution: P[A ∩ B ∩ C] = P[(A ∩ C) ∩ B] = P[∅ ∩ B] = P[∅] = 0

2. For this question, you may leave your answers as ratios of C(n, k) terms. From a class of 20 boys and 10 girls a team of 5 is selected.

a) [1 mark] Find the probability that the team consists of 2 boys and 3 girls.

Solution: C(20, 2)C(10, 3)/C(30, 5)

b) [2 marks] Find the probability that the majority of the team members are girls.

Solution: [C(20, 2)C(10, 3) + C(20, 1)C(10, 4) + C(20, 0)C(10, 5)]/C(30, 5)

c) [2 marks] There are 6 students in this class who do not want to be on the team. Find the probability that the randomly chosen team has none of these 6 students.

Solution: C(24, 5)/C(30, 5)

D.3 Quiz Number 3

1. A discrete random variable X has the following probability mass function (PMF):

PX(x) = { A, x = −4; A, x = −1; 0.3, x = 0; 0.3, x = 4; 0, otherwise }

(a) (1 mark) Find A.

Solution: 0.3 + 0.3 + A + A = 1, so A = 0.2.

(b) (3 marks) Sketch the PMF. Find FX(0.5), where FX(·) represents the CDF of X.

Solution: FX(0.5) = 0.2 + 0.2 + 0.3 = 0.7

(c) (1 mark) Find P[0.5 < X ≤ 3].

Solution: There is no mass in (0.5, 3], so P[0.5 < X ≤ 3] = 0.

(d) (1 mark) Find P[X > 2].

Solution: P[X > 2] = 0.3

2. (4 marks) A biased coin with P[T] = 0.2 and P[H] = 0.8 is tossed repeatedly. Identify the type of the random variable (for example, X ~ Binomial(10, 0.1)) in each of the following cases.

a) X is the number of tosses before the first H (inclusive).

Solution: Geometric(0.8)

b) X is the number of tosses before the third T.

Solution: Pascal(3, 0.2)

c) X is the number of heads (H) in 5 tosses.

Solution: Binomial(5, 0.8)

d) After the occurrence of the first H, X is the number of extra tosses before the second H (inclusive).

Solution: Geometric(0.8)

D.4 Quiz Number 4

1. The CDF of a random variable is given as

FX(x) = { 0, if x ≤ 0; x²/9, if 0 < x < 3; 1, if 3 ≤ x }

a) [2 marks] Find P[1 < X < 2].

Solution: P[1 < X < 2] = FX(2) − FX(1) = 4/9 − 1/9 = 3/9 = 1/3

b) [2 marks] Find P[X > 1].

Solution: P[X > 1] = 1 − P[X ≤ 1] = 1 − FX(1) = 1 − 1/9 = 8/9

c) [3 marks] Find the pdf of X.

Solution: fX(x) = { 2x/9, 0 < x < 3; 0, otherwise }

d) [3 marks] Find E[X].

Solution: E[X] = ∫ x fX(x) dx = ∫ from 0 to 3 of x · (2x/9) dx = (2x³/27) evaluated from 0 to 3 = 2
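The expectation integral can be checked with a simple midpoint Riemann sum; this snippet is an illustrative addition, not part of the quiz solution:

```python
# Midpoint-rule approximation of E[X] = integral of x * (2x/9) over (0, 3).
n = 100_000
dx = 3.0 / n
ex = 0.0
for i in range(n):
    x = (i + 0.5) * dx          # midpoint of the i-th subinterval
    ex += x * (2.0 * x / 9.0) * dx

print(round(ex, 6))  # 2.0
```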

D.5 Quiz Number 5

1. The lifetime of a transistor in years is modeled as Exponential(0.2). Answer the following:

(a) [1 mark] Find the average lifetime.

Solution: 1/λ = 1/0.2 = 5 years

(b) [2 marks] Find the probability that it lasts longer than 3 years.

Solution: P[T > 3] = e^(−0.2·3) = e^(−0.6)

(c) [2 marks] Given that it has lasted 5 years, what is the probability that it lasts for another 3 years?

Solution: By the memoryless property, P[T > 8 | T > 5] = P[T > 3] = e^(−0.6).

(d) [1 mark] An EE has designed a circuit using two of the above transistors such that one is active and the second is a spare (i.e., the second becomes active when the first dies). The circuit lasts until both transistors are dead. What random variable can model the lifetime of this circuit? Give the parameters of its PDF.

Solution: Erlang(2, 0.2)

2. For a Uniform(1,3) random variable:

(a) [2 marks] Find E[1/X²].

Solution: E[1/X²] = ∫ from 1 to 3 of (1/x²)(1/2) dx = (−1/(2x)) evaluated from 1 to 3 = 1/2 − 1/6 = 1/3

(b) [2 marks] Find E[4X − 5].

Solution: E[4X − 5] = 4E[X] − 5 = 4 · 2 − 5 = 3

D.6 Quiz Number 6

1. X is a Gaussian random variable with μ = 8, σ² = 16. Answer the following (leave your answers in Φ(·) form with positive arguments).

(a) [1 mark] Find P[12 < X < 16].

Solution: P[12 < X < 16] = P[(12 − 8)/4 < Z < (16 − 8)/4] = Φ(2) − Φ(1)

(b) [2 marks] Find P[X > 0].

Solution: P[X > 0] = 1 − P[X ≤ 0] = 1 − P[Z ≤ (0 − 8)/4] = 1 − Φ(−2) = Φ(2)

(c) [2 marks] Define Y = X/4 + 6. Find the average and variance of Y.

Solution: E[Y] = E[X]/4 + 6 = 8; VAR[Y] = VAR[X]/16 = 1

(d) [1 mark] Define W = AX + B. Find A and B such that W is Gaussian with mean zero and variance 4.

Solution: VAR[W] = A² VAR[X] = 4 gives A = 1/2; E[W] = A E[X] + B = 0 gives B = −4.

2. Consider X ~ Exponential(2) and Y = X³.

(a) [1 mark] Find the range of Y.

Solution: Since the range of X is X ≥ 0, the range of Y is Y ≥ 0.

(b) [3 marks] Find the PDF of Y.

Solution: With g(x) = x³, g′(x) = 3x² and g⁻¹(y) = y^(1/3),

fY(y) = fX(x)/|g′(x)| evaluated at x = g⁻¹(y) = (2/3) y^(−2/3) e^(−2 y^(1/3)), for y ≥ 0
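The change-of-variables result implies FY(y) = FX(y^(1/3)) = 1 − e^(−2 y^(1/3)), which a Monte Carlo sketch (an illustrative addition) confirms:

```python
import math
import random

# If X ~ Exponential(2) and Y = X**3, then F_Y(y) = 1 - exp(-2 * y**(1/3)).
random.seed(2)
n = 200_000
ys = [random.expovariate(2.0) ** 3 for _ in range(n)]

y0 = 0.5
empirical = sum(y <= y0 for y in ys) / n
analytic = 1.0 - math.exp(-2.0 * y0 ** (1.0 / 3.0))

print(abs(empirical - analytic) < 0.01)  # True
```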

D.7 Quiz Number 7

1. The joint pdf of two random variables is given as

fX,Y(x, y) = { cx, 0 ≤ y/2 ≤ x ≤ 1; 0, elsewhere }

a) [2 marks] Find c.

Solution: ∫ from x = 0 to 1 of ∫ from y = 0 to 2x of cx dy dx = ∫ from 0 to 1 of 2cx² dx = 2c/3 = 1, so c = 3/2.

b) [2 marks] Find the marginal PDF of X, fX(x). Be sure to consider the whole range −∞ < x < ∞.

Solution: fX(x) = { ∫ from 0 to 2x of (3/2)x dy = 3x², 0 < x < 1; 0, otherwise }

c) [2 marks] Find the marginal PDF of Y, fY(y). Be sure to consider the whole range −∞ < y < ∞.

Solution: fY(y) = { ∫ from y/2 to 1 of (3/2)x dx = 3/4 − 3y²/16, 0 < y < 2; 0, otherwise }

d) [4 marks] Find the PDF of Z = Y/X, fZ(a). Be sure to consider the whole range −∞ < a < ∞.

Solution: For 0 < a < 2,

FZ(a) = P[Z ≤ a] = P[Y ≤ aX] = ∫ from x = 0 to 1 of ∫ from y = 0 to ax of (3/2)x dy dx = ∫ from 0 to 1 of (3/2)ax² dx = a/2.

For a ≥ 2, FZ(a) = 1, and for a ≤ 0, FZ(a) = 0. Hence

FZ(a) = { 0, a ≤ 0; a/2, 0 < a < 2; 1, a ≥ 2 }   and   fZ(a) = { 1/2, 0 < a < 2; 0, otherwise }
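A Monte Carlo sketch (not part of the original solution) agrees with Z being Uniform(0, 2). Under this joint PDF the marginal is fX(x) = 3x² (CDF x³, so inverse-CDF sampling gives X = U^(1/3)), and given X = x, Y is uniform on (0, 2x):

```python
import random

# Sample (X, Y) from f(x, y) = (3/2) x on 0 <= y <= 2x <= 2, then form Z = Y/X.
random.seed(3)
n = 200_000
zs = []
for _ in range(n):
    x = random.random() ** (1.0 / 3.0)   # inverse CDF of f_X(x) = 3x^2
    y = random.uniform(0.0, 2.0 * x)     # Y | X = x ~ Uniform(0, 2x)
    zs.append(y / x)

empirical = sum(z <= 1.0 for z in zs) / n
print(abs(empirical - 0.5) < 0.01)   # True, matching F_Z(1) = 1/2
```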

Appendix E 2011 Quizzes


E.1 Quiz Number 1

Quiz #1, EE 387, Time: 15 min

1. [2 marks] Let P[A] = P[B] = 0.2 and P[A ∪ B] = 0.3. Find P[(A ∩ B)c].

2. [2 marks] Let P[A] = 0.2, P[B] = 0.3, and let A and B be mutually exclusive. Find P[A|Bc].

4. Consider couples adhering to the following family-planning policy: each couple stops when they have a child of each sex, or when they have 3 children. Consider a collection of such families and use the notation B = boy and G = girl. You pick such a family and observe the kids in that family. For example, one possible outcome is GB (i.e., younger girl and older boy).

a) [1 mark] Give the sample space (S) of this experiment.

b) [1 mark] Are all outcomes in S equally likely? Briefly justify your answer.

5. In a cosmetics store, 30% of the products are for males and 70% for females. Also, 50% of the male products are hair-care products, whereas 20% of the female products are hair-care products.

a) [2 marks] Let A be the event that a randomly selected product of this store is a hair-care product. Find P[A].

c) [2 marks] A randomly selected product is observed not to be a hair-care product. What is the probability that it is a male product?


E.2 Quiz Number 2

Quiz #2, EE 387, Time: 15 min
1. [4 marks] Events A and B are independent and events A and C are disjoint (mutually exclusive). Let P[A] = 0.2, P[B] = 0.4, and P[C] = 0.1. Please answer the following parts:

a) [1 mark] Find P[A ∪ Bc].

b) [1 mark] Find P[Ac|B].

d) [1 mark] Find P[A ∪ C|B].

e) [1 mark] Find P[A ∩ B ∩ Cc].

2. [2 marks] A student goes to EE387 class on a snowy day with probability 0.4, but on a non-snowy day attends with probability 0.7. Suppose that 20% of the days in March are snowy in Edmonton. What is the probability that it snowed on March 10 given that the student was in class on that day?

3. For this question, you may leave your answers as ratios of C(n, k) terms.

From a class of 20 boys and 10 girls a team of 5 is randomly selected for a competition. a) [1 mark] Find the probability that the team consists of at least one boy and one girl.

b) [1 mark] Find the probability that the team has an odd number of girls.

c) [1 mark] Three students in this class cannot attend the competition. Find the probability that the randomly chosen team has none of these 3 students.

E.3 Quiz Number 3

Quiz #3, EE 387, Time: 15 min

1. A discrete random variable X has the following probability mass function (PMF):

PX(x) = { A, x = −2, −1; A/2, x = 0, 1; 0, otherwise }

(a) (1 mark) Find A.

(b) (1 mark) Sketch the PMF.

(c) (1 mark) Find P[0.5 < X ≤ 3].

2. (3 marks) The PMF of random variable X is given by PX(x) = 1/3 for x = 1, 2, 3 and PX(x) = 0 for all other x. Derive the CDF FX(a) = P[X ≤ a] for all values of a. Sketch the CDF.

3. (4 marks) Mary writes a 20-question multiple-choice examination in Chemistry 101. Each question has 5 answers. Because she has skipped many lectures, she must guess randomly. Suppose X is the number of questions that she gets right.

a) (2 marks) Write down the PMF of X.

b) (1 mark) What is the probability that she gets 19 or more questions right?

c) (1 mark) What is the probability that she gets no questions right?

E.4 Quiz Number 4

Quiz #4, EE 387, Time: 15 min

1. A discrete random variable X has the following probability mass function (PMF):

PX(x) = { 0.1, x = −3, −2, −1; 0.2, x = 0, 1; 0.3, x = 2; 0, otherwise }

(a) (2 marks) Find the PMF of W = |X|.

(b) (2 marks) Find the PMF of Y = X² + 1.

(c) (2 marks) Find the PMF of Z = 1 − X.

(d) (2 marks) Find E[X² + 1].

(e) (2 marks) A fair coin is tossed three times. Let X be the number of Heads. Find E[X], the expected value of X.

E.5 Quiz Number 5

Quiz #5, EE 387, Time: 15 min

1. The wealth of an individual in a country is related to the continuous random variable X, which has the following cumulative distribution function (CDF):

FX(x) = { 0, x < 1; 1 − x^(−4), x ≥ 1 }

(a) (2 marks) Derive the PDF of X.

(b) (2 marks) Find the mean of X, E[X].

(c) (2 marks) Find the mean square of X, E[X²]. Find the variance of X.

(d) (2 marks) Suppose the wealth measured in dollars is given by Y = 10000X + 2000. Find the mean wealth E[Y] and standard deviation STD[Y].

(e) (2 marks) What percentage of individuals have wealth exceeding $22,000?


E.6 Quiz Number 6

Quiz #6, EE 387, Time: 15 min

1. X is a Gaussian random variable with μ = 10, σ² = 4. Answer the following (leave your answers in Φ(·) form with positive arguments).

(a) [2 marks] Find P[12 < X < 16].

(b) [1 mark] Find P[X > 0].

2. The time T in minutes between two successive bus arrivals at a bus stop is Exponential(0.2).

(a) [1 mark] When you just arrive at the bus stop, what is the probability that you have to wait for more than 5 minutes?

(b) [1 mark] What is the average value of your waiting time?

(c) [1 mark] You are waiting for a bus, and no bus has arrived in the past 2 minutes. You decide to go to the adjacent coffee shop to grab a coffee. It takes you 5 minutes to grab your coffee and be back at the bus stop. Determine the probability that you will not miss the bus.

3. [4 marks] You borrow your friend's car to drive to Hinton to see your significant other. The driving distance is 100 km. The gas gauge is broken, so you don't know how much gas is in the car. The tank holds 40 liters and the car gets 15 km per liter, so you decide to take a chance.

(a) [2 marks] Suppose X is the distance (km) that you can drive until the car runs out of gas. Out of the Uniform, Exponential and Gaussian PDFs, which one is most suitable for modeling X? Briefly justify your choice. Use your choice with the appropriate parameters to answer the following questions.

(b) [1 mark] What is the probability that you make it to Hinton without running out of gas?

(c) [1 mark] If you don't run out of gas on the way, what is the probability that you will not run out of gas on the way back if you decide to take a chance again?

Appendix F 2011 Quizzes: Solutions


F.1 Quiz Number 1

Quiz #1, EE 387, Time: 15 min

1. [2 marks] Let P[A] = P[B] = 0.2 and P[A ∪ B] = 0.3. Find P[(A ∩ B)c].

Solution: P[A ∩ B] = P[A] + P[B] − P[A ∪ B] = 0.2 + 0.2 − 0.3 = 0.1, so P[(A ∩ B)c] = 1 − P[A ∩ B] = 0.9.

2. [2 marks] Let P[A] = 0.2, P[B] = 0.3, and let A and B be mutually exclusive. Find P[A|Bc].

Solution: P[A|Bc] = P[A ∩ Bc]/P[Bc] = P[A]/P[Bc] = 0.2/0.7. Note that P[A] = P[A ∩ B] + P[A ∩ Bc] and P[A ∩ B] = 0.

4. Consider couples adhering to the following family-planning policy: each couple stops when they have a child of each sex, or when they have 3 children. Consider a collection of such families and use the notation B = boy and G = girl. You pick such a family and observe the kids in that family. For example, one possible outcome is GB (i.e., younger girl and older boy).

a) [1 mark] Give the sample space (S) of this experiment.

Solution: S = {GB, BG, GGB, BBG, GGG, BBB}.

b) [1 mark] Are all outcomes in S equally likely? Briefly justify your answer.

Solution: No. Clearly GB is more likely than GGB.

5. In a cosmetics store, 30% of the products are for males and 70% for females. Also, 50% of the male products are hair-care products, whereas 20% of the female products are hair-care products.

a) [2 marks] Let A be the event that a randomly selected product of this store is a hair-care product. Find P[A].

Solution: P[A] = P[A|M]P[M] + P[A|F]P[F] = 0.5 · 0.3 + 0.2 · 0.7 = 0.29.

c) [2 marks] A randomly selected product is observed not to be a hair-care product. What is the probability that it is a male product?

Solution: P[M|Ac] = P[Ac ∩ M]/P[Ac] = (0.5 · 0.3)/(1 − 0.29) = 0.15/0.71 ≈ 0.211.

F.2 Quiz Number 2

Quiz #2, EE 387, Time: 15 min

1. [2 marks] Events A and B are independent. Also P[A] = 0.2 and P[B] = 0.4.

a) [1 mark] Find P[A ∪ Bc].

Solution: P[A ∪ Bc] = P[A] + P[Bc] − P[A ∩ Bc] = 0.2 + (1 − 0.4) − 0.2(1 − 0.4) = 0.68.

b) [1 mark] Find P[Ac|B].

Solution: By independence, P[Ac|B] = P[Ac] = 0.8.

2. [2 marks] A student goes to EE387 class on a snowy day with probability 0.4, but on a non-snowy day attends with probability 0.7. Suppose that 20% of the days in March are snowy in Edmonton. What is the probability that it snowed on March 10 given that the student was in class on that day?

Solution: P[C|S] = 0.4, P[C|Sc] = 0.7, P[S] = 0.2.

P[C] = P[C|S]P[S] + P[C|Sc]P[Sc] = 0.4 · 0.2 + 0.7 · 0.8 = 0.64
P[S|C] = P[C|S]P[S]/P[C] = (0.4 · 0.2)/0.64 = 0.125.
3. [6 marks] For this question, you may leave your answers as ratios of C(n, k) terms.

From a class of 20 boys and 10 girls a team of 5 is randomly selected for a competition.

a) [2 marks] Find the probability that the team consists of at least one boy and one girl.

Solution: Subtract the all-boy and all-girl teams: [C(30, 5) − C(20, 5) − C(10, 5)]/C(30, 5)

b) [2 marks] Find the probability that the team has an odd number of girls.

Solution: The team can have 1, 3, or 5 girls: [C(10, 1)C(20, 4) + C(10, 3)C(20, 2) + C(10, 5)C(20, 0)]/C(30, 5)

c) [2 marks] Three students in this class cannot attend the competition. Find the probability that the randomly chosen team has none of these 3 students.

Solution: C(27, 5)/C(30, 5)

F.3 Quiz Number 3

Quiz #3, EE 387, Time: 15 min

1. A discrete random variable X has the following probability mass function (PMF):

PX(x) = { A, x = −2, −1; A/2, x = 0, 1; 0, otherwise }

(a) (1 mark) Find A.

Solution: 2A + 2(A/2) = 1, so A = 1/3.

(b) (1 mark) Sketch the PMF.

Solution: The PMF has mass 1/3 at x = −2 and x = −1, and mass 1/6 at x = 0 and x = 1. (Sketch omitted.)

(c) (1 mark) Find P[0.5 < X ≤ 3].

Solution: P[0.5 < X ≤ 3] = P[X = 1] = 1/6.

2. (3 marks) The PMF of random variable X is given by PX(x) = 1/3 for x = 1, 2, 3 and PX(x) = 0 for all other x. Derive the CDF FX(a) = P[X ≤ a] for all values of a. Sketch the CDF.

Solution: FX(a) = { 0, a < 1; 1/3, 1 ≤ a < 2; 2/3, 2 ≤ a < 3; 1, a ≥ 3 }. The CDF is a staircase with jumps of 1/3 at a = 1, 2, 3. (Sketch omitted.)

3. (4 marks) Mary writes a 20-question multiple-choice examination in Chemistry 101. Each question has 5 answers. Because she has skipped many lectures, she must guess randomly. Suppose X is the number of questions that she gets right.

a) (2 marks) Write down the PMF of X.

Solution: PX(i) = C(20, i) (1/5)^i (4/5)^(20 − i), i = 0, 1, . . . , 20.

b) (1 mark) What is the probability that she gets 19 or more questions right?

Solution: C(20, 19)(1/5)^19(4/5) + C(20, 20)(1/5)^20 = 81/5^20.

c) (1 mark) What is the probability that she gets no questions right?

Solution: C(20, 0)(1/5)^0(4/5)^20 = (4/5)^20.
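The binomial answers above can be verified numerically; this check is an illustrative addition:

```python
from math import comb

# X ~ Binomial(20, 1/5): number of correct random guesses out of 20.
p = 1.0 / 5.0

def pmf(i):
    return comb(20, i) * p**i * (1 - p) ** (20 - i)

p_19_plus = pmf(19) + pmf(20)   # analytic: 81 / 5**20
p_zero = pmf(0)                 # analytic: (4/5)**20

print(abs(p_19_plus - 81 / 5**20) < 1e-20)   # True
print(abs(p_zero - (4 / 5) ** 20) < 1e-12)   # True
```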

F.4 Quiz Number 5

Quiz #5, EE 387, Time: 15 min

1. The wealth of an individual in a country is related to the continuous random variable X, which has the following cumulative distribution function (CDF)

FX (x) =

1 0

1 1x< x4 x 1.

(a) (2 marks) Derive the PDF of X. 4 dFX (x) 1x< 5 fX (x) = = x dx 0 x 1. (b) (2 marks) Find the mean of X, E[X].

E[X] =

xfX (x)dx =
4 1

4 dx x5

= 4 = 3

4 4x dx = x3 | 1 3

(c) (2 marks) Find the mean square of X, E[X 2 ]. Find the variance of X.

E[X ] =

x fX (x)dx =

4x3 dx

= 2x2 | 1 =2 4 2 VAR[X] = E[X 2 ] E 2 [X] = 2 ( ) 3 2 = 9 (d) (2 marks) Suppose the wealth measured in dollars is given by Y = 10000X + 2000. Find the mean wealth E[Y ] and standard deviation STD[Y ].

156

2011 Quizzes: Solutions

E[Y ] = E[10000X + 2000] = 10000E[X] + 2000 = 40000/3 + 2000 2 VAR[Y] = 100002 VAR[X] = ( )108 9 STD[Y] = 10000STD[X] = VAR[Y] = 4714 (e) (2 marks) What is the percentage of the individuals whose income exceed 22000$?

P[Y > 22000] = P[10000X + 2000 > 22000] = P[X > 2] = 1 − F_X(2) = 1 − (1 − 1/16) = 1/16 = 6.25%.
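Parts (b) through (e) can be cross-checked numerically (a sketch of my own, not part of the quiz solution; the midpoint Riemann sum and the truncation of the integral at x = 1000 are my choices — the neglected tail of f_X(x) = 4/x^5 beyond 1000 is negligible):

```python
from math import sqrt

# Midpoint Riemann sum of g on [a, b], used to approximate the moments of X.
def riemann(g, a, b, n=200_000):
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

f = lambda x: 4 / x**5                           # PDF of X for x >= 1

EX = riemann(lambda x: x * f(x), 1, 1000)        # part (b): ~ 4/3
EX2 = riemann(lambda x: x * x * f(x), 1, 1000)   # part (c): ~ 2
VARX = EX2 - EX**2                               # part (c): ~ 2/9

# Part (d): linear transform Y = 10000 X + 2000
EY = 10000 * EX + 2000                           # ~ 15333
STDY = 10000 * sqrt(VARX)                        # ~ 4714

# Part (e): P[Y > 22000] = P[X > 2] = 1 - F_X(2)
p_tail = 1 - (1 - 1 / 2**4)                      # = 1/16 = 6.25%
```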


F.5  Quiz Number 6
Quiz #6, EE 387, Time: 15 min. Last Name: ________  First Name: ________

1. X is a Gaussian random variable with μ = 10, σ^2 = 4. Answer the following (leave your answers in Φ(·) form with positive argument).

(a) [2 marks] Find P[12 < X < 16].

Solution: Standardizing with Z = (X − 10)/2,
P[12 < X < 16] = P[(12 − 10)/2 < (X − 10)/2 < (16 − 10)/2] = P[1 < Z < 3] = Φ(3) − Φ(1).

(b) [1 mark] Find P[X > 0].

Solution:
P[X > 0] = P[(X − 10)/2 > (0 − 10)/2] = P[Z > −5] = 1 − Φ(−5) = 1 − (1 − Φ(5)) = Φ(5).
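The Φ(·) answers can be evaluated numerically; the standard normal CDF is expressible through the error function in Python's standard library (a sketch of my own; the `phi` helper is my naming):

```python
from math import erf, sqrt

# Standard normal CDF Phi(z) = (1/2)(1 + erf(z / sqrt(2))), stdlib only.
def phi(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

p_a = phi(3) - phi(1)   # part (a): P[12 < X < 16], about 0.157
p_b = phi(5)            # part (b): P[X > 0], essentially 1
```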

2. The time T in minutes between two successive bus arrivals at a bus stop is Exponential(λ = 0.2).

(a) [1 mark] When you just arrive at the bus stop, what is the probability that you have to wait for more than 5 minutes?

Solution:
P[T > 5] = e^(−5λ) = e^(−0.2·5) = e^(−1).

(b) [1 mark] What is the average value of your waiting time?

Solution:
E[T] = 1/λ = 5 minutes.
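These exponential facts can be checked in a few lines (a sketch of my own; the `survival` helper is my naming, with λ = 0.2 as in the problem):

```python
from math import exp

lam = 0.2  # rate of the Exponential(0.2) inter-arrival time T

def survival(t):
    # P[T > t] = exp(-lam * t) for t >= 0
    return exp(-lam * t)

p_wait = survival(5)   # part (a): e**-1, about 0.368
mean_T = 1 / lam       # part (b): E[T] = 5 minutes

# Memoryless property of the exponential: P[T > s + t | T > s] = P[T > t]
assert abs(survival(7) / survival(2) - survival(5)) < 1e-12
```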


(c) [1 mark] You are waiting for a bus, and no bus has arrived in the past 2 minutes. You decide to go to the adjacent coffee shop to grab a coffee. It takes you 5 minutes to grab your coffee and be back at the bus stop. Determine the probability that you will not miss the bus.

Solution: By the memoryless property of the exponential distribution,
P[T > 7 | T > 2] = P[T > 5] = e^(−1).

3. [4 marks] You borrow your friend's car to drive to Hinton to see your significant other. The driving distance is 100 km. The gas gauge is broken, so you don't know how much gas is in the car. The tank holds 40 liters and the car gets 15 km per liter, so you decided to take a chance.

(a) [2 marks] Suppose X is the distance (km) that you can drive until the car runs out of gas. Of the Uniform, Exponential and Gaussian PDFs, which one is most suitable for modeling X? Briefly justify your choice. Use your choice with the appropriate parameters to answer the following questions.

Solution: First note that X is bounded: the tank holds at most 40 liters and the car gets 15 km per liter, so X can never exceed 40 × 15 = 600 km, and any value larger than 600 must have zero probability. Moreover, since there is no information about how much gas is in the tank, every distance between 0 and 600 should be equally likely. Hence X ~ Uniform(0, 600).

(b) [1 mark] What is the probability that you make it to Hinton without running out of gas?

Solution:
P[X > 100] = 1 − P[X ≤ 100] = 1 − 100/600 = 5/6.

(c) [1 mark] If you don't run out of gas on the way, what is the probability that you will not run out of gas on the way back if you decide to take a chance again?

Solution:
P[X > 200 | X > 100] = P[(X > 200) ∩ (X > 100)] / P[X > 100] = P[X > 200] / P[X > 100] = (400/600)/(500/600) = 4/5.
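The uniform-model answers can be confirmed in a few lines (a sketch of my own; `p_exceeds` is a helper name I introduce):

```python
# Gas-gauge problem, assuming X ~ Uniform(0, 600) km of driving range.
def p_exceeds(d, lo=0.0, hi=600.0):
    # P[X > d] for a Uniform(lo, hi) random variable, lo <= d <= hi
    return (hi - d) / (hi - lo)

p_make_it = p_exceeds(100)                        # part (b): 5/6
# Part (c): conditional probability P[X > 200 | X > 100]
p_round_trip = p_exceeds(200) / p_exceeds(100)    # 4/5
```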