SPRINGER TEXTS IN STATISTICS JIM PITMAN PROBABILITY Instructor’s Manual Springer-Verlag New York Berlin Heidelberg © 1993 Springer-Verlag New York, Inc. All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the Publisher: Springer-Verlag New York, Inc., 175 Fifth Avenue, New York, NY 10010, USA.-Use in connection with any form of information storage and retrieval, electronic adaption, computer software, or by similar or dissimilar methodology now known or hereafter is forbidden. H ©1993 Springer-Verlag New York. Inc. Section 1.1 1. a) 2/3 b) 65.67% ) 0.6667 4) 4/7) 87.14% f) 0.5714 2. a) 7/30 4/10 = 04 3. a) Ifthe tickets are drawn with replacement, then, as in Example 3. there are n? equally likely outcomes, There is just one pair in which the frst number is 1 and the second number is 2, 50 P(fist ticket is 1 and second ticket is 2) = 1/n?. 'b) The event (the numbers on the two tickets are consecutive integers) consists of n — 1 outcomes: (1.2), @.3), «+ (1 1.n). So its probability is (n —1)/n°, ©) Same as Problem 3 of Example 3. Answer: (1 ~ 1/n)/2 7 b)4/10=0. 4) If the draws are made without replacement, then there ate only n? ~ m equally likely pos ble outcomes, since we have to exclude the outcomes (1,1), (2,2). .-» (non). So replace the denominators in a) through ¢) by n(n — 1) 4 a) 238 b) 1 - (2/38) = 36/38 6) 1, since P(both win) 5. a) #{total) = 52 x 51 = 2652 possibilities. 2b) list card ace) = 4 x 51 = 204, Thus (first card ace) = 4/52 = 1/13 €) This will be exactly the same calculation as in part b) — just substitute “second” for “frst®. ‘Thos, P(second card ace) = 1/13. Because you have all possible ordered pairs of cards. any probability statement concerning the fist card by itself must also be true for the second card by itself, 4) P(both aces) (both aces} tots ¢) Plat least one ace) = P(first card ace)4+P(second card ace)~P(both cards aces) = + — a 6. a) 82x52= 2704 b) (52 x 4)/(82 x 82) = 1/13 ¢) Same as b) d) (4 x 4)/(62 x 52) = 1/169 @) Plat least one ace) = P(first card ace)+P(second card ace)~ P(both cards aces) P(both dice < 2) = 4/36 = 1/9 b) P(maximum < 3) = P(both dice < 3) = 9/36 = 1/4 ©) P(maximum P(maximum < 3) — P(maximum e el ie een ) Probability 1/36 | 3736 | 3736 | 7736 | 9/36 | 11/36 €) Since this covers all the possible outcomes of the experiment, and these events are mutually exclusive, you should expect P(1) + P(2) + P(3) + P(A) + P(S) + P(6) = 1 a) Plmaximum < 2) 8, acd) Asin Example 3, the outcome space consists of n? equally likely pairs of numbers, each number between 1 and n. The event (the maximum of the two numbers is less than or equal to z) is represented by the set of pairs having both entries less than or equal to z. There are =? possible pairs of this type, so for ¢ = 0 ton: P(maximum $B. So, no matter w! horse wins (and regardless of any probabilities), you get back more than you bet. A golden ‘opportunity rarely found at the races! 8) Overall gain = (88) x 10 ~ ($1) x 90 = ~$10. 'b) Average gain per bet = (overall gain) / (# bets) c) Gambler's average gain per bet = (overall gxin) / (if bets) = fae X (wins) ~ 1 x (flosses) _ rpay — ‘#wins losses Tre” ‘The house's average financial gain per bet is therefore (rg ~ rpay)/(r@ + 1) over this sequence of bets. Recall that if the fair (chance) odds against the gambler winning are rjair to 1, then the house percentage is (rjair~Fpay)/(P4air +1) x 100%. 
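The maximum-of-two-dice distribution worked out in Exercises 7 and 8 above, P(maximum = x) = (2x - 1)/n^2, can be confirmed by brute-force enumeration of the n^2 equally likely pairs. Below is a minimal Python sketch (added here as a check, not part of the original solutions); the function name and the choice n = 6 are illustrative.

    from itertools import product
    from fractions import Fraction

    def max_distribution(n=6):
        """Distribution of the maximum of two independent rolls of a fair n-sided die."""
        pairs = list(product(range(1, n + 1), repeat=2))   # the n^2 equally likely outcomes
        return {x: Fraction(sum(1 for a, b in pairs if max(a, b) == x), n * n)
                for x in range(1, n + 1)}

    dist = max_distribution(6)
    for x, p in dist.items():
        assert p == Fraction(2 * x - 1, 36)   # agrees with P(max = x) = (2x - 1)/n^2
    print(dist)   # the probabilities 1/36, 3/36, 5/36, 7/36, 9/36, 11/36 (reduced to lowest terms)

The same enumeration, with the max condition replaced by "both entries equal" or "entries are consecutive", reproduces the counts used in Exercise 3.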
As the number of bets made increases, we should expect (according to the feequency interpretation of probability) r@ to approach rjair, and therefore the house's average financial gain per bet should approach the house percentage, So we can interpret the house percentage as the long run average financial gain on a one dollar bet Section 1.3 Section 5 10, 1.3 ‘The cake is (2 x2) +2-+1= thmes as large as the piece your neighbor gets, so you get 4/7 of the cake, a) (AB*)U(A*B) ») ‘BSC ) Exactly one : (ABSC*)U(A®BO*)U(ASB°C) Exactly two : (ABO") U(AB‘C)U(A°BC) ‘Three: ABC Take © = (1,2, 800}. 3) (07,93,202) ) (07,93, 202, 4,101, 102, 396)* ©) {16,18, 92, 94,201,203) a) Yes: (0,1) b) Yes: (1) ©) No, This is a subset of the event {1) but itis not identical to (1) because the event (first toss tails, second toss heads) also is a subset of (2). a) Yes: (1,2) 2) first coin lands heads ») second coin lands tals ©) same as.) 4) at least two heads ©) exactly two tails 1) first two coins land the same way. outcome | 1 2 4 6 °) “probability [17101753710 —1/8 1710 1/10 by outcome | 123 ) probability [P3/8 175 17s a) As in Example 3, using the addition rulg and the symmetry assumption, P(2) = P(6) = p/2 and P(2) = P(3) = P(A) = POS) = (1 pia b) Use the additivity property: P(8 or 4 or 5 of 6) = =A) 4 = See a) P(AUB) = P(A) + P(B) ~ P(AB) = 0.6 +0402 =08 b) P(A) =1- P(A) = 04 ©) Similarly P(B*) = 0. 4) P(ASB) = P(B) ~ P(AB) =04~0.2=02 «) PAU BS) 1) P(ASBS .a)09 I cor a) P(exactly 2 of A, B, C) = P(ABC") + P(AB‘C) + P(ASBC) = (P(AB) ~ P(ABC)) + (P(AC) ~ P(ABC)) + (P(BC) — P(ABC)) = P(AB) + P(AC) + P(BC) ~ aP(ABC) 4 Section 1.3 b) P(exactly 1 of A, B, C) = P(AB‘C*) + P(A°BC) + P(ASB°C) = P(A) ~ P(An(BUC)) + P(B) - P(BA(AUC)) + P(C) — P(C(AUB)) P(A) + P(B) + P(C) ~ 2(P(AB) + P(AC) + P(BC)) +3P(ABC) ©) Plexactly none) = 1 = P(AU BUC) — (P(A) + P(B) + P(C)) + (P(AB) + P(AC) + P(BC)) ~ P(ABC) 11. By inclusion-exclusion for n P(AUBUC) = P((AUB)UC] = P(AUB) + P(C) ~ PI(AV B)C] = P(A) + P(B) — P(AB) + P(C) — P{(AU B)C}. Now (AUB)C = (AC)U(BC), which is clear from a Venn diagram. So by another application of inclusion-exelusion for n= P((AU B)C] = PAC) U(BC)] = P(AC) + P(BC) - P(AC)(BC)) = P(AC) + P(BC) ~ P(ABC) Substitute this into the previous expression. 12. Formula has been proved for n = 1,2,3. Proceed by mathematical induction, We should first prove by another mathematical induction using the m = 2 case: (AU B)C = hat our hypothesis is true for n. Then Ai) = PUPA) U Ants] = P(Uker Ad) + PlAnes) = Pia (AsAnss)] (by inclusion-exclusion for n = 2) = Dray aA + SL PUALASAR) —-- (Sen icheten 4P(Ang1) — Yo P(AtAnst) + P(AtAGAng) ~ ++ ‘m sn (by applying the induction hypothesis to the frst and third term) PIA)- P(A‘) + YO P(AiAsAL) - icjgnen sesceengs (by regrouping terms.) So the claim holds for n + 1 13, For m = 2, P(A: UAs) = P(As) + P(da) ~ PCAs 9 Aa) < PCAs) + PCAs) Now use induction by ssruming true for m P (u a)=r ( 14. Use he ideniy P(AU 15. Let A= BE. Then Us] Aes)

1 and sets As, S(m; Ai An) SPA) = FE Pla) ee Play Ana) An are n sets then (-1)" (PULTAR) = Som; Ans es An)) 2 0 ‘Use mathematical induction, The claim holds for m = 1 (by a)) (and also for m by b)). Say the claim holds for m. Now ehow that this implies the claim holds for m +1, ie., that: Subelai (m is fixed) For each m > 1: If At,.++,Am are m sets then (ay (PUTA) ~ S(m 415 An, An) 2 0. Proof of subelaim: Again use mathematical induction. This subclaim holds for n = 1,....m because the inclusion-exclusion formula (Exercise 12) shows that the left side equals zero. Sup- ppose the claim holds for n. Now show that this implies the subelaim holds for n+ 1, ive., that (Cy (PEUT# Ad — Sm + 154s ‘To do this, fist observe that inta)) 20. (Oo) Som 41, Atyoos An) # PlAnas) = Slo; Ay Ants sees AnAnss) = S(m 41d Ansa): ‘Therefore (1! (P(UZt Aa) — Sn 415 Ans... Antt)) (=1)™4" (PUPAL) + PlAnsi) = P(UPAAnds) — Sm 415 Atseos Ana) (by inclusion exclusion for 2 sets) = (0 (PUTA) = PUPA Angi) + Sm Arne (by identity (64)) my" (PUPAL) = S(m $1; Ai. An) #(=1)" (PUUTALA mgt) = Sm ArAnsiyees Annes) ‘The first term is nonnegative by the induction hypothesis on n; the second term is nonnegative by the induction hypothesis on m. So (+) holds, and we'te done! vAnAngs) = Stim +1 Atye-- An) Section 1.4 Section 1.4 1. Let x denote the proportion of women in the population. Then the proportion r of righthanders is given by 1 = 92x x +.88 x (I~ x) = 88+ 04x. » 6 b) (i) :since 0 < <1 implies .88 91. since * is unknown, 2. Pick a light bulb at random. Let D be the event (the light bulb is defective), and be the event (the light bulb is made in city B). Then P(B) =1/3, P(D|B) =.01, and P(D‘B) = P(D‘|B)P(B) = (1 — P(D|B)|P(B) = 0.99 x 1/3 = 0.33 = Plesin today and tomorrow) _ 30% _ 3 = Gain today) = 0% = 4 3. P(rain tomorrow | rain today) 4. Call the events A (having probability 1) and B (having probability .3). a) P(ASBS) = P(AS)P(BS) = 0.9 x 0.7 = 0.63 b) 1- P(ASBS) = 0.87 ©) P(ABSU AB) = P(ABS) + P(ASB) = 0.1 x 0.7 40.9 x 0.3 = 0.34 5. a) Let Uy = (urn 1 chosen), Up = (urn 2 chosen), B = (black ball chosen), W = (white ball chosen). 2 B i 3 2 S > 4 ; 7 Uy z w 7 b) PW) (Ua): POWs) fs POIs) = 3: P(BIU2) = § ) P(B) = P(BUs) + P(BU2) = P(BIUs) P(Us) + P(BIU2)P(U2) F3xE 6. P(second spade | first black) = Pifitst black and second spade = rat P(fist spade and second spade) + P(first club and second spade] ‘Git spade and second spots) pig clab and second spade) J = (24/52)(12/51) + (13/52)(13/51) "28. (26/32) = 0 Or you may use a symmetry argument a fllows: Plsecond black | fist black) = 25/51, and P(second spade | frst black) = P(second club | first black) by symmetry. Therefore P(second spade | first black) = 25/102. Discussion. The frequency interpretation is that over the long ran, out of every 102 deals yielding 4 black card fest, about 25 will Section 1.4 7. a) P(B)=03 b) P(AU B) = P(A) + PCB) ~ PCA) P(B) aioe) = avant 8. Assume n cards and all 2n faces are equally Ukely to show on top. P(white on bottom | black on top) = Plwhite on bottom and black on top) __somxd 7 PE Se = otis pao = 5/9 9, By scheme A, P(student selected is from School 1) = 100/100 = 1/10, while by scheme B, that cchance is just 1/3, which is the chazce that school 1 is selected. So these two schemes are not probabilistically equivalent. Consider a particular student z, and suppose she is from school i. The chance that x will be selected by scheme A is 1/1000. 
The chance that z will be selected by scheme C is P(z is selected) = P(School i is selected, and then 2) Ge ot cas For scheme C to be equivalent to scheme A, this chance should be 1/1000. Therefore, Pi = (size of class 1)/1000. So pr = 1, pp = 4, pa = 5. 10, a) Let A; be the event that the ith source works. (aero work) = P(AEAS) = 0.6 x 05 = 0.3 Plexactly one works) = P(A:AS) + P(AGAz P(both work) = P(A: Az) = 0.4 x 0.5 = 0.2 b) P(enough power) = 0.6 x 0.5 +1x0.2=0.5 04 x 0.5 40.6 x0 os 11, Let By = (firstborn is a boy), Gr = (firstborn is a girl), similarly for By and Go. a) P(B, Bz) = P(B: Bs | identical) P(identical) + P(B, Ba | fraternal) P( fraternal) b) P(BG2) = 0-9-4 4 -(1—p) = GE. ©) Note that the chance that the firstborn is a boy is 1/2 whether identical or fraternal, so P(Bi) = 1B: Simiaaly PUB) = PUG) = Gs) = 172 So ay = 24eaGe) — Wes) 0 . Prenton Pies = ASD 12. PFS Section 1.5 Section 1.5 1. a) P(black) = P(black | odd) P(odd) + P(black | even) Pleven) = } ») Pleven | white) = PUitefeven) Pleven). _ avsin THe 2 1 y 3 W, 4 4 6 10 a 7 4 ST 3 By > B, 3 8) (Wa) = PW. W2) + P(B1Ws) = b) P(Bi|W2) = Peete) = Set «) Repeat a), with symbols: PW2) = oi: ofa + che ata = ots 3. Pick at chip at random. Let B = (chip is bad), let P(AIB) =: 1. and P(AIB*) = 1, which imply PCB’ (chip passes the cheap test). Then P(B) 8, and P(A‘|B) =.9. A Pass) 02 1 Bias) 2 A‘Gail) a8 8 BGood) A (Pass) 20 Section 1.5 ©) P(ertor in transmission) = P(RaT; U Rs Ts) = P(RoT:) + P(RsTs) (RolT) PCT) + PCRs |To) PCT) 102(.5) + (.01)() = 3/200 4) (.02)(.8) + (.01)(.2) = 9/500 5. Let D denote the event (person has the disease), + denote the event (person is diagnosed as having the disease), and — denote the event (person is diagnosed as healthy). Then P(D} P(-|D) = .2, which in turn imply that P(D*) =.99, P(~[D*) = 95, P(+1D) + 008 8 D 2 ay 2 002 + 0995 a) 0s oO # - 9405 a) P(+) = P(+|D)P(D) + P(+|D)P(D*) b) P(-9D) = P(-1D)P(D) 6) PAD?) = PD} PD 4) P(DI+) = epee Or we may arg sing, odds r Hence P(D|+) €) Yes, See the explanation after Example 3. 6. a) The experimenter is assu ‘That is, prior probabilities are given by PUI) = P(Ef2) = b) No, because the above assui ing that before the experiment, Hy, Hz, and Hy are equally likely. (Hs) = 1/3. 5, P(Ha) = 45, P(Hs) = 06. 1, P(AlHa) = 01, P(A|H3) = 38 [Now all these probabilities have a long run frequency interpretation, so the posterior probabil- ities will as well] Posterior probabilities a hypothesis; Ay is, and PHS1A) no longer the most likely 7. a) Asin Example 1 ‘P(Box i and white) (Box i and black) P(Box i | white) P(Box i | black) 10 Section 1.5 ‘Since P(Box i | white) is largest when i = 3 and P(Box i | black) is largest when { = 1, the strategy is to guess Box 3 when a white ball is drawn and guess Box 1 when a black ball is drawn. The probability of guessing cortectly is P(Box 3 and white) + P(Box 1 and black) = 5/12. b) Suppose your strategy is to guess box i with probability pi (i= 1, 2,3; pit pa + ps = 1) ‘whenever a white ball is drawn, and suppose [am picking each box with probability 1/3. Then in cases where a white ball is drawn, the probability that you guess correctly is 228 43 48 25 a3 P2329" Clearly the probability of your guessing correctly is greatest when ps = 1; that is, when you ‘guess Box 3 every time that you see a white ball. A similar argument for the case where a black ball is drawn shows that the probability of your guessing correctly is greatest when you guess Box 1 every time that you see a black ball. 
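All of the Bayes' rule calculations in this section, from the diagnostic test of Exercise 5 to the box posteriors above, follow one pattern: multiply each prior by the likelihood of the observed data and renormalize. Here is a minimal Python sketch of that pattern (added for illustration, not part of the original manual), evaluated on the Exercise 5 figures P(D) = .01, P(+|D) = .8, P(+|D^c) = .05.

    def posterior(priors, likelihoods):
        """Bayes' rule: posterior_i is proportional to prior_i * likelihood_i."""
        joint = [p * l for p, l in zip(priors, likelihoods)]   # P(H_i and data)
        total = sum(joint)                                      # P(data)
        return [j / total for j in joint]

    # Exercise 5: hypotheses (disease, no disease); observed data = positive test result.
    post = posterior(priors=[0.01, 0.99], likelihoods=[0.80, 0.05])
    print(round(post[0], 3))   # P(D | +) = .008/.0575, about 0.139

Feeding in priors (1/3, 1/3, 1/3) or (1/2, 1/4, 1/4) together with the box likelihoods produces the posterior tables used above and in part c) below.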
©) Here P(Box 1) = 1/2, P(Box 2) = 1/4, P(Box 3) = 1/4, s0 that eee) pig teas ts ‘P(Box i and white) (Box i and black) ins (Box i| white) 929 ‘P(Box i | black) aps If you continue to use the strategy found in a), i.e. guess Box 3 if a white ball is seen, and guess Bor 1 if'a black ball is seen, then the probability of guessing correctly is 3,17 P(Box 3 and white) + P(Box 1 and black) = + i=. 4) If you are convinced that I am using either the (1/3, 1/3, 1/3) strategy or the (1/2, 1/4, 1/4) strategy, then you can decide which one it is by observing the long-run proportion of trials that your guesses are correct. ICT am using the (1/3, 1/3, 1/3) strategy then, by the frequency interpretation of probability, the proportion of trials that you guess correctly should bbe approximately 5/12, while if T am using the (1/2, 1/4, 1/4) strategy then this proportion should be closer to 7/16. If I am in fact asing the (1/2, 1/4, 1/4) picking strategy, you can improve upon the guessing strategy found in a) as follows: Since P(Box i | white) is largest when i = 1 and P(Box é | black) is largest when i = 1, the strategy is to always guess Box 1 ‘The probability of guessing correctly is then B(Box 1) = If you're not sure at all what strategy Iam using, you can determine it approximately, as follows: Suppose I am using a (pi, pz, 2) strategy to pick the boxes, and suppose you decide that you will always pick (ay) Box 1 when you see a white ball, Box 2 when you see a black ball. Itis possible for you to determine my strategy, provided you can keep track of the proportion of trials where a white ball was chosen from Box 1 and the proportion of trials where a black ball was chosen from Box 2. (This you can do, because you can see the color of the ball drawn, land you ate told whether your guess is correct.) By the frequency interpretation of probability, these two proportions should in the long run approximate the probabilities Poe tad wie) = py and P(B042 ad ck) = be teapectively. Thus you should be able to determine (approximately) the values pi and p2 (and hence ps). a) The p probabilities (of the boxes that I pick) are as follows: 1 Section 1.5 ») 2 4 ifr 2 os we | 6/23 9/28 8/23 Use Bayes’ rule P(Box i | black) = PUBor and black to determine the posterior probabilities: P(Box i | black) | 3/8 3/8 1/4 So you should guest Box 1 or 2. For either choice, P(you guess correctly | black) = 3/8. Use Bayes’ le P(Box i | white) = PlBox rand white) mrt to determine the posterior probabilities: i 123 P(Box il white) | 1/58 2/5 2/5 [You can use P(white) = 1 — P(black) = 15/23.] So you should guess Box 2, For either choice, P(you guess correctly | white (guess correctly) P(guess correctly | black) P(black) + P(guess coreecly | white) (white) +ixB=s. ‘This is your chance of winning, no matter which of the two best choices you make in each case. For instance, you could simply always guess Box 2, in which cage the event ofa win for you is the same as the event that I pice Box 2. ‘To see why the probability of your guessing correctly is at most 9/23 whatever your strategy may be, suppose that your strategy whenever a blac ball is drawn is to guess Box i with probability Then (guess cortectly | black) = pi P(Box 1 | black) + p:P(Box 2 | black) + ps P(Box 3 | black) btmbtnt=} similarly, whatever your strategy whenever a white ball is drawn, P(guess correctly | white) < 3. 
Therefore P(guess correctly) = P(guess correctly | black) P( black) + P(guess correctly | white) P(white) SPxR+Ex HS P(guess correctly | Box 1) = P(guess 1 | black & Bor 1)P(black & Box 1 | Box 1) + P(guess 1 | white & Box 1)P(white & Box 1 | Box 1) Bxtsoxias. Similarly P(guess correctly | Box 2) P(guess correctly | Box 3) = 0x t 5/8. 5 and 12 Section 1.6 Section 1.6 1. This is just like the birthday problem: P(2 ot more under same sign) = 1 ~ P(all different signs). uli n7 i 10 1 ai <2 nw, 8 ms 12" 12“ 12 ~ 298 a n 28 Me st n* i" i" we? 2 So, the answer is 5. 2. Assume the successive hits are independent, each occurring with chance 0.3, a) 1- (0.7 = 0.51 b) 1-(0.7)? = 0.687 9 1-(0.7" 2) Plat est two heads | atest one he) = Pest two Hand a ast one 8 Plat teat wo 1) _ sapeaiies? — ay = Pores = EEL GRBPEL = Ju = 7695. b) P(exactly one HJ at least one H) + P(at least two H | at least one H) because (at least one H) = (exactly one H) U (at least two H), and the two events are disjoint. ‘So P(exactly one H | at least one H) = 1 — 7695 = .2305. ae 9/2000 BY (ay x Be % 38) + (as * Ba * ae) + (FB * ae * ae) = 53/8000 €) PGackpot) = 3 x 4 x 3 = 9/8000, same as before; P(vwo bells) = (3 x dy x B) + (3 x xB) +E ‘The chance of the jackpot is the same on both machines, but the 1-9-1 machine encourages you to play, because you have a better chance of two bells. It will seem that you are “close” to a jackpot more frequently 5. a) Plat least one studeat has the same birthday as mine) “'P(all n—1 other students have differen birthdays from mine) = (964/365)"—" [My birthday is one particular day: in order for the event to occur, each ofthe other students must have been born on one of the remaining 364 days .) 1) Use the argument in Example 3. The desired event has probability atleast 1/2 i€ and ony its complement has probability at most 1/2. This occurs i and only if (964/265)"-" < 1/2 «=o Hoe ew n= 1 > ORB 2 253.6 So n = 254 wil do ) The difference between this and the standard bi (say January Ist) must occur twice in the class. Th usual birthday problem. hday problem is that a particular birthday is a much stricter requirement than in the 6 a) pe=pe (Never need to roll more than seven times) mao mals ps = (5/8) (2/6) (2nd different from first, thied the same as frst or second) 13 Section 1.6 Pe = (5/6) -(4/6) - (3/6) s = (5/6) -(4/6) (3/6) -(4/8) Pa = (5/6) « (4/6) - (3/6) - (2/6) (5/6) Pr = (5/6) - (4/6) - (3/6) -(2/6) (1/6) -1 ¥) pte tp = : you must stop before the tenth roll, and the events determining ps, pa, eten are mutually exclusive. €) Of course you can compute them and add them up. Here's another way. In general, let Ai be the event that the first i rolls are different, then pi = P(Ai-1) — P(Ai) for ¢ +7, with P(A1) =1, and P(Ar) = 0. Adding them up, you can easily check that the sum is 1. 2) msl +p pm) = poPim + nm + POPP ) e+ P(flows along top) —p4-P(lows along top) where P(flows along top) was calculated in a) 2) P(Brz and Bas) = P(all three have the same birthdates) = P(Bi2) = Ses ses = sas, = P(B:s)- So P(By2 and Bas) = aye = P(Bi2)P(B2s)- ‘Therefore, Bia and Bay are independent! ¥) No. I you tall me Bz and Bay have occused, then all theee have the same birthday, so Bis ‘leo has occurred. That is, P(B12B2sBs1) = sasr # sds = P(Bi2) P(Bas) PCBs). ©) Yes, each pair is independent by the same reason as a). 14 Chapter 1: Review Chapter 1: Review 1. 
P(both defective | item picked at random defective) od See = amr pele Bsa toc 2 2 : ° y 9 3 @o e 3 7—e 3 From the above tree Piblack) = P(white) = 1/2, Piblac, black) = P(white, white) = 1/2, Piblack, white) = P(hite, black) = 1/6. So P(white | at least one of two balls drawn was white) = pplSti/tys = 3/4. 3, False, Of course P(HHH or TTT) = 1/4. The problem with the reasoning is that while two of the coins at least must land the same way, which two is not known in advance. Thus given say two or more H's, ie. HHH, HHT, HTH, o: THH, these 4 outcomes axe equally likely, s0 P(HHH ot TTT | at least two Hs) = 1/4, not 1/2. Similarly given at least two T's a) P(black | Box 1) = 3/5 = P(black | Box 2) and P(ced | Box 1) = 2/5 = P(red | Box 2). Hence P(black) = P(black | Box i) and P(red) = P(eed | Box i) for i = 1,2, and so the color of the ball is independent of which box i chosen. Or you can check that P(black, Box 1) = P(black)P(Box 1), ete, from the following table. Box 1 Box 2 black [3/10 3/10 | 3/5 red | os u/s | 4/5 1p ae ») Box 1 Rox 2 black | 3710 s/s | 26745 red_| ys 2/9 _| roses 11/2 Observe that P(black, Box 1) # P(black)P(Box 1), so color of ball is not independent of which box is chosen. 15 Chapter 1: Review ‘5. For either of the permissible orders in which to attempt the tasks, we have {pass test) = SiS: USFS255 where S; denotes the event (task é performed successfully}; you don't have to worry aber ve thied task if you already have performed the first two tasks successfully. For the first order, hard, easy) the probability of passing the test is therefore P(pass test) P(SiS2) + P(S{S2Ss) = zh + (1 ~ 2)h2 = 2h(2~ =), By symmetry, the probability for the second order (hard, easy, hard) is hz(2—h). © the second order maximizes the probability of passing the test. 6. a) Since P(B) h>2-s, (AB) + PAB), P(ASB) = P(B) ~ P(AB) = P(B) ~ P(A)P(B) = P(B)(1 — P(A)) = P(B)PLA‘) 1b) Just reverse the roles of A and B in part a), ©) P(ASBY) = P((AUB)*) =1- P(AUB) =1- P(A) PCB) + PAB) 1 PCA) P(B) + PCAYP(B) (by independence of A and B) (1= PAN) = PCB) = PCA) PBY) 7. a) The probability that there will be no one in favor of 134 is the probability that the Ist person doesn’t favor 134 and the 2ad person doesn’t favor 134 and the 3rd person doesn't favor 134 and the 4th person doesn’t favor 134. This is just 23 x 12 x 38 x Hf = .021 ') The probability that atleast one person favors 134 is 1 the probability that no one favors 134; but the probability that no one favors 134 was done in part a). ‘Thus the answer is 1.021 =.979. €) The probability that exactly one person favors 124 is (!) times the probability that the Ist person {gt nde 2a pce av sad he ep dn andthe th pean doi 4) ‘The probability that a majority favor 134 isthe probability that three people favor 134 plus the probability that four people favor 134, This probability is ($) x 38 x 22 x 26 x 384 32 x 22 x Bx Boar 8. a) 0.08228 b) 0.785) 0.1866 9. a) By independence, P(A and B and C) = P(A)- P(B)- P(C) b) P(A or Bor C) = 1 P(ATB*C®) =1= P(A): PCBS): PC") = (Or use inclusion-exclusion, & = 01666... ©) Plexactly one of the three events occurs) = (ABSC*) + P(ATBCS) + PLA'B‘C) PEPE EE DEE E bee Bo an, 10. a) P(same type) = P(both A oF both B or both AB or both 0) P(both A) + P(both B) + P(both AB) + P(both 0) = (42)? + (10)? + (04)? + (44)? 
= 3816; Pdiferent types) = | ~ P(same type) = 1 — 816 = 5184 b) Since P(1) + P(2) + P(3) + P(4) = 1, we need only evaluate three of these; the fourth can be ‘obtained by subtraction P(A) = Pall A or all B oF all AB or all 0) P(all A) + P(all B) + P(all AB) + P(all 0) (42)6 + (-10)* + (04)* + (.44)* = 0687 16 Chapter 1: Review (#f arrangements) (42)(.10)(.04)(.44) = (41)(.0007392) = .01774 (.42)?(.10)? + (42)?(.04)? + (.42)?(.44)? + (-10)?(,04)? + (.10)?(.44)? + (.04)?(.44)7] 6 4+ [(42)° = 42) + (10° 10) + (.04)°(1 ~ 04) + (44) — 48)) 304 =.so7a So P(3) = 1 — .5973 — .0687 — 01774 163. n. ir = Pifair and BT) Pier) = Pet _ POET | fait) P(faie) = PORT] fais)P(fair) + P(HT | biased) P(biased) (4/4) x (f/m) of © CAV) + OD) x Ca) OF +85 12, a) If'm D7, the chance is 0: the die has only six different faces! If n = 1, the chance is 1. nntidance=$ é ramen 24 teams B 4. Srehance = 3.4.3.2 ssannce = 8-4 1b) 1 above : the events in a) and b) are complementary. 13. P(AIB) = P(ABIB) = S> PABA) P(AB)/P(B) > P(A|Bi)P(Bi)/ P(B) = So PLAIB) PBIB) since BB = By 14.) Since the prior probabilities of the various boxes are the same, choose the one with the greatest likelihood: Box 100. b) Here are the prior probabilities and likelihoods given a gold coin is drawn: Box 1 Box? Box3 ... Box 98 Box 99 Bor 100 rior: 2/150 1/150 2/150. 1/50 2/180 1/150 likelihoods: 1/100 2/100 3/100 98/100 99/100 100/100 Clearly, the product (prior odds x likelihoods) is greatest for Box 99. So Box 99 is the choice with the greatest posterior probability given a gold coin is drawn 16. P(Box 1| gold) = pyabr7s =F 17 Chapter 1: Review 16. a) Student 1 can choose any of the remaining n ~ 1, student 2 can choose any of the eligible n — 2, student 3 can choose any of the eligible n ~2 except student 1, student 4 can choose any of the le n~2 other than students 1 and 2, and so on. So aD (9-2), (n= 3) (nd) (9 P= Gant) * (n=) * Gad) * (RD) @ay ») Use the exponential approximation: log re) = since log(1 +2) = = for small = 1.362416 17. False. They presumably send the same letter to 10,000 people, including the name of the winner, the ame of the person to whom the letter is sent, and the name of one other person. In that case, I learn nothing from the information given, so my chances of winning the Datsun are still 1 in 10,000. 18 (©1993 Springer-Verlag New York, Inc. + Section 2.1 eas b) CNA S/5" 2. P(2 boys and 2 girls) = ($)(1/2)* = 6/2¢ = 0.375 < 0.5. So families with different numbers of boys and girls are more Likely than those having an equal number of boys and giels, and the relative frequencies are (respectively): 0.625, 0.975. 3. a) P(2 sixes in § rolls) = (°)(1/6)?(5/6)° = 0.160751 1b) Plat least 2 sixes in $ rolls) = 1 ~ P(at most 1 six) ~ [P(0 sixes) + PCI six) — (8/6) (1/6)(5/6)* = 0.196245 (This is shorter than adding the chances of 2,3, 4, 5 sixes.) 2) Pl mean = Pla ae PU a) PO Ho = (8/6)° + ($)(1/6)(5/6)* + (3)(1/6)7(5/8)° = 0.964506, 4) The probability that a single die shows 4 or greater is 3/6 = 1/2. Plexactly 3 show 4 of greater) = (2)(1/2)* = 0.3125 ©) Plat least 3 show 4 or greater) = {(3) + (2) + @)] (2° = 05 (The binomial (5, 1/2) distribution is symmetric about 2.5) PCO sixes in fist five rolls sixes in all eight rolls) “_ P(2 sixes in first 5, and 3 sixes in all eight) = Paces in all eight) __P(2 sixes in frat five, and 1 six - PCG cixes in all eight) = GGIK (8/5? 
-G)AL8)8/8)* CATER E/6 (:)(0.7)*(0.3)* = 0.1361367 1) Plenactly 4 hiteat least 2 hits) = MSZ 4 hit tat least 2 hits = prexactly 4 hits) z = SReRReTy 0 bateyrcexaclly Thay = 01963126 ) Plexactly 4 hits|firat 2 shots bit) exactly 4 hits & frst 2 shots hit = Meme Dskoe By = Pifrst 2 Shots hit & exactly 2 hits in last 6 shots) = Rint Date iets Be = P(exactly 2 hits in last 6 shots) (since shots are independent) = (£)(0-77"(0.3)* = 0.059835 7. The chance that you win in this game is 15/36 = 5/12 (list all 36 outcomes!), so the chance you win atleast four times in five plays is next three sasti4 © “eB 6. a) Plexactly 4 hits) = (<)enarer + (Sonar aa ; : Inp+p <1 then the most likely number of successes is int (np +p! Section 2.1 Imp -+p = 1 then the most likly number of successes is 0 or 1 (both equally likely). # Ifmp-+p > 1 then the most likely number of successes is at least 1. Hence the most likely numberof successes is aero if and only ifmp+-p < 1, and the largest for which zero is the most likely number of successes is 9. a) Most likely number = 104040. 22 ao +11 y= 198724 b) P(10) = P(e)- AB. 198724 BEG FF SBESHOL = 107847 <¢) P(10 wins in 326 bets) n = P(9 wins in 325 bets) 3 + P(UO wins in 925 bets) x ST a = 132058 x Jp + nated x FZ = 1132919 10. a) H/(n 1) b) (m— E+ I M(n+1). 11, a) The tallest bar in the binomial (15, 0.7) histogram is at int ((n + 1)p) = int(16x0.7) 0 b) P(11 adults in sample of 15) = '8)(0.7)(0.3)* ts ye 154 x13 x12 11(0.3)¢ (SJonon' ALE ons 12, a) P(makes exactly 8 bets before stopping) ‘Plwins at 8th game, and has won 4 out of the previous seven) = (asyae cops (18/38) = 0.1216891 b) P(plays at least 9 times) “P(wins at most four bets out of the first eight) (20/38)*-+(*)(18/38)(20/38)" + (2) (18/38)? (20/38)* + (8)(18/38)°(20/38)* + ()(18/38)*(20/38)* 0.696167 218623 18, a) No! Since the child must get one allele from his/her mother, which will be the dominant B, he/she must have brown eyes. b) Taking one allele from each parent, we see that there are four (equally likely) allele pairs: Bb, Bb, bb, bb. Thus the probability of the child having brown eyes is 50% = 0.5. €) Once again there are four possibilities: Bb, Bb, Bb, bb. Thus there ia a 75% = 0.75 chance that the child will have brown eyes. 4) Given that the mother is brown eyed and her parents were both Bb, the probability she is Bb is 2/3 and BB 1/3. Thos P(child brown-eyed) = P(child brown-eyed|mother Bb) P(mother Bb) +P (child brown-eyed|mother BB) - P(mother BB) 12 12 aXgtixgag So P(child Bb|woman Bb) P(woman Bb) _ (1/2)(2/3) _ P (Child Bb) as 2 20 Section 2.1 14, a) (Ts, Pw); tall and purple. Genetic Combinations | Probability (FT, PP) tall and purple (TT, Pw) tall and purple (17, we) tall and white 5 (Ts, PP) tall and purple (Ts, Pw) tall and purple (is, wee) fall and white (6, PP) short and parple (@, Pw) short and parple (6s, ww) short and white tall and purple | tall and white | short and purple and white Probability 96 36 ane 1/16 ©) 1~ (7/16) ~ r0(9/r8y(7/6)* 15.8) HO-< p<, then int (np+p) = np, since ap is an integer. b) Note that np = int (np) + [np — int (np). If {np — int (np)] + p > 1, then int (np-+ p) = int (np) +1, which is the integer above np. 1€0 < {np ~ int (np)] +p <1, then int (np +p) = int (np), which is the integer below np. ©) Consider the case n = 2,p = 1/3, where 1 is the closest integer to mp and the integer above np, ‘but Os also a mode. Also O is the integer below np, but 1 is also a mode. 21 Section 2.2 Section 2.2 1. 
‘The number of heads in 400 tosses has binomial (400, 0.5) distribution. Use the normal approximation with je = 400 x (1/2) = 200 and o a) P(190 < H < 210) = 8(1.05) — 6(-1.05) b) P(210 < H < 220) = (2.05) — (0.95) €) P(H = 200) x (0.08) ~ (0.08) = 0.0398 4) PUL = 210) = (1.05) ~ (0.95) 04 and « = 9.998. 2. Now a) P(190 < H < 210) = 8(0.65) — @(-1.48) = 0.6686 b) P(210 < H < 220) = (1.65) — (0.55) = 0.2417 ) PUL = 200) = (—0.35) — (0.45) = 0.0368 4) P(H = 210) = (0.65) ~ £(0.55) = 0.0333 3. a) Law of large numbers: the frst one. ) Binomial (100, .5) mean 50, SD 5: 43-90 1-9 (#5=5) = 0(.9) = 1— 8159 = 1841 Binomial (40,8) mean 20, 8D 10: 210 — 200 - - = a1 = 0286 9 (224228) 1 oa) = 4. Let X be the number of patients helped by the treatment. Then E(X) = 100, SD(X) = 8.16 and P(X > 120) = P(X > 120.5) = 1 ~ 6(2.51) = 008. . Want the chance that you win at least 13 times. The number of times that you win has binomial (25, 18/38) distribution. Use the normal approximation with w = 11.84,0 = 2.50: P(13 oF more wins) = 1 — P(12 ot fewer wins) ~ 1 (25-5284) — 1 — $(0,26) = 0.2974 G. The number of opposing voters in the sample has the binomial (200, 45) distribution. This gives 4 = 90 and ¢ = 200 x45 x58 = 7.035. Use the normal approximation: ) The required chance is approximately ° Crm) -* is b) Now the required chance is approximately 1-9 (Pras) = = (1.49) = 1 ~ 9319 = 0.0681 (about 7%) 7. a) The city A sample has 400 people, the city B sample has 600 people, so city B accuracy is /ofa = 1.22 times greater. bb) Hoth have the same accuracy, since the absolute sizes of the two samples are equal. ©) The city A sample has 4000 people, the city B sample has 4500 people, so the city B sample is /4500/4000 = 1.06 times more accurate, even though the percent of population sampled in city B is smaller than the percent sampled in city A. 8. Use the normal approximation: = YApy = 9.1287. Want relative area between 99.5 and 100.5 under the normal curve with mean 600 x (1/6) = 100, and ¢ = 9.1287, That is, want area between =.055 and .0S5 under the standard normal curve, Required area = 4(.085) — [1 ~ 6(.055)] = 0438 22 9, 10. We have 30 independent repetitions of a binomial (200, nu. 12. 1. Section 2.2 3) Think of 324 independent trials, each a “success” (the person shows up) with probability 0.9. So the number of arrivals has binomi Use the normal approximation with 1 compute the required probability: 16 (25-2) = 1 — (1.65) = 0.0495. ') Increase: in the long run, each group must show up with probability .9. So effectively, traveling in groups reduces the number of trials, keeping p the same. So the histogram for the proportion ‘of successes has more mass in the tails, since n is smaller. €) Repeat a), with the 300 seats replaced by 150 pairs, and the 324 people replaced by 162 pairs. So the number of pairs that arrive has binomial (162, 0.9) distribution. Use the nor- mal approximation with = 162 x 0.9 = 145.8 and o = VISIXODXOL = 3.82, We want 16 (2$5188) — 1 — (1.29) = 0.1099. 24 x 0.9 91.6 and o = VIE XOTXOI = 5.4 to 4). For each of these repetitions, the proba- Dility of getting exactly 100 heads can be gotten by normal approximation where 4 = (200 } and = 200 x} xP =7.07. ‘The chance of exactly 100 heads can be approximated by ‘P(100 heads ) = ® (yg *) 9a 208) a nea — aris = 0364 Since the students are independent, the probability that all 30 students do not get exactly 100 heads is (1 P(100 heads ))# and the normal approximation gives us (1 = P(100 heads ))*? 
= (1 — .0564)"° = 0.175 ‘The number of hits in the next 100 times at bat has binomial (100, 0.3) distribution. Use the normal approximation with « = 30 and o = VI00 x03 x0. = 4.58. . — (0.11) = 0.4562 b) P(E 33 hits) = 1 — 6 (228522) = 1 - 0(0.545) = 0.2929, 2) PCS 21 bite) = & (213582) = 6(—0.545) = 0.2929 4) No, independence would be lost, because ifthe player has been doing well the lst few times at bat, he's more likely to do well the next time. Similarly, if he’s been doing badly, then he’s more likely to continue doing badly. This will increase all the chances above. ©) From part b), we see that the player has about a 29% chance of such a performance, if his form stayed the same (long run average .300) and if the hits were independent. So itis reasonable to conclude that this performance is “due to chance,” since 29% is quite a large probability a) PCS 31 hits) a1 — @ (288522 For n = 10,000 independent triale with auccess probability p = 1/2, = np = 5000 and ¢ = py = 50 Hence, by the normal approximation, the probability of between 5000 — m and S000 +m successes is approximately mai =m=1/2) 9g (m+in #(24) -(434) 7° (4) ~ ‘This equals 2/3 if and only if ©((m + 1/2)/50) = 5/6 which implies that (m+ 1/2)/50 = 0.97 and m 48. In other words, there is about a 2/3 chance that the number of heads in 10,000 toczes ofa fair coin is within about one standard deviation of the mean. 15%. This means (2) First find z such that 4(—2,2) 0.9750 and by the normal table, + = 1.96. 95% = P (pin pt 1.96,/B) < m (pin parse) We want 1.96(.5/Y%) < 1%, $0 1.96(.5) a2 ( at ) 23 1604 (224, 0.8) distribution. We want P(more than 200 successes in 324 triak Section 2.2 14. 15. 16. a a) # working devices in box has binomial (400,.95) distribution. This is approximately normal with = 380, 0 4.36. Required percent = 1 ~ (2255382) = 1 9(2.18) = 0.0146 (This normal approximation is pretty rough due to the skewness of the distribution. The exact probability js 0.0094 correct to 4 decimal places. The skew-normal approximation is 0.0099 which is much better.) ) Using the normal approxi jon, want largest £0 that 1 — (4=23332) > 0,95 20 * =185, 20 b= 973 3) 6) = glade? = -286) by 9"(2) = -6(2) — H(-24(2)) = (2? — DAC) ) Sketch: Outside (4,4), they are close to ero. 4) Let f(2) = b6(EG#). Want J") = 19% 1 ©) For n—a q +o or z Iv after thi P(&) wil decrease, ‘Thus the maximum occurs at int). There fs & double maximum if and ony if R(E) = 1 for some. This can ony occur if pi an integer. ‘The two Nalier of F that maximise ae chen w and je. ‘There can never be 8 txiple masimem since thi would imply that R(k) = 1 and R(k—1) = 1 for some k. 9. Assume that each box of cereal has a prize with chance 95, independently of all others. The number of prizes collected by the family in 52 weeks has the binomial distribution with parameters 52 and (95. This gives 4 = 52 x .95 = 49.4, and @ = VSEX.SXUG = 1.57. Since o is very small (les than 2), the normal approximation is not good. Use the Poisson instead. ‘The number of “dud” boxes has the Poisson distribution with paramter 52 x.05 = 2.6. We want the chance of 46 or more prizes, that is, 6 or less duds. This is (1 42.6 + 2.67/2 + 2.67/31 + 2.6/4! 42.64/51 + 2.68 61} = 982830. [For comparison, the exact binomial probability is 985515. The normal approximation gives .993459,] 10, Distribution of the number of successes is binomial (n,1/N) = Poisson (n/N) = Poisson (5/3). P(atleast two) = 1 — P(0) - P(I) x1-e*9(1 45/3) $19 § 2. 0.49633 0.5 28 Section 2.5 Section 2.5 1. 
9) QED ay Ceycarsytarsy* 2a) BoB b)3xa) ci 3. 4) HYD») QED GR, nes (OE) Cope) eR ao 4, The exact chance is For the approximation, use the normal curve with # = 40, 0 = 4.9. Chance is approximately 1 6(3582) 0.1788 2 Solve eet < enn > 537. 5. Solve TES < —2.326, then n > 597 a9 ® 0 B-B ows. 7. Denote by B; the event (jth ball is black), similarly for Ri. 3) P(BB: Bs By) = P(Bs)P(Ba|Bs)P(Bs1Bs Ba) P( BBs Ba Bs 'b) This is four times P(B; Bz Bsa), s0 by d) equals .3716. ©) P(B:BsBaRs) = P(Bs)P(Bs| Bs) P(Ba|Bi Ba)P(Ral Bs Ba Ba) = SALE = 0929. 8. Let the outcome space be the set of all element subsets from the set {1,100}, $0 that, eg. the ‘S-set (23, 21, 1) means that the winning tickets were tickets #1, #21, and #23. Each of the ('3°) such Ssets is equally likely. '4) The event (one person gets all three wining tickets) is the disjoint union of the 10 equally likely events (person i gets all three tickets). The event (person i gets all three tickets) corresponds to all 3aets consisting entirely of tickets bought by person i. There are ('0) such sets, #0 the desited probability is (3) 10x 4) ora, ey b) The event (there ate three different winner) isthe disjoint union of the ('0) equally likely events (the winning tickete were bought by pezadns i, j, and k) (i,j, all diffrent). ‘The event (the winning tickets were bought by persons i, j, and k) consists of ('2) x (9) x ('*) 3-sets, so the desired probability is ) By subtraction, 250464. Just to be sure, use the above technique to obtain the desired proba- viity: fan to xox GN) 2) 9. a) Let Ai be the event that the 1* sample contains exactly & bad items, i = 0, 1,2,3,4,5. And let B, be the event that the 2° sample contains exactly j bad items, j = 0,1,...10. Then ‘P(2nd sample drawn and contains more than one bad item) 5) =P [ui22Bs | Ai] PAL) 29 250464. = Plane Section 2.5 (Ar) {1 — P[Bo Uv Bs | Ai)} (Ax) « {1 — P(BolAr) — P(Bi/Ai)} MG) {,_ OG) OG) “oO {! ) @ } 0.491337 - (1 ~ 0.079678 ~ 0.265592) 0.282400 8) P(lot accepted) = PUA U {Ar (86 21))] = PlAo) + PCAs) +P (Bo U Bs | Aid -@)G) , OC) {9@ gol @) = 0.810563 + 0.431337 - (0.079678 + 0.265592) = 0.459491. @) (3 20, Every sequence of ky good, fy bad and ky indifferent elements has the same probability: (GIy (BINY? (T/NY* ‘This is clear if you think of the sample as n independent trials. The probability of each pattern of ky good, ky bad and ky indifferent elements is a product of hy factors of G/N, ka factors of B/N, and I; factors of I/N. The number of diflerent patterns of ki good, kz bad and ky indifferent elements is nif (lat ‘which proves the formula, 11, The set is (9 : max{0,n—N-+G) <9 a. 12, There ace (°2) = 2,598, 960 possible distinct poker hands, 8) There ate 52 cards in a pack: A,2,3,.10,J,Q,K in each of the four suits: spades, clubs, diamonds and hearts, A straight is five cards in sequence. An A can be at the begining or the end of a sequence, but never ia the middle. ie. ,2,3,4,5 and 10,J,Q,K,A aze legitimate straights, but ,A,2,3,4 (a ‘round-the-corner™ straight) is not. So there are are 10 possible starting points forthe straight flush, (A through 10) and 4 suits fo 40 hands ‘Thus P(straight flush} to be in, giving a total of Gy ) Pour of a kind) = "Bs? = fas = 0.000240 = sills = 0.00144 (all same suit) ~ P(steaight flush) 00197 ‘o0001st. ©) P(full house) = 4) P(aush) ©) Pletraight) = P(S consec. 
ranks) — P(straight flush) = seaGht gn, = oonae2 (He HB, = 0.0211 8) Peon pie) = SIAL as, — gous “e) i) Since the events are mutually exclus 1 sum of (3) through (h) = 0.501, ©) P(three of a kind h) Plone pair) = , the probability of none of the above is 30 1s. Section 2.5 1 = P(fail) =1—[P(> 10 defectives)?” peas (sae) m1 9(-2.98) = 9086 P(pass) = 1 ~ (.9986)? = .0028 P(pass) 31 Chapter 2: Review Chapter 2: Review a. a) (P)(/6)*(5/6)* ») (Aars)*a/sy* ©) aitn/s a = @ 2. 0.007 a 2) POQH) = POSH|s spots) P(S spats) + + P(SHI6 spots) P(6 spots) e i@ans + (aL) + G)la/2)° + G02)" x E a (e+e t Bt) xb ») spots) P(4 5 aryass PU spot POSH spe yee pots) apron 4. P(oxactly 9 tails | at least 9 tails) ‘Plexactly 9 tails and at least 9 tails | at least 9 tails) Plexactly 9 tails | at least 9 tails) arlene = 19/0. + a) (7/10)* ~ (6/10) b) The six must be one of the four numbers drawn. 1¢ remaining three numbers must be selected from (0, ..,5). The dested probability is therefore (8)/( 7. ke 1025 8 Die (Ce) are e/ey-*}” 9.3) 40% b) POH of kids = 2] at least 2 girls) *PeTem) & * _ xe, = PS Tes) So z= 3 is most likely. ©) PUG,3B & pick G) +P(2G,2B & pick G) +P(3G,IB & pick G) Cn nel we atie at ie 4 10.) We can use the binomial approximation to the hypergeometric distribution with m = 10, p = 15. So 1—(1~.15)!® = 8031. And the histogram will be that of the binomial (10, 0.15) distribution, b) No. Presumably some machines are more reliable than others. Then results of successive tests fon a machine picked at random are not independent. So the independence assumption required for the binomial distribution is aot satisfied, 32 = 0.4975 Chapter 2: Review aL pat apa Pbsd) = Fx + dX Gap 0133 rea tn ot ot) = (2c a8)" = aon ‘Assume items are bad independent of each other. Reasonable since both A and B produce a large number of items each day. 12. a) 0.423 (See solution to Exercise 2.5.12). 'b) Want chance of at most 149 “one pair's in the first 399 deals D (Peaster 168.78, ¢ = 9.87. (1.95) = 0258, «) Use normal approximation: Want (2 18, The number of “dud” seeds in each packet has the binomial (50, .01) distribution, which is very well approximated by the Poisson (.5) distribution. The chance that a single packet has to be replaced is therefore Le (1 45 + 57/2) = 14d. Assuming that packets are independent of each other, the number of replaceable packets out of the next 4000 has the binomial distribution with parameters 4000 and .0144. This gives 4 = 57.6 and @ = 1.535, which i much bigger than 3. So the normal approximation will work well. The chance that more than 40 out of the next 4000 packets have to be replaced is very close to 8 14 (255848) 2101-929) = 9¢229) = 9001 14. a) 1/58) 2/9 Ones gh = SARI, ce inca 15.9) (oats (0.1)? (0.2)*(0.3)8(0.4)* 6) POSUH ball i ed and there ace 2 te balls inst 24 rams) -orx (*)earoar ro. ay BP by hy a9 SB - (8) x hy ny HQxG«2 o@x2 as. a) ()(0/6)°(5/6)" by 6xsx 9 GE aex ©) Ploum > 9) =1~ P(sum < 9) = 1 = Plseven I's) ~ P(six 1's and a2) ays)" = 70/8y" 19. 9) (2/3)*s —b) (;)(2/9)*(1/3) + (2/a)* 20. 8) Diao)" b) (0.99}" + 8 0.01(0.99)" + ($)(0.01)7(0.99)*. 21. For m odd, OG" (sath 33 Chapter 2: Review 22. a) Assume each person who buys a ticket shows up independently ofall others with probability 0.97. Ifn tickets are sold, then the number of people who show up has binomial (n,0.97) distribution. Find n so that /.97)(.03)n > 1.645, which implies n < 407. ss Perna =o (nese 23. 
Plat least one gi | atleast one boy) = P(at least one git and at east one boy P(at least one boy) P(not all girls or all b © (Rx TPA) + (4 BA) + (2 & 8) + STB) (4 x1/2) + (2 x 6/8) + (4 x 14/16) CEs DETCESOETCES/DETOESUIOM 24, Denote by #(n), n= 0 to 5, the proportion of families having n children. a) We assume that in every family, every child is if a family is chosen at random then ually likely to be a boy or a girl. In this case, P(family has n childeen and exactly 2 girls) = P(n children) P(exactly 2 girls | m children) =x (2% (family has exactly 2 giels) = Yx() £ = 200125. (interpret (3) as 0 ifm < &). Hence 2 ') Since each child is equally likely to be chosen, we have #such children child comes fom a family having exactly 2 gis) = Beech eS Suppose the population has N families, N assumed large. Denominator: There are Nx(n) Jomilies having exactly n children, so there are nix(n) children belonging to n-child families; hence there are So*_ nVx(n) children in the population. ‘Numerator: In patt (a) we showed that the proportion of families having n children and exactly 2 girls is x(n)(3) 2x. Thus there are nWx(n)(3) X children who come from families having » ‘children and exactly 2 girls, and OS, nWVx(n)(%) sk children who come from families having exactly 2 girls ‘Therefore the chance that a child chosen at random from the children in this population comes fom a family having exactly 2 girls is (=) (5) 3 nen) = .294207, 25. a) PCwinin3 sts) = Pin all eee win neatly 4 ets) Pin ov of ret thee, then win fart) = @)s%e- = pte (win in exactly 5 sete) = P(win 2 out of fst four, then win Rlth) = (S)p¥@? -p = 694? 1) P(player A wine the match) = POorins in exactly ‘3 sets) + P(wins in exactly 4 sets) + P(wins in exactly 5) =Piaeat ope P 34 26. 21. 28. 29. Chapter 2: Review °) Plc ast 3 ws no) = PUA no ion 8) Ea 1 P+ipPat org? = 1+ 3+ bq? 4) Hp =2/s the anor ing om. 6) Hori player A ase the St woe he may be nevus abot ving the mate, a tho od ets peormanc. Tn ge the sromplon of ladpendne of pers peruse aecaie fun rte pet 9 9/(9) 1b) Using the inclusion-exclusion formula, the probability in question is P(ALBUBLCU---U J&A) P(ALB) + (BEC) +--+ P(JKA)—S> P's of two pairs) +> P(e of thee pairs) — (3) -m(!) Decause the probability of each intersection of 3 or more paits is 0. 132 Assume that every quadruple of 4 exam groups appears equally often in the population of students, to that the proportion of students having & specific et of 4 groups is 1/('*). By equally likely outcomes: ‘There are (*)3* quadruplets which correspond to different exam days (count the ways to pick the 4 days, then the waye to pick an exam time from each of those 4 days). Hence the desired proportion is (*)3*/('*) = 3971. By conditioning: Pick a student at random. Let D; denote the day of the ith exam. Then P(D1, Da, Ds, De different) = P(D1, Da different) P(Dy, Da, Ds different|Ds, Dz different) P(D1, Da, Ds, Ds different|Ds, Dz, Ds different) Ui 23 Sc x B= 2071. by inet What kinds of throws are wimpouts? Firstly, they cannot have any numbers in them, Of the the throws with only letters the only ones that contain a W either have four different other letters or 2 ‘or more letters of the same type. Both of these are scoring throws. Thus the only throws that do not score consist entitely of the letters A, B,C, D with no more than two of any letter. Con dice with the W because it has only A, B,C. Suppose it is A. 
Then the outcomes of the other dice that lead to a wimpout are: ABCD, ABCC, ABBC, ABDD, ABBD, ACDD, ACCD, BBC, BBCD, BBDD, BCDD, BCCD, CCDD. Taking into account the different possible orderings and multiply bby 3 we get a total of 450 outcomes. Since there are 6° = 7776 equally likely outcomes possible when you roll § dice, 0 ASO 5 = 0.05787 P(wimpout) = AS = 25 Draw the curve y = log from 1 ton, Between z and z+ 1, draw a box with height log. (The first “box” has 0 height.) Connect the upper left corners of the boxes by straight lines, The area under the curve is the sum of three parts; the area of the boxes, the area of the triangles above the boxes, and the atea of the slivers above the triangles and below the curve. The area under the curve is [ee nloga —n +1105 [(2)"] +1 35 Chapter 2: Review a1. 22. 33. ‘The area of the boxes is Bree =tos(n— 1 yy Hoste +1) log) flosn By moving the aivers so that they all have thei right ends on the point (2,4og2), you can see that none ofthe slivers overlap and all ofthe slivers fit in a box betwoen 1 and 2 with height log2. So the area of the slivers is some number eq Mim) = P(Yints > Mm | your m+ Ist toss is head) x 1/2 + Plies > Moe | your m+ Ist toss is tail) x 1/2 P(Yin > Mn) 1/2 PY > Mim) x 1/2 = Pl¥in & Mm) x 1/2-+ P(¥on < Min) 1/2 by symmetry 12. ae ree: b) The SD is ¢ = 5.06. Use normal approximation. a5.) (zy (1.815) ~ &(~1.347) 965 — (1.9115) = 0.8765 36, ¢) Asn 00, H(A) is asymptotic oe“ HI-"9)"/08, Hence if H(K) <¢, then k satisfies appro mately eHerione ce 2 - np)? /npg < loge (09) > torre? cap ponerog! or > wp + yfanpatog So the a,® such that a s] ) Suppose you have n-+m objects, n of which are red, and m blue. You want to choose bout of the atm objects. Among the k selected objects, some j will be red, where j could be 0,1, 2-1-1 So you could just as well pick j red objects fst, then (K ~ ) blue object. 4) This is similar to 8). ae) 12, a) If we think of Wi as just counting the number of times we get category i in m trials, we don't ‘care what happens when we don’t get this category. We get category i with probability pi, and don’t get it with probability 1 ~ ps, so the distribution of Nis just binomial (n,p:). ) Similarly, Ni-+ N; counts the number of times we get either category i of j, which happens with probability pi + pj, so the distribution of Ni +N; is binomial (n, pi + ps)- ‘) Now we just consider the three categories i,j, and everything else, which gives a joint distribution which is multinomial (n, pe, j,1 — pi Ps). 1. a) tn -2 20 ae 2n- 2-1) on P(X > k) = P(first k balls are different color) » log P(X > &) = Ylog(*=4) = Slog(1 ~ jm) = J - s/n = [—log(4)]20. Roughly ? = nlog4, so = Yalog’. And k= 1177 for n= 10°. So K(k ~ 1) M4. a) (75*) pg! - p= (95!) pte for 9 = 4,5,6,7, bh Osea ©) P(A wins) = 1808 / 2187 AL Section 3.1 4) The outcome of the World series would be the same if the teams played all 7 games. Let X be he number of times that A win ital T games are played. IX >v4, then A has won the World series, since B ean have won at most 3 games. And if A won the World Series, thea A won four ‘games before B did, so X > 4. So P(A wins) = P(X > 4). Of course X has binomial (7, ) distribution! ©) G has range (4,5,6,7) (A wins in g gemes) + P(B wins in g games) +54 tet P(X =n, ¥ =n) n) (by independence) ‘bec) Notice that by symmetry, P(X > ¥) = P(X <¥). Moreover, = PIX >Y)F P(X ) +1/n. 
So P(X >¥) = 4a = P(X <¥) a) P(max(X, = PX = Ba ) Plmin(X,¥) = 4) = Plmas(n $1 Xn41-¥) a ntink) where 4° and ¥* are copies of X and ¥ respectively. £) Note that the distribution of X + ¥ is symmetric about m +1 (recall the sum of two dice). For 2ek mire + pens = P(S = 2) + P'S = 12). <) Suppose the values are equally likely. Then from (b) and calculus, 1 > 24411 > 2; contradiction! 4) Yes; for example, let X take values 1, 2 with probability 1/2 each. and Y take values 1, 3 with probability 1/2 each. ‘Then X +¥ has uniform distribution over 2, 3, 4, 5. 20. No. Counterexample: Let X1, X2, Xs be the i ieators of the events H,, Hz, $ in Example 1.6.8. 21. Yes. Proof by mathematieal induction. 22, a) P(X 2) =, P(X Similarly P(Y = y) = 9(¥) 0, S(2)- b) I P(X =2,¥ =y) = f(z)g(y), then PUX = 2) PY =) = (2) Dj 963) 9) D0 = f2N900) (Dy 9G) (D1) - So we just need to show (52, o(4)) (E40) » x Px >> Ho) = (= 10) (Ea) Note: In this calculation z and y are fixed. The sums are over dummy variables which are not called z and y, to avoid confusion with the fixed values. ¥ =v) =D, fe )aty) = £2) Dy 910) 23. IX <7. then ¥ }. Consider LaQtat Parte te toner tsbet er tartar tres eal Notice that p(q ++) = pq + pr < 1/4, alp +9) = ap +r $ 1/4, and.r(p +9) = ep tra S 1/4 ‘The probability in question is thus Siplva + ve] + lon + ar) + rp + ral) 21-3 g tye 1-3 = 1A 43 Section 3.2 Section 1 2 3. cS 5 3.2 ISX 1425 x.2450x.7= 41.5 Average of list 1 = 1x .242%.8= 1.8; Average of list 2=3x.545%.5= a) 1.8.44 =5.8 (distributive property of addition) b) 18-422 ched) Can't do it: need to kuow the order of the numabers in the two lists. E(# sixes in 3 rolls) = 3x $= 5. E(# odd numbers in 3 rolls) = 3 x } = 3. 1£ 25 of the numbers are 8, then all the others must be 0, since the average must be 2. If 26 or more of the numbers are 8 of more, there is no way the average can be 2, since some of the other numbers ‘would have to be negative. Suppose he bets on 6, and let N’ be his not gain. £ | Pete's) |W | nx PIN =n) 37 are 3 [| 3xq ° 2 1 ae/arsy | 2 (2-35) 76 36/6176) | 1 | _315/6) 1/6) of ee [| -Gey (078705, and the gambler expects to lose about 8 cents per game in the long run. ‘Therefore B(N) = X= ht Jo-t---4 Jr where J; i an indicator random variable indicating whether the ith card is a Spade, Then E(X) = TP(first card is a spade) = E E(X) = D7 ps by linearity of £. No more assumptions required. E([(X + Y)?] = B(X?) + 2B(XY) + B(Y?) = 17. EUS — YY = BUX?) — 2E(XY) + BY?) = p29 +r a) Write ¥ = (a+ Ja)? If [4 =0 and 5 = 0 (this occurs with prob (1 ~ P(A)) (1 P(B)) ), then ¥ = (0-+0)%. IT, =1 and Ip =0 (this occurs with prob P(A) - (1— P(B)) ), then Ifa = 0 and Ip = 1 (this occurs with prob (1— P(A)) » P(B) }, then If Ia =1 and Ip = 1 (this occurs with prob P(A) P(B) ), then ¥ = (141) So = PCA) - C= PY) P(A) (1 — P(B)) + (1 ~ PCA) (PCB) P(A)-(P(B)) b) Use properties of expectation described in this section: Note that ¥ = (Ia-+la)? = Ia+2/an+la (using 2? = 2 for z =O and x= 1,014 = 14 and I} = Iu) s0 EY) P(A) + 2P(AB) + P(B) P(A) +2P(A)P(B) + P(B). Altern ely (and this is more cumbersome), use part (a) and the basic definition of expectation, 44 Section 3.2 LL, Use Markov’s inequality: P(at least one win) < expected # of wins Or you-can use Boole’s inequality. Actually, P(at least one win) = 1 ~ P(no wins) = 1 ~ 22%. 28 ‘The bound is close because the bound pretends the events (ith ticket wins) are mutually exclusive. Well, they almost are, because P(more than 1 win) is tiny, 22. 
E(x) Trt) s WME) Gelnce PX So) =1) = peas Similarly you can show the other inequality. 18. a) By Linearity of expectation, 10 x 3.5 = 36. b) Let X1,X2,Xs be the theee numbers, and let Min denote the minimum of the first three numbers. We want E(X; + Xo +X — Mi asin Example 9, = BUX + Xa Xa) ~ E(Afin) = 3.5 ~ E(Min). Exactly E(Min) =a nt 0s a tas +40 ere a +d +e where gm = P(Min > m). So the required expectation is 3 x 3.5 ~ 2.042 = 6.458. ¢) Let Maz be the maximum of the numbers on the first five rolls. For each m between I and 6, P(Maz (Maz 6, then xb will be the profit, since there are only 8 items to be sold. b) Write p(y) = PCY = 9). Then ¥(b) = EUL(Y,B)] = > f-xy ++ (b=) ply) — > worl) = vt +6 pla) + So xtply) — 2b Ea Ss = (A+) 6 - v)rly) ~ xb. ow to minimize overall integers 6? Method 1: Argue that if we buy B items, then the expected loss will be = (+8) D> b= v)n(y)- 2b. @ ‘hile if we buy b— 1 items, thea the expected loss will be 6-1) =Q4") Do O- = spy) — (01) ‘Therefore 10)= 10-2040) D wh) 3 A+ P(Y x/Q+ 2). Argue that PY 12> #(b 41) > rfba) ly) — Q4mPy r(bs), and if b > be, then +(b) 2 r(be). Hence r attains its minimum over real b at be (and possibly elsewhere). BUX) = 96 1G. a) P(X, = k) = P(X: = k) = P(fst k cards are non-aces, next is ace) b) Xi tb Xa b Xa Xe t Me $4 252 = 5B(Ki) = 48 => E(X: 46 at. 18. 19. 20. 2. Section 3.2 ©) No, For instance, P(X: = 30, Xz = 30) = 0, but P(X P(X = 30) > 0. 3) P(D <9) = P(3 red balls are among the first 9 draws) ») P(D = 8) = PD <9)- PD <8) = Ge) — Ge). ) Label the biue balls and the green balls, ay, b, .., bio. Then D = 34 5!2, I(t is drawn before the third red ball), 50 BUD) = 3+ S'2, P(bi is drawn before the third red ball) = 3+ 10x 2 = 10.5. Solve the system of equations 26a) + 7) ap(2) + bp(}) for p(a) = £53 and p(b) = $24 Let 2 = # blues. Then # reds 2, ftw = 2, #9 = 32. So =i 8) P(X > 4) = PUX = b) B(X) = Bh) + Efe) + Elle) + BU) where eg. Ty =0-@)+0-G)+0-G40-O9) Write p(s) = P(X = 2). Solve the system of equations 2(0) + (0) +902) = 1 11) +2902) = 2(0) +4902) = wo for p(2) = ©2542, p(1) = 2u1 — wa, p(O) = 1— (#2544) — (24 — pa). a) The indicator of A® must be 1 when A occurs and 0 otherwise; clearly this is true for 1 = I4: Azle ly =0@1-14=1. ee 4) 4) (F) tintin GY PG) indicator “green appears” b) The indicator of AB must be 1 when A and B both occur and 0 otherwise. Since Jag = 14 Ua = band Ip = 1) @ [ale = 1 we have Lan = Tale. ©) We wish to show that the indicator of the union can be found by the given formula, which can bbe shown by the following application of the rules in a) and b). Tauaguouns = Hagagoasye — Tasag-a8y = (atlas Ta8) ~V~ Tay) = Las) = Tan), a PIA) = BU avas-vae) = B= (= LaLa) (0 = Lag) = BQ a = (Sada) te (ag Lag) = BU a) — (SO Ba tay) 4 (HI BU bay 1) = SPA) = DO PAAL) HE PAL An) 47 Section 3.2 22. a) Let [yz be the indicator of a run of length 3 starting at the ith trial. Let P(3,4) = Ea) be the probability of this event. Then for n > 3 DAs) = PGA) = PO,)4 9 PO.) +POn~2) (1 —p) + (r — 4)(1 = w)(@°)(1 ~ 9) +1 = pp? = WC ~ 9) + (n= 4)(e)(0 ~ PP (Ron) = EQ Iai b) Similarly, let Ja, be the indicator of a run of length 3 starting at the ith trial. The for m 1. Thus El Ra) = 9+ (m — 1)p(1 ~ p) 48 Section 3.3 Section 3.3 2303 . 7 Ld spss haar «X= 3042, s0(X) = 0.86. z 28 30 31 . 2) sates bales Weyaes ax af Tx aiaas MUX) = 9044, SD(X) = 0.88 2. B(Y*) = 3, Var(¥*) = 15/2 3. a) BQX +3Y) = 26(X)+36(¥) =5. 
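The linearity used in part a) requires no independence between X and Y, a point made earlier in Section 3.2 ("no more assumptions required"). A small simulation sketch illustrating this with a deliberately dependent, purely hypothetical pair (not the X and Y of this exercise):

    import random

    def average_of_2x_plus_3y(trials=100_000, seed=1):
        """Simulated E(2X + 3Y) for a dependent pair (X, Y)."""
        random.seed(seed)
        total = 0.0
        for _ in range(trials):
            x = random.randint(1, 6)   # hypothetical X: one roll of a fair die, E(X) = 3.5
            y = 7 - x                  # hypothetical Y: determined by X, yet E(Y) = 3.5
            total += 2 * x + 3 * y
        return total / trials

    # Linearity gives E(2X + 3Y) = 2(3.5) + 3(3.5) = 17.5 despite the dependence.
    print(average_of_2x_plus_3y())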
b) Var(2X + 3Y) = 4Var(X) +9Var(¥] 9) E(XYZ) = [E(CXMECME()) = EC} 4) Var(X¥Z) = B(XYZ)?} ~ (E(XY2)P = E[X?Y7Z] ~ [E(XYZ)P OPEV" B12") — (BO EY BZIP (XP — (EUXIP = (Var X) + [EUOPP - EGO = 26. 4, Var(XiX2) = B%1X2)") ~ [EX Xa) = E(X4)- (9) ~ [8(X: Xa)? = (oh +02) (8 +08) ~ (Hn? = lol + uo} + ote} 5. By the computational formula for variance, Var(X — a) = E[(X ~ @)"] - (BX - a)? = E[(X ~ 4)"] - (#0). But Ver(X - a) = Var(X) = 07, so E[(X 6] 07 +(u a), ‘Thus the mean square dis mean square distance is mi this case, B[(X a] =? between X and the constant a is o” + something non-negative. The ined if this ‘something non-negative’ is actually 0, ie., a= 4. And in Var(X). 6. Intuitively, 1 and 6 are the two most extreme values, so increasing the probability of these values should inctease the variance. Var(Xp) = > 2*P(Xp = =) = (E(Xp)?? ey 2 al 2 2P as) =rbeetety lor elee slot yeh (sy 6p BP 47 | 44941 Oss 7 T,X =X + Xa4 Xe, where X, has binomial (n,, p, ) distribution (a) Not It is binomial only when the p,’s are all the same, (b) BX) Var(X) = D2, spite 49 Section 3.3 Ba)N=K teks WAM Hbsttbe gs <¢) Note that WV is the indicator of the event Ai U Az U As, since Ai's are disjoint. So Var(W) = (5444 D0 $= $= = 2h 4) Var(W) = Var(Xs) 4 Var(X2) + Var(Xs) a «) Var(w) = [ -p-PeG-b-v]-G4h ep = Be 9. The number MN; of individuals who vote Republican in both elections has binomial (r,1 ~ pt) ists bution, and the number Nz of individuals who vote Democratic in the fst election and Republican in the second has binomial (n ~ 1,92) distribution. ‘The number of Republican votes in the second lection is then Ny + Na. So BUM, + Ma) = BUM) + B(Na) = r(1 =m) + (np Since Ny and Ny are independent, we have Var(Ny + Na) = Var(Ni) + Var(N2) = r(1 — pi)pr + (nm — r)pa(1 ~ pa) 10. a) BUX) =D, eAP(K 2) = hh Le Re ent) = alk mln, [+08] = Doane HP = a) = baat eet (nt »b) By the binomial expansion, EX*14 (8) X47 4.0.41 =(X-41)*—X*, 80 eles (7 seta] <2 [erent] 20% Mette (ne 142 be tet (nga) 6) Put k= 2 in b): B(OX 41) = P= on po me BCX) = BBL By a), a(lsn)/n = £(X). So (x) = S84), 4d) Put k=3 in b): E(GX? +3X +1) =n? 4an43. + P(X) =F [o? 4.3043 — 38K) — 1] = [wn 49 = MED — 1] = Bw NGM 4 1). By a), 42 = E(X?), So 2(2,n) = En(n+ 1)(2n +1). ¢) Var(X) = E(X?) ~ [EOP = f(n- 1)(2n +1) = ((n + 2 = f) For n = 6, E(X) = 7/2 agrees, and Var(X) = 35/12 agrees. g) Put k in b): . EX? 46X? 44K 41 Use part a) to conclude s(8,m) = EX?) = 2 (n? 4 an? + 6m $4 — GE(X?) — 48(X) — 1) 3 fn? 4 an? 460-44 = (wb 1)(2n-4 1) = 260-41) AL. ¥ =(0—4) +X, So E() = 004 06(X) =a b eb BEE 1 2 Section 3.3 12. a) According to the Chebychev inequality, 3 P(X > 20) < PIX 10) > 10) < 5 A. Von(l = 9). Tey 04 2.8. This is imposst b) IFX has binomial (n,p) distribution, then £(X) = np and SD(X) and np(1—p) = 25, which implies that 1 — 10, 13. Let n denote the number of scores (among the million individuals) exceeding 130, and let X denote the score of one of the million individuals picked at random. Then P(X > 130) = = n= 10°P(X > 130), ie ae 2) We have, by Chebycher’s inequality, P(X > 130) = P(X — 100 > 30) < P(X — 100] > 30) < ¥s4%0 Son <10'x $< uni. ) Since the distribution of scores is symmetric about 100, P(X ~ 100 > 30) = P(100 ~ X > 30). Therefore P(X > 130) = P(X = 100 > 20) = P(X — 100] > 30) < and n 10° x 4 < 55556. 
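For comparison (an aside, not part of the original solutions), the three estimates of P(X > 130) used in this problem can be tabulated directly. With mean 100 and SD 10 as assumed above, the deviation of 30 points is k = 3 standard deviations.

    from math import erf, sqrt

    k = 3.0
    chebyshev = 1 / k**2                          # P(|X - 100| >= 30) <= 1/9
    symmetric = 1 / (2 * k**2)                    # halved, using symmetry about 100
    normal = 0.5 * (1 - erf(k / sqrt(2)))         # 1 - Phi(3), if X is normal
    for name, prob in [("Chebyshev", chebyshev), ("symmetric", symmetric), ("normal", normal)]:
        print(name, round(prob, 5), "-> at most about", round(1e6 * prob), "people")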
6) POX > 130) = PAGE > 3) 1-409 0.0013, hence n = 10* x 0.0013 = 1300, Note: We can get a sharper bound in a) using Cantell variable with mean m and variance ¢?, then for a> 0 inequality, which says: if X is a random P(X —m>a)< FF Proof of Cantelli’s inequality: Without loss of generality, we may assume m = 0, ¢ that if X >a then (1 +X)? > (144°), So Note where Ione i the indicator of the event (X > a). Take expectations, and get the result, P(X > 130) = P(X — 100 > 30) < saiay = 14. a) Let X be the income in thousands of dollars of a family chosen at random. Then E(X) = 10 and, by Markov's inequality, Application to part P(x 2 50) < AC) 1. b) If $D(X) = 8 then by Chebychev’s inequality P20) PiX-v0i>s-N< bad B 15. b) 10V8 1s. a) ofa PR ==) 25 | 25 Thus 6(X) = 218242 = 0 and Var(xX) = SBPMay?s0%49" _ G2 b) Let Sq = DT, Xi where each X; is one play of this game. Then we wish to know P(Sioo > 25). E(Sico) = VO0E(X) = 0 and Ver(Sioe) = 100Var(X) = 350, Furthermore, the sum of 100 draws should look pretty close to normal, so Posi 2 25) 1-0 (% Section 3.3 1. BUX) 1/4, SD(X) = LVF; B(5) = 6.25, SD(S) = 8 8) PS < 0) = 0 (28598) 05. b) P(S = 0) wa (gaatat) — 0 (285928) 22.09. 9 P(S > 0) 1-6 (gaaaae) 92, 18, Let X; be the amount won on the ith bet. Thus X; = 6 if you win on the ith bet, and X; = —1 if you lose. Let S be the sum of 300 dollar bets, and we wish to find P(S > 0). 5 a 7) = 6x S41 x B= 0.07895, E(K) = Ox Ft -1 x E = 0.0789 SD(X:) = 1/36 1898)? = 2.36 5 E(S) = 300 x ~0.07895 = 23.68, Eo SD(S) = VI00 2.36 By the normal approxim: 40.98 P(S > 0) 1-casn) 0228) «gon 19. Let X; be the weight of the ith guest in the sample, and let S = Xi +X + ‘P(S > 5000). Use the normal approximation, with £(S) = 4500, and 50(5) = V0 x55 Need the area to the right of (5000 ~ 4500)/301.25, 1 ~ (1.66) = 1 ~ 0.9515 = 0.0485. + Xoo. Want 01.25, .66 under the standard normal curve. This is 20, a) The probability that your profit is $8,000 or more is the probability that your stock gives you a profit of either $200 per $1000 invested or $100 per $1000 invested: this probability is .5. b) Let Sioo be the sum of the profits for the 100 stocks invested in, and let X be the random variable representing one of the profits. E(X) = WEHOOIO-100 = 59 and Var(X) = MettIOHP 40% 44—100y* _ 59? — 12,500, E(S1o0) = 1006(X) = 5000, and since the investments are independent, Var(Sioo) 1,250,000. Using normal approximation, 2000 ~ so — 5000 2950 ~* ("Seamer") —*-* ie) P(Sieo 2 8000) Jn the numerator of the normal approximation term, note that the 8000 50 is just the familiar continuity correction since Sioo can only take on values which are multiples of $100. 2. a) Let X; be the error caused by the ith transaction; thus X; is uniformly distributed between —49 and 50. Let Sigo be the total accumulated error for 100 transactions, and then we wish to know P(Sico > 500 oF Sree < 500). We have sao Sb Var(X: So E(Siea) = 50 and Var( Sic 83,325, Using normal approximation, PUSiml > 500)=1~ (# (SOR SES) 0 ( 52 bb) Using the same notation as in a), we now have that E(x: ‘except that the coefficient when j Var(X:) Thus £(Sic0) = 37.88 and Var(Sico) P(|Sioo| > 500) = 1 — ( (2 22. a) Let X; be the face showing on the ith rol; then E(Xi) = 1/2, and SD(Xi) = V/35T2. Let Xn be the average of the fst roll. Then B(Ra) = BUG) = 7/2, and SD(Xq) = SD(Xi)/ Vi = S35/A2H, and 5 1 = ? 05 ko) = P(X = w+ ke) + P(X =p ko) =2- gh =, ‘which isthe Chebychev bound on the above probability. ©) PUY = ul <0) = 0 = PUY — ul > @) = 1. 
So absolute deviations from the mean are > with probability 1. On the other hand, the average of squared deviations (variance) = 0”. So the deviations must actually equal 22. Suppose P(Y ~ = a) =p, 20 P(Y — = ~2) Then 0 = BUY ~ 4) = pot (1 9)(-2 = 1/2, That is, P(Y = 4 +6) = P(Y = #0) = $, which is the above distribution for 26, a) 1.5) Note that ‘ar(|X = nl) = E(IX ~ pl?) ~(E(IX — al). From this, the result follows immediately. ‘That the ‘equality holds if and only if |X — p| is a constant follows from the first inequality. 27.) Use the fact that expectation preserves inequalitis: () 0 X 12 0 A(X) <1. (ii) Note that X? < X, so E(X?) < E(X) and. 0 < Var(X) = E(X) = (BOOP ¢ EX) ~ (EOP = w= 0? « “The inequality 4 — a? < 1/4 follows (rom calculus, or from completing the squate: == st 55 Section 3.3 b) The results are trivial if @ So assume a 0, s0 9X — X? = 0 with probability 1 (see below). Hence X takes values only 0 or 9, Conclude from E(X) = 9/2 that P(X = 0) = P(X =9) = 1/2, that is, that half the list are 0's and the rest are 9's Claim: If Y > 0 and E(Y) =0 then ¥ = 0 with probability 1 Reason: Suppose not. Then P(Y > a) > 0 for some « > 0. But by Markov's inequality, we have P(Y > a) < E(Y)/a = 0, contradiction! 28. Var(S) =, pi( — pi) = np(1 — p) - Dlr: - »)?- 29, Dy = (Di +--+ Da)/n, where each Di is uniformly distributed on {0,...,9}: P(D. {for each B. So E(D:) = (1 +2+---+9) = 9/2. ‘To get the variance of D:, note that D; +1 is uniform on {1,2,...,10}. So by a previous problem (Moments of the Uniform Distibation), Var(D:) = Var(Di +1) = 235 = 33/4 and SD(D. VB. a) So guess int (9/2) = 4. b) [fm =1, then the chance of correct guessing is P(D, would be the same no matter what we guessed. Heino 1/10. The chance of being correct, If'm = 2, the chance of correct guessing is P(A < (Di + Ds)/2 <8) PUD + Da or 9) = P(Di+De = 8)4P(Di4 D: Note that the distribution of Dy + Ds is triangular with peak at (Dy + Dz = 9), so guessing ‘Dz =4 indeed maximizes the chance of guessing correctly. For large n, the standardized variable Z, = 2Y10, distribution. So the chance of correct guessing is ava Dn 21 < v3) ‘The normal approximation now gives 33: PAS Dy <5) = PUIZI <1 65: P(A < Dy <5) = PUIZIE V2) has approximately standard normal P(izls VS) Plas Da P(lZnl < /n/33) 20.99 <> /n/33 > 2.58 <> n> 220. 30. D} takes the values 0, 1, 4, 9, 16, 25, 36, 49, 64, 81 with equal probability, so the X; are independent ‘with common distribution z ofilals|els PR t[2}2[4|2[2 So E(X:) = 9/2 and Var(X:) = 9.05 => SD(X:) = 3.008. a) By the law of averages, you expect X,, to be close to E(Xx) b) P(IXn — 4.5] > €) = P (Sigg! > 2s) ~ P (I21> 2s) = 2” (2 > a)» where Z has standard normal distribution. For n = 10000, we need ¢ such that P[z> | Aherefore ¢ = 2.81 x 3.008/100 = 0.085. c) Need n such that P(|Xn— 4.5] < 0.01) > 0.99 i.e., 4.5 for large n. So predict 4.5. 0025 + 9M = 2.81, P [121 < 4x21] 2020 gS 2258 therefore w > 602276, a) We have calculated (Xi) = 9/2 and Var(Xi) = 9.05. From the previous problem, we have B(D:) = 9/2 and Var(Di) = 33/4 = 8.35. Since D; has smaller variance than does Xi, the value of Dy can be predicted more accurately. «) Since E(Xigo) = 4.5, you should predict the first digit of Xo to be 4. ‘The chance of being correct is Pla < Rio <5 w P (|ERgcen| < & per> e+) ©) BW) = B(P 1) = £(r) Var(T = 1) = Var(T) = o/s. ae. 4) Var(W) Ta) Use the craps principle. Imagine the following game: A tosses the biased coin once, then B. tosses it once. 
If A's toss lands heads, then say that A wins the game; if A’s toss does not land heads but B’s does, then say that B wins the game; otherwise the game ends in a draw. The chance that A wins this game is p; the chance that B wins this game is gp: and the chance of a draw is @?. 58 » 4 8. ab) @ Section 3.4 You can see that A and B are really repeating this game independently, over and over, until cither A wins or B wins, and that the desired probability is P(A wins before B does). By the craps principle, this is 5 = rf Another way: In terms of T, the number of rolls required to produce the fist head, the desired piobability is PUD is odd) = PT =1) + PCT = 3) 4 PUP = 8) to pte ta Pt = hs Apply th rans principe tothe situation in) the dasied probability ie P(B wine before A dos) Fis = thy. OF compute P(T is even) = 38x. Or subtract the answer in a) from 1, because Sheet the layers must see a head sometime Apply the craps principle, this time with the gime consisting of A tossing the coin once, then B tossing it ewie. The chance that A wins ths game jsp, while the chance that B wins is q= @? Se the hans tas A ete he St Bead 7/9), andthe hans that D gt the Sat hed ~ @)/(1= 9"). Note that 1? = (I —9)(1 +a-+4?), 0 P(A gets the fest head) simplifies Saeed gh Alteratvey in tees ofthe sendom arable T the chance the A gels the fot head is the probability that T is of the form 1 4 3k for some # = 0, 1,2,.. similatly for B. Solve slygr = 5 to get g = St and p = 254 = 381966. ep is very amall, then the fact that A gets to toss frst doesn’t confer much of an advantage to A. However, since B tosses twice as often as A does, you would expect that the chance that B ‘gets the frat head is clove to 2/3. Indeed, as q tends to 1: P(A gets the frst head) P(B gets the first head) — Suppose that the player's point Xo is z = 4, 5, 6, 8, 9, or 10. Then P(win|Xo = z) is the probability that in repeated throws of a pair of dice, the sum x appears before the sum 7 does. In the notation of the Example 2, imagine that A and B are repeating, independently, 2 ‘competition consisting of the throw of a pait of dice. On each throw, if x appears, then A wins; i€7 appears, then B wins; otherwise a draw results. The desired probability is the chance that ‘A wins before B docs, which, by the crape principle, is p23 = [2{[slaf{s{[o][7][s{o]{wlutn Play) a/ae [ayse | a7se | a7o6 | 9/96 | 6736 | 9736 | «736 | 2736 | 2736 | 1796 FoiniXe=sy| o | 0 fas laolsul| 1 [slenol ae] 1 | 0 Plwin) = D2, PwinlXo 0 $43 36-24 ig 324 ir 9. Let X be the payoff. Then PIX ‘That E(X) = £(W?) = Var(W) +[E0W)} Your 2) =)" fora = 1.2, X = W*, where W has geometric (p) distribution on {1,2,...} for p = 1/2. Since W has jp = 2 and vasience q/p? = 2. net gain sn by X 10, so B(net gain) = —4. That is, you are going to lose $4 per game in the long run 10. a) Condition on the first outcome and use the rule of average conditional probabilities: Let $= (Gest trial results in success), F = (fest trial results in failure). Then P(X = 2) = PUX = nl5)P(S) + P(X = al F)PLFY Note that PUX = nlS) = PWe 4g n> 2, 59 Section 3.4 where Wr denotes the mumber of asses until the frst failure, which has a geometric distribution on (1.2...) with parameter ¢. 
Similarly P(X = n|F) = P(Ws =n —1)=4"*p,n 22, Therefore P(X =n) ato"s'p for n= 2, 2) Duplicate the method fr finding the moments ofthe geometric distribution: E(X) = > ng™'p (Lo.»)~1) +9 (LG.a) 1) (csp -1) +9 -1) Le d=1, where P(r) =D, ©) Similarly E(X*) = D2, n?p*!g + DS weap Gen) +e(nea—y) = 0 (ci8fs ~ 1) +0 (cigs 1) = e+ ta. Finally, use Var(X) = E(X?) -(£(X)P - 11, Write ga =1—pasg a) P(A wins) = P(A tosses H, B tosses T)+P(A tosses TH, B tosses TT)+P(A tosses TTH, B tosses TTT) + = page + (qage)page + (gage) Page + b) P(B wins) = ;2428-. (Interchange A and B.) ©) P(draw) = 1 = P(A wins) ~ P(B wins] 4) Let W be the number of trials (in which A and (by either A or B). N has range (1, 2,3, P(N = k) = Plfirst 1 both toss) required until at least one H is seen . )- For k= 1,2,3, ils see TT, kth tr does not see TT) = (gaga)*"(1 ~ gagn)- Or: Bach trial is a Bernoulli tial with success corresponding to the event (at least one Hl is seen among the tosses of A and B). The probability of success on each trial is then 1 — P(no Hl is seen) = 1 qaqa. Therefore N, the waiting time until the Rrst success, is geometric with parameter (I - gage). 12, Write gr = 1 = pisga ~m a) PW, = We) = DZ, PWs = ks = 8) = Siete = pBE = te b) PIM < Wa) = DEL, PW & €) By symmetry, it's GER. Checks (a) + (b) + (e) = 4) Put X = min(W;,W). For £=0, 1,2... we have P(X >) = PW, > k and Ws > k) = PW; > &)P(Wa > b) = ated = (oven)? - Seo een So X is geometric with parameter 1 — qigz = pi + 92 — Pir. 60 Section 3.4 e) Put ¥ = max(Wi, Wa). ¥ has range (1,2,3,..). For a PUY Sn) = P(W; Sn and Wa Sn) = PCW) adil — P(Wa > n)] = (1-97 )(1 — 92). 1,2,3,.. we then have PY =n) = PY £m) ~ PLY Sn— 1) = (1-9 (1-98) - = af 0-2). 13. a) Let g=1—p. Condition on the first two trials, and use recursion, as follows: Let B = (first draw is black), and WB = (first draw is white, second is black). Then P(Black wins) = P(Black wins|B) x P(B) + P(Black wins|W B) x P(WB) = 1x p+ P(Black wins) x 9p “Therefore (Black wins) = (White wins) = 1 = P(Black wins) = 7225 Or use the craps principle: You can imagine that Black and White are repeating over and cover, independently, a competition which consists of each player drawing once at random with replacement from the box. This competition results in a “win” for Black with probability p, and ‘a “win? for White with probability q2. ‘The competition is repeated until one player “wins”: the chance that Black “wins” before White does is p/(p +4"), which is equal to the previously calculated chance. ') Set P(Black wins) = 1/2. Solving the quadratic equation gives p = (3 — v3)/2 = 0.381966 . (The other root is greater than 1.) <) No, 34 is irrational. 4) No player has more than a 51% chance of winning if and only if 49 < P(Black wins) <.51 This happens if and only if 375139206... < p < .388805726.. Reason: Observe that f(p) = p/[1 —(1—p)p] is an increasing function of p € [9,1]. If0 1. Va isa random variable having range {n,...,2n~ 1}. For w 2n — 1 we have 4) = (Vq =k, kth trial is success) U (Vq =k , kth trial is failure) ‘The two events on the right are mutually exclusive. The first event is (Vo =k, Bub trial is success (exactly m= 1 successes in fiat 1 trials, Eth rial is success) and has probability Plenactly n— 1 successes in first k— 1 rials, £th trial is success) = Plexactly 1 successes in frst k~ 1 teals) P(KUA tial is success) = (er)ertaton en = Caietats in Hence PW, sly the second event has probability (! = Caer eet), 1S. a) Let £2 0.and m2. 
Use problem 1: PCE = km fF >) = SR SEEN = EE = @p = PUP = my 61 poe Da Section 3.4 b) Let F assume the values 0,1,2, pe = (1 ~po)* po. ‘To prove this claim, introduce the tail probabilities ty = PUP 2 k) = pat peas tea + Put m = 0 in the Property to see P(P =kIF > k) = P(F =0) for all b> 0, Which is equivalent to Be = po for all k >0. Use pe = te fast and 1 po = ts to get 82 = & forall E> 0. ‘This leads Lo the solution ty =(41)* for all E> 0 and P= (4)! po = (1 ~ po}tpo, forall E> 0 as claimed a) kD 1 then my ES eat > Ci PO be) I'm is a mode, then m > 0 and Pipa sand gf 2 1. that 11 follows that (r= E-1 gm sie-0)8 So if (r ~1)f is not an integer, then there is a unique mode, namely int ((r — 1)3). I(r —1)$ is zero, then there is a unique mode, namely 2ero. If (r ~ 1) is an integer greater than 0, then there are two modes, namely (r ~ 1)2 and (r ~ 1)8 — 1 17. Let X be the number of boys in a family and ¥ the number of children in a family. Observe that 18. given ¥ =n, X has binomial(n, p = 1/2) distribution; therefore PX = 8) = PU HY = 9) PY = 9) = SS (emrre ~) Set jan-k: P(X =B) Set s = k-+ 1 and multiply and divide by (1 ~ £) JE (ome 9) (ppt ( PUX = 3 Genet E(P)oW 2) where T, denotes the number of failures before the sth success in Bernoulli(p/2) trials. Thus the last sum equals 1, and 20 OP pve, MRS Gaye B2 a) PG =a ° otherwise 62 { (? +77)“ 2.4, Section 3.4 b) G= 2X where X has geometric (p? +47) distribution on (1,2,..). So B(G) = 26(X) = ast ¢) Var(G) = 4Var(X) = tPys 10.) r plus expectation of negative binomial b) Let = numberof hends in m tose Pets <2) = Pete 1 let Nig: be the additional number of trials (boxes) required | obtain an animal different from all previous. Then Ts = Ni +... 4Nn, and Ni is a geometri xndom variable on { 1, 2, .. } with parameter p; = (n—i4+1)/m (i= Ian). The variables A. , Na are independent, s0 Dover =O eae -LF OG Hence on = $001) = (0? Eo k- 9 Ed)” - ) Note that of 0) m1 — W126" we 120+ ws 0.222, »b) Proceed as in a). Let a be the average requited. Want 0.0. Take logs to get a = 44 4. Assume the number of misprints on each page has the Poisson (1) distribution. Thea P(more than § misprints per page] 0037, Assume the number of misprints on any one page is independent of the number on the other pages. ‘Then in a book of 300 pages, the number of pages having more than 5 misprints has the binomial (300, 0037) distribution, which can be approximated by the Poisson (300 x .0037) distribution. Therefore P(at least one page contains more than § misprints) = 1 — ¢~ 9087300 — = 33 = 67, «Assume that N, the number of microbes in the viewing field, i average density in an area of 10~* square inches is $000 x 10" PIN > 1) =1- PW a 8 Poisson with parameter 2. Since the 05 and 0.5 microbes, we have 6. Assume the number of drops falling in a given square inch in 1 minute is a Poisson random variable with mean 30. Then the number of drops falling in a given square inch in 10 seconds will be a Poisson random variable with mean 5. So P(0) = ¢7* = 0.00674 - a) The number of fresh raisins per muffin has Poisson (3) distil per muffin has Poisson (2) distribution. The total number of raisins per muffin has Poisson (5) distribution, assuming the number of fresh raisins per muffin is independent of the number of rotten ones ution. The number of rotten raisins b) The number of raisins in 20% of a muffin has Poisson (1) distribution, so Tne ts .3670, P(no raisins) 8. 
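A simulation sketch (not part of the original solutions) of the two Poisson facts used in the raisin problem above: the sum of independent Poisson(3) and Poisson(2) counts is Poisson(5), and keeping each raisin independently with probability 0.2 (a 20% piece of the muffin) thins a Poisson(5) count down to Poisson(1).

    import numpy as np

    rng = np.random.default_rng(0)
    trials = 200_000
    fresh = rng.poisson(3, trials)                 # fresh raisins per muffin
    rotten = rng.poisson(2, trials)                # rotten raisins per muffin
    total = fresh + rotten                         # should behave like Poisson(5)
    piece = rng.binomial(total, 0.2)               # raisins landing in a 20% piece
    print((total == 0).mean(), np.exp(-5))         # both near 0.0067
    print((piece == 0).mean(), np.exp(-1))         # both near 0.3679, as above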
Once again assume that the number of pulses ceceived by the Geiger counter in a given one minute period is a Poisson random variable with mean 10. Then the number received in a given half minute period will be a Poisson random variable with mean 5. ae Pay= 9. a) PIX=LY P(x PY =2) 09989, 65 Section 3.5 10, ML 12, 1s. aa. b) Note that X +¥ has Poisson (8) disteibution. Therefore P(X + ¥)/2 > 1) = P(X + ¥ > 2) =1- P(X + <1) =1-(e +793) = 8008 6) PIX 9= O36" a) EQQX +5)= b) Var(3x +5) 2 ele] a) X-4Y is Poisson with mean 2,0 P(X +¥ ='4) = e-724/4! b) Again, X+¥ is Poisson with mean 2, £0 E[(X+¥)] =Var(X +¥)+(E(X + Y)P = n+? =6 c) X4¥-+Zis Poisson with mean 3, so P(X+Y +Z. Satya! ‘The total number of particles reaching the counter is the sum of two independent Poisson random variables, one with parameter 3.87, the other with parameter 5.41 (these numbers come from a famous experiment by Rutherford, Chadwick, and Ellis, in the 1920's). So the total number of particles reaching the counter follows the Poisson (9.28) distribution. So the required probability is et fi g.n0 4 gn 4 ay gt) = 04622. 3) mle) = SERIE x? 2.69 x 1082 of) = az) = 5.19 x 10° 297? b) o(z) = 4 719 x 10-* em. Using the random scatter theorem, the number of tumors a single person gets in a week has approximately Poisson (A = 10°*) distribution. Since different people may be regarded as independent, and different weeks on the same person are also independent, the total number of tumors observed in the population over a year is a sum of 52 x 2000 independent Poisson(10~*) random variables, which is a Poisson(1.04) random variable, b) The number of tumors observed on a given person in a year will be a Poisson random variable with rate parameter A= 52x 10-*. Thus 1 P(none) = 1 — e°9%**? = 0,00051986 ‘Thus the number of people getting 1 or more tumors has distribution binomial (2000, 0.00051986) = Poisson (2000 x 0.00051986) = Poisson (1.04) Pat least one tumor ‘The histograms are essentially the same: Poisson with parameter 1.04. The means and standard deviations are 1.04 and V.04 = 1.02 respectively. Why are the answers to a) and b) nearly the same when they are counting different things? ‘The reason is that the chance of a person getting 2 or more tumors is insignificant, even when compared with the chance of getting 1 tumor. 66 Section 3.5 15. (a) The chance that a given pages has no mistakes is “—"{(2U° = ¢~®! = .99005 and thus the ‘expected number of pages with no mistakes is 200 x.99005 = 198.01. The number of pages with mistakes is distributed as a binomial(200, 99005), so the variance is 200 x .99005 x .00995 197. (0) The mistakes found on a given page is distributed as a Poisson(,009), so the chance that atleast ‘one mistake will be found on a given page is 1 ~ <——{(20")" — -~-9 — 00896, The expected rnamber of pages on which at least one mistake is found is then 200 x .00896 = 1.79. ages (c) The number of pages with mistakes can be well approximated as a Poisson(1.99) since 1.99 is, the expected number of pages with mistakes. Let X = number of pages with mistakes and use the Poisson approximation to get PIX 22 1 (POX ) + PX = 1) = 1 (4 1.99 ‘Assume that chocolate chips are distributed in cookies according to a Poisson scatter. Let X be ‘the number of chocolate chips in a three cubic inch cookie. Then X has Poisson (6) distribution, PIX <= Dh 2851 b) Let 23, Za, and Z) denote the total number of goodies (either chocolate chips or marshmal- lows) in cookies 1, 2, and 3 respectively. 
Assume that marshmallows are distributed in cookies according to a Poisson scatter. By the independence assumption between marshmallows and ‘chocolate chips, Z; has Poisson (6) distribution, Zz and Zs each have Poisson (3) distribution, and the Z;'s are independent. We have P(2, = 0) =e", PlZ2 =0) = PlZs = 0) =e ‘The complement of the desired event has probability P(2y = Za = Za = 0) + Pls > 0,22 = Za =0) + P(Za > 0,24 = Za %=0) sete 16.) 1) + P(Zs > 0, Zs y+ (—emF)(e #4 201 — e*er# and the desired event has probability virtually 1 ent = 6.27 x 10-7 117. Assuming the distribution of raindrops over a particular square inch during a given ten second period is a Poiscon random scatter, then the number of drops hitting this square inch during the ten second period has Poisson( = 8) distribution. a) So P(no hit 0.006738 . 1) Argue that if NM; denotes the number of big drops and Nz the number of small, then Nx and Nz ate independent and Ny has Poisson(3 x5) distribution, Nz has Poisson( x 6) distribution Hence ) = PUM, = 4) PUN = 40/3)" 219 6/3)" _ g o0ar1. 18. a) Let Sq denote the number of survivors between time m and n+1, and let Jy denote the number of immigrants between time n and +1. Then Xngi = Sa + In where Jy has Poisson (1) distribution, independent of Sq, and PUSn = KIXn a= (ja-ate *OckSz. Claim: Xq has Poisson (uSrz.g(t ~ p)*) distribution, Proof. Induction. True for n'='0. If the claim holds for n = m, where m > 0, then, putting A= uD Toll - p)*, we have for all > 0 ( ) = So PlSm = HX = 2)P(Xm = =) PUSm > (je = atte 67 Section 3.5 19, 3-2. = 2)" H 2 eG ~ ait Ss Gey * = eel Se 50 Sm has Poisson distribution with parameter A(I—p) = «og ( a). Since Fs inde- Rendent of Sm and has Poisson (1) dstsibution, it follows that Xmq1 = Sm tlm b sisson Gistribution with parameter + HS 7PR9(1 ~ P)** = # Dyes (1 p)*. So cla lds for aam4l b) Arno, wELX(1 ~ p)! — #, 00 the distribution tends to the Poisson (3) dist sion. 2) (2) = Dien =D ead 8) eyanem, ") ame So BX) =m, BKK =) a0, BEX = YOK —2) Ea tm BOO) = $308 +H) 2 we? 4) BX = #)*] = BOP) ~ SEP) + SEO? Skewness (X) = ptt = Jp ee 20, No Solution 21, ¢) 0.58304 4) 0.5628 ¢) 0.58306 68 Section 3.6 Section 3.6 1 a) fis b) 4/50 ) 4x 199 2a) 1/3 b) 1/46) (19. 12 11 x 10 x 9)/(52 x 51% 50 49 48) d) (4B XAT H 46 245 x 4)/(52 « 81 x 80 x 49 x 48) B. a) 8/AT b) (12x11 x 10x 9 x 8)/(51 x 50x49 x 48x47) CIMA AIMS Isa 1/4 1 feta) ») a 4) 1i2jsfa trp ope | o nm syorpoipe lo o1forfor|o or porpor por €) Ty = 1+ number of non-defectives before first defective, Ts ~T; = 1+ number of non-defectives between first and second defective, and 6 — 7 = 1+ number of non-defectives after second defective. P(T; = m1, 7; ~T; = n2,6—Ty = ng) = .1 where it is not zero, which is a symmetric function, so the random variables are exchangeable. 4) By ), Ty ~T; has the same distribution as T;, so see a). 5. Plone fixed box empty) = (4% Bx? C5) +00)" Var(X) = EX? - (EX? =6(4 1) +0 1, where 1, is the indicator of the ith ball being in the ith box. Thus So 6. a) Consider M =, F(A) = nB(I » Section 3.6 Thus SD(M) = VERT =1 €) For large n, the distribution of M is approximately Poisson(1). Intuitively, the distribution is very much like a binomial(n, 2) except for the dependence between the draws, but as the number of draws gets large the dependence between draws becomes small, and the Poisson(1) becomes 2 good approximation, 1 ada ») (854) BE 8. a) (X= k b) BUX) = 26-4 <) SD(X) = \/¥ 4) POX > 15) = P(X 2 14.5) = 1 ~ 8.824) = 2061 ‘The normal approximation gives a fairly good answer. 
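A software check (an aside, not part of the original solutions) of the comparison above. The figures quoted (mean 13, SD about 1.82) match the hypergeometric setup of dealing 26 cards from a standard 52-card deck and letting X count the cards of one color among them; that setup is assumed here.

    from math import comb, erf, sqrt

    N, G, n = 52, 26, 26                           # deck size, cards of the chosen color, cards dealt
    def pmf(k):
        return comb(G, k) * comb(N - G, n - k) / comb(N, n)
    exact = sum(pmf(k) for k in range(15, n + 1))  # exact P(X >= 15)
    mu = n * G / N                                 # 13
    sd = sqrt(n * (G / N) * (1 - G / N) * (N - n) / (N - 1))   # about 1.82
    approx = 0.5 * (1 - erf((14.5 - mu) / (sd * sqrt(2))))     # continuity-corrected normal tail
    print(round(exact, 4), round(approx, 4))       # about .203 (exact) and .205 (normal approximation)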
You can check that P(X = 13) = 21812, P(X = 14) = .18807; therefore the exact answer is, by symmetry, 2028, 9, Let bi,b2,.+..be denote the B bad elements in the population, and define 1 if bad element 8; appears before the first good element 0 otherwise ht htt lp. ‘of bad elements before the first good element 8) E(X ~1) = Bh) + Bla) +--+ Ba). By symmetry, all the expectations on the right hand de are equal, and equal to P(by appears before fist good element). ‘This probability equals 1G +1) Let 41, 92-++.90 denote the G good elements. Consider the G+ 1 elements 91, $2, =. 9a, 1h, We are interested in the position of by relative to the g's. We ean choose this position in G41 ways, all equally likey. Exactly one of these choices puts by before all the g's. So P(t appears before rst good element) = 1/(G + 1). Conclude: By B#G41 N41 GH Gti“ G4i E(x 1] -{£(X - 1). Now E(X) = (XK -1) +1 b) We have Var(X) = Var(X ~ 1] E[(X-1)] = E[(h the tet fay) Leuy+OD,, es 1 Be gig + BB - NEI) Is and, by symmetry and the fact that J)? Elita) = P(by and bs both appear before the first red card) 70 10. uu. 12, 1s. Section 3.6 (Reasoning is imilar to above: Consider the positions of bi and by relative to the g's. There are G-+2 positions to fil, and we can choose the pait of positions for b: and by in (2/2) ways. Only the pair (1,2) gives the event we want.) So Var(X: wer mee Dey 142 alee aa B_ [G?74+3642+42BG +28 -2G~2~ BG-28 oat (G4 IGF1) 3 ; (Steees ) = BOW +1) Gai \(GFyG+n) ~ G+ yIE+7) and SPOD= VEE G+D No Solution 8) Pleisess, 0) =1/(0) if 21 ++ btm = 9 and 0 otherwise b) noc) yes 2) There are (M) possible ways to draw a numbers from the set (I-+-N). Thus the process of taking a simple unordered random samplé from this set gives the chance of any given subset of numbers as "Y=" Tf we consider the process of the exhaustive sample, where the Xi 's are the draws in order, the chance of getting any particular ordering i, for example, Ne We P(X, = BL Xa = G,X5 = By Xw = B)= where the ordering of the numerators may be different but the product must be equal to which is the same as it was for the sample of size n from {1-+- NN) b) This is essentially done in the previous problem; (Way PUT = tons au Wi ©) The object here is to count the number of sam elements bigger than {This mumbe is (() have i 1 elements les than 1, and n — ‘) and the total number of samples is (*), 0 the chance is 4) Given D, writing out the probability of any given sequence of Us, Uja),---,Uin) gives the same formula as that in pact b). P(D) wAlSgtl so P(D) > (Ha2)" — Las N= 00 for fixed 2) uniform over ordered (n + 1)—tuples of non-negative integers that sum to 4) BWW.) =(N = mdatz, ECT) = (WN —n) gt $1), For N= 82 and n E(W,) =96, E(T,) = 10.6, E(Ts) = 21.2, E(Ts) = 31.8, E(T,) = 42.4 2) PU, + Wa in the jal ease n= 0 when prob. is I) n OS CEN (or (= N, prob is 0 except Section 3.6 {) Because the W; are exchangable, Dry has the same distribution as N ~ (Wi + Wa) ~ 2. 4) = P(W1 + Wasi = N ~2~d). Now use e). (Tn) — ET) 1 14, No Solution 15. 8) We through We are spacing between two aces (W; and Wns ate not). To get two consecutive Wa must be 0. The expression in the problem = 1 — P(not all 0, f= 2..m) b) Put dowa IN places in a ow, and color of them blue, the rest ted. ("5") is the number of ways to place the good elements avoiding all the blue places. There is a 1-T correspondence between such choices and ways to make (W > & for each i) =ty=1. Sot mh 16. No Solution 12 Chapter 3: Review Chapter 3: Review 1.) 
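Before the review solutions, one more simulation aside (not part of the original solutions), relating to the ace-position expectations quoted in Section 3.6 above: by the exchangeability of the spacings, the expected position of the r-th ace in a well-shuffled 52-card deck is r(52 + 1)/(4 + 1), i.e. 10.6, 21.2, 31.8 and 42.4.

    import random

    trials = 100_000
    totals = [0, 0, 0, 0]
    for _ in range(trials):
        deck = [1] * 4 + [0] * 48                  # 1 marks an ace
        random.shuffle(deck)
        positions = [i + 1 for i, c in enumerate(deck) if c == 1]
        for j in range(4):
            totals[j] += positions[j]
    print([round(t / trials, 2) for t in totals])  # close to [10.6, 21.2, 31.8, 42.4]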
1-(5/9) H 10/6 eas a) ME) = GO ©) } (1 — Pleame number of sixes in first five rolls as in second five rolls)) = Die (Dosorero+P} 2 a P(first 6 before tenth roll) = P(at least one 6 in first 9 rolls) = 1 — (5/6)? = 806194 » Plthizd 6 on tenth rol!) = P(iwo sixes on fist nine rolls, 6 on tenth roll) = (QJemrenren = 046507 a Pithrce 6's in first ten rolls | sx 6's in fist twenty rolls) = P(three 6's in frst ten rolls, three sixes in last ten rolls)/ (six 6's in frst twenty rolls) _ Carey sror" (2) a7 67/6)" _ (8)? “ COTE E/E* ~ey amen 4) Want the expectation of the sum of six geometric (1/6) random variables, each of which has ‘expectation 6, So answer: 36. ) Coupon collector's problem: the required number of rolls is 1 plus a geometric (5/6) plus a geometric (4/6) plus etc. up to a geometric (1/6), so the expectation is 1 + (6/5) + (6/4) + (6/3) + (6/2) + (6/1) = 14.7. 8. X:max(Ds, D2) ¥ = min(D1, Da) rolyt Bah " PUK = 2) = PX <2) P(X $21) = (2)*— (24 ie ( rorcazcae| B for & lorisy 25) = 3%2,, (°°)(9/19)' (10/19). Note that the gambler’s capital, in dollars, after 50 plays is 100 + 10¥ + (—10)(50 ~ ¥) = 2207 — 400. So P(not in debt after 80 plays) = P(20Y—400 > 0) = PLY > 20 C2) apr9yc “199%, b) E(capital) = E(20¥ ~ 400) = 206(¥) ~ 400 = 20(80)(9/19) ~ 400 = 1400/19 ; Var(capital) = Var(20Y ~ 400) = 400Var(¥) = 400/50W/9/191/10/19) = 1800000/1 ©) Use the norial approximation to the binomial: £(V") = 23.7, and SD(Y) = 3.53, so PUY > 25) = 1 ~ 6 (28532 (51) = 305 PUY > 20) = 1 ~ 6 (2255242) = 1 — 6(—91) = 8186 Ta) P(2 4 heads in 5 tosses)= = 0.1875 14 9, 10. cr 12, 13, = Note that E(X?) = Chapter 3: Review b) P(0,1, or 2 heads in § tosses)= 36 = 0.8 ) —poss_vals | 0 2 6) probs vas a) Let X = Dit---+Ds, where Ds is the numberof balls until the frst black ball, Dz is the number of balls drawn after the first black watil the second black ball, and so on. When drawing with replacement, the D; are independent geometric( -t) random variables, so X has the negative binomial(s, 38>) distribution on 8, + 1,... kaw Boyt yw yt = (21) Ga) Ge)" we ) When drawing without replacement, is possible to draw the &th black ball on any draw feom bottw. P(X = P(X = k) = P(b-t blacks in fist k-t draws) X P(kth draw is black | b-1 blacks in frst k-t draws) (62s) (020) E=He+ Let X.Y be the numbers rolled from the two doubling cubes, and let U,V be the numbers rolled from (wo ordinary dice. Then (log, X,loga ¥) has the same distribution as (U,V) . 8) P(XY < 100) = P(log, XY < log, 100) = PU + < 6.68) = $1 PU +V = 1)) = 5/12. b) PUXY < 200) = PU + V < 7.64) = 5/1241/6 = 7/12. €) E(X) = 21, s0 by independence E(XY) = B(X)E(Y) = 441 4) E(X?) = 910, so Var(XY) = BU(XY)"|—(B(XY)P = 633619 and SD(XY) = 196. a) Ax (1A) tig 'b) number of matches = JT7_, 1(match occurs at place j) E(mumber of matches) = n x Assume X has Poisson(2) distribution. P(X < 2) = 406, P(X = 2) = 271, P(X > 2) = 323. Following the hint, Props (Lt ptt pea Pom(ltat tp, hence and and finally ‘arX + (E(X)P = 0? 412, same for B(Y)?. By independence we have EUXY)') = £00) B() = (0? #2? E(XY) = (EXE) = 0 and s0 Var(X¥) = E(XY)) ~ (E(X¥)P = (0? +2)? = (a7)? = 15 P(e? + 2n?) Chapter 3: Review 14. a) This is 1 minus the chance that current flows along none of the lines. The chance that eutrent does not flow on any particular line is the chance that at least one of the switches on that line doesn’t work. 
So answer: 1 ~~ pi) 72) — 93) ~ PI), b) Let X be the number of working switches, ‘Then X is the sum of 10 indicators, corre: onding to whether each switch works or doesn’t work. All the indicators are independent. 2(X) = pi + 2e2 + 3p0 + APH, Ver(X) = pias + pata + 3930s + 40048, and SD(X) isthe square root ofthis. Alternatively, the number of switches working in line é has the binomial distribution with parameters i and pi, independently of all other lines. X is the sum of these ten binomials. Formulae for E(X) and Var(X) can be read off the binomial formulae. 15. a) Binomial (100, 1/38). _b) Poisson (100/38) ) Negative binomial (3, 1/38) shifted to (3,4, 4) 3x38 to fia) arate esis een eo te | 3 PD a [a2 [04 [a2 [09 [2 [08 4 ‘The distribution of D on {1,2,...,9} is symmetric about §, so A(D|D > 0) =5. Therefore E(D) = E(D|D = 0)P(D = 0) + E(DID > 0)P(D > 0) = 0 +5 x (73/100) = 3.65. fone six and all different ene al diferent) $0 the required probability is 1/6 17. P(one six | all different) a) Numerator = (y2,) 4; denominator b) The numerator is YE Plone sic and all dierent |W PN the denominator is JP Pal diferent |W = a) PW = so the required probability is 2604, 18,4) The return (in cents) from the game is Xahth+--+ho, where those of all previous cards dealt 0 otherwise { 1 if the number on deal jis greater than ‘That is, Js is the indicator of the event that a record value occurs at deal i (a record is considered to have occured at the first deal), and X simply counts the number of records seen. A counting where y = 57721 (Euler's constant). (This approximation was used in the Collector's Problem of Example 3.4.5]. So 5.19 cents is the fair price to pay in advance, 76 Chapter 3: Review ») The gain (in cents) from 25 plays is S=XtXa test Xs, where the X's are independent copies of X. The /i’s are independent so that Vor(x) = Severn) = Soi (1-2) esse Zaast <2 spe) = 186 Using the normal approximation, obtain P(S > 26 x 10) = 1 - (12.8) ~9. 19. ¥ : numberof failures before fist success, P(Y > 9) + Poison (a) independent of ¥ Y pre y= Soe 6065 if p= 1/2, 20.) Let p= 1/242 (where x could be negative). So 1 ~p=1/2~z, and P17) = (1/2 + 2)(1/2—2) = 1h 2" 1/4 since 2? 2 0. b) The margin of error in the estimate is V/p(1— p/m, where n is the sample size, and p is the proportion of part time employed students. This assumes sampling without replacement. For sampling with replacement the margin of ertor would be smaller, due to the correction factor. No matter what pis, D. f4 a SVin by patt a), so we should take the smallest n so that /I/dm <.05, i., m = 100, PC 21. X has negative binomial (1, p) distribution, Y has negative binomial (2, p) distribution, and X and ¥ ace independent. Now suppose a coin having probability p of landing heads is tossed repeatedly and independently. Let X’ denote the number of tails observed until the first head is observed; let ¥ denote the additional number of failures until the third head. Then (X,Y) has the same distribution as does (X',¥"). So Z has the same distribution as X'4Y" = the number of tails eeen until the third head: negative binomial distribution on { 0, 1,1.) with parameters r= 3 and p. 22. Suppose the daily demand X has Poisson distribution with parameter A> 0. (Here A = 100.) Suppose the newsboy buys m (constant) papers per day, n > 1. 
Then min(X,n) papers are sold in a day; the daily profit x» in dollars is 1 Hq =f min(X,n) — and the long run average profit per day is Bro] = GElmin(X,n)] CClalmn: For each m= 12,3, (1) Blmwin(X,n)] = AP(X et) (2)B{min(X,n)] = > PUX > ki) Part (2) holds no matter what dsteibution X has. 7 Chapter 3: Review Bfoin(X,n)] = Sian) P(X = 4) Yess Dee +nP(X 2041) ay oy i Poy Perey =APIX Sn) +aP(X Dn 41). (2) Use the tail sur formula for expectatios Snow Phmin(X,n)] = D> Ploin(X,n) > 4) a) Say A= 100 and the newsboy buys profit (in dollars) is 100 papers per day. ‘Then his long run average daily Erie) eee 1 $ £lmin(X,100)] ~ 1-100 [100P(X < 99) + 100P(X > 101)] ~ 10 by Claim (1) 5{1 — P(X = 100)] ~ 10 5 — 25P(X = 100) = 14 00 (100)! 70 1 ee © 0.04 by Stitling’s approximation. Tix 100 ened P(X = 100) b) ICA = 100 and the newsboy buys m papers per day, then his long run average profit per day is Fira eee 1 Elonin( X99] ~ J jim by Claim (2) Elrwen—) Since the function k —» P(X > E) ~ $5 is decreasing, is postive at = 1, and is negative as k— oo, it follows that Eta) is maximized at n* = the largest k such that P(X > k) ~ a > 0, i.e, n* = 102. Thus the newsboy should buy 102 papers per day in order to maximize his daily profit ~ assuming that demand has Poisson(100) distribution. 28. a) The drawing process can be described in another way as follows: think of the box as initially containing 2n half toothpicks in m pairs. ‘Then half toothpicks are simply being drawn at random without replacement. The problem is to find the distribution of HY, the number of halves remaining after the last pair is broken. By the symmetry of sampling without replacement, H has the same distribution as H", where H’ is the number of draws preceding (.e., not including) the first time that the remaining half of some toothpick is drawn. This may be clearer by an analogy. Think of 2n cards in a deck, 2 of each of m colors, and imagine dealing cards one by fone off the top of the deck. Then If corresponds to the number of cards remaining in the deck after each color has been seen at least once, If the cards had been dealt fram the bottom of the eck, this number would have been the number (corresponding to H1') of cards dealt preceding 78 24. Chapter 3: Review the first card having a color already seen. Since the order in which we deal the cards makes no difference, it follows that H and H' have the same distribution. Thus for 1 k): For each k= 1,2, pe In Inat n+l we have PU > b) = PCH! > k) = P(first & colors are all different) 2m Qn=2 In —2{k— nat e-(F-1) b) Asn 00, P(E >) = PH 2 VIR) Rex(-77/2). ‘This limiting distribution of H/ Vis called the Rayleigh distribution (see Section 5.3) ) This approximation suggests BUH) = SPH > h)~ Soe (-E) = vn ee ser avin [ies = Vine ‘That is to say ° E(H) y Vem as n 00. d) Expect about YT00-m = V3T4 = 17 ot so. a) HP(X=3,¥=2.7=)=1/8, P(X =2,Y =1,2=3)= 1/3, Px 2= 1/3, then P(X > ¥) = P(Y > 2) = P(Z > X) = 2/3, b) Let p= min( P(X > Y), PCY > 2), P(Z > X)}. Then PS EIP(X > ¥) + PUY > 2)-4 P(Z > XI} EUI(X > ¥) + HY > 2) +1(Z > X)) $ 2/3, since I(X > Y) + 1(Y > Z)+1(Z > X) $2. €) Say each voter assigns each candidate a numerical score 1, 2, of 3, 3 for the most preferred candidate, 1 for the least preferred. Pick a voter at random, and let X be the score assigned to A.Y the score of B, Z the score of C for this voter. Then P(X > ¥) ="Proportion of voters who prefer A to B®’, ete. Soa population of 3 voters with preferences as in a) above would make each of these propertions 2/3. 
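A quick verification (not part of the original solutions) of the cyclic example in part a) above: three equally likely score triples (X, Y, Z), each with probability 1/3, for which P(X > Y), P(Y > Z) and P(Z > X) all equal 2/3.

    from fractions import Fraction

    triples = [(3, 2, 1), (2, 1, 3), (1, 3, 2)]    # each triple has probability 1/3
    p = Fraction(1, 3)
    print(sum(p for x, y, z in triples if x > y))  # P(X > Y) = 2/3
    print(sum(p for x, y, z in triples if y > z))  # P(Y > Z) = 2/3
    print(sum(p for x, y, z in triples if z > x))  # P(Z > X) = 2/3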
79 Chapter 3: Review 20. ar. 28. a) Let P(x, PU P(X = 1, Xa ‘Then all the p bilities are (n—1)/n, and the minimum probat (a= 1)/m. @) We have P(X > Y1 PY > Z) PZ > X)=1 =p ‘These will all equal gif — pi =9, pa (dpe wad 1— (1 — pa)p2 = 9, that is, if 1 ean the equation of the golden mean. (L. Dubins) of r1{[2 [sla ye YT 97a | 12/36 | 10736 | 4736 | 1736 by 10/3 o it o om rains o={$ § 2 it a) We wish to find 2 such that (.99)" b) This is essentially a negative binomial with the roles of success and failure reversed. Let Fi be the number of failures (successful honks) before the 4th success (failure to honk). Then E(Fi) = $2582) = 396, but for our problem we must also add 3 non-honks to get an answer of 399. £12) = J s005¢“p + 915004 ep = 1009+ 24p + 3a?p + 4a + Setz) +500 >a p 440) zap cee . rfl = 100(p + 2¢p +3479 + 49°p +'S0"p) + (5009") + (409° >) = 186.43 + 156.62 = 343.047 3) Ww > Land yA then PW, = WY, 9) = PU Ave Xen €AXe = 9) = PUL ¢ A) (Xun ¢ ADP(Xe = 9) == P(A Pil) Pi(y)/P\(A) and sum over y to get PW, ue in this way to show that for all k > 1 4) = POW, = wi) PO Sum over w to get PCY Con! (= PEAT PA, PW, 20. Ys = We = HY 1)-- P(Wa = wa) P(e = ve) (Idea: Express the event on the let in terms of the independent variables X1, X2y..-- ) 80 29. ¢) uniform on (0,1, Chapter 3: Review b) By the weak law of large numbers: Asm — 00, Pe LK: €AB)/n PAB) Lie RE A)/n PAY nm) d)noyes e) sts) sty Di MX € AB) = PA(BIA). 80. No Solution 31. No Solution 32. No Solution a) edhe: Pl co in agian) = Pe a lla a began ed ing) Lay 5 Pick al F edi red in bag)PU red in bag) - Qy2y" 2 (0/2 FO) * FEA where X has binomial (n,1/2) disteibution. 2b) 1Ck= 1: then B(X) = n/2, 50 P(all n red in bag|pick red] k= 2: then E(X2) = Var(X) + [E(X)P = $+ 02, 20 Pall n red in bag|pick both red) ay ) the & balls are drawn without replacement from the bag. follow the argument in ), but replace 2 with a with fae apy (oe P(all n red in baghpick all k red Dao BEGG EM where again X has binomial (n,1/2) distribution. On the other hand, if the & balls are drawn without replacement from the bag, then the number of red balls seen among the & has binomial (&,1/2) distribution: it's as if we drew the k balls ditectly (rom the very large collection. So = Plbick all & redlalln red in bag) P(all w ced in bag) _ 1-(1/2)" Pall m ced in baglpick all & red} PG ay a a) Ik = 3: Since (X)s = X(X = 1K = 2) =X — 3x7 4 2x, compute FU?) = 619649607) 2614) = S49 ( E42) 2 (2) «sD and the result in a) simplifies to 1 need in all 3 red) = . Palm edn bagick al 38) = 81 Chapter 3: Review {) Let p be the proportion of red balls in the original collection of balls. Repeat the argument in 6) to get P(all n red in baglpick all & red) = a BEC) (where X has binomial (n,9) distribution) and (all m red in baglpick all bred) = Fe. ‘Therefore E(X)x = (n)ap* Check: This formula gives B(X) = E(X)s = (n)ip = np and E[X(X ~ 1)] = E(X)2 = (nap? = s(n — I)p? from which i flows Var(X) = BLX(X ~ 1)] + £(X) — [BOOP = p(t ~ 2). 34, No Solution 35, No Solution 36, a) The kth binomial moment by is just given by » o = 2h hy a(S) pF an Co? = aoe =e be ar. a) (2) 4h 38. a) Using the Ath binomial moment results, we observe that the kth factorial moment is just kt times the th binomial moment. Let Ai be the event that the ith letter gets the correct address, ‘Then Mn = S77, As and the kth factorial moment is just inp)? = np ~ np? = np(l—P) Sela) =H SD P(A Aig Aig) =H 1b) Let X be Poisson(1), then the th binomial moment is BAX) = BU XX =). 
(XE) Stone xg ener Se xyes Chapter 3: Review <¢) Use the expression of Exercise 3.4.22 for ordinary moments in terms of factorial moments. ‘Asn — 00, the Ath moment of Mn are equals the kth moment of X for all k n)= (1 Thesetre - prsny=09 «= (1-3) for example 1A = 124, shen eure 92280 0) tab psn age. Ob hat (> (the first n trials yielded all different objects). By comparison with the birthday problem (see index) we obtain P(T>n) (-#) 0-2) (0-27) IC A is large, the right-hand side is approximately e~ “SS” , and this approximation is good over all values of n. Thus if M is large, POT > ny we en and PUT exp x01 2H > ne 2A log 10 ae LEVEE EMTS TO + . For example, if Mf = 1024, then require n = 70. ) We claim that for each real 2, dim P(T < M(log M + 2)) First, a heuristic justification: We have T = maz(T)."3,..Tw) where T, is the number of (Grom the beginning) cequited to see object i. Then each T, has geometric distribution on (1.2...) with parameter 1/M, so PCT, < M(log M +2) =1- P(T. > M(log M +2) M 1H (eB i AL large 84 ee Chapter 3: Review Now if M is large, the random va les Ti, Tay.nsTar ate almost independent; hence P(T < M(log M + 2)) = TS, P(T; < M(log M +2) Now, a more rigorous justification: Let k be a positive integer. We have (> b) = (one of the M objects has not been seen in the firstétrials) = UM, (object { not seen in k trials), By inclusion-exctusion, POT > k) = P {UM (object é not seen in & trials)} Jo Plodiect i not seen in k tials) ~ > Plobject i, j not seen in fsa) tot (-1)M' Plobjects 1, .., M not seen in k trials) = (1) 2) = (a8 8) -Senr (4) 0-4) P<) ay. ) (0 Let x (real) be fixed. Replace & by M(log M +z) and investigate the behavior of the probability as M = oo: PUP M(log M+ 2))= SoU-ay (4 (: Nema Dens. where om, = {omega o M Note that for each j = 2) ety We evaluate the limit as M — 00 of P(T by = P{URE bjt: ent been sen a tsal)} an =e (MP) a)"; Leo (P) O-&) an (42 iF PITS) dew (” )( #): os Ut(log f+) pers Moe + 2) = Sou (MP) (1g) et oo (Hleuristically, we may view the collection of M tickets as consisting of half desirable and half undesirable. Tis the time requited to obtain all the desirable tickets. Since roughly half of our trials result in undesirable tickets, it would take twice as long to obtain all the desirable tickets as it normally would had there been no undesirable tickets at all. In other words, if T" denotes the number of trials needed to obtain a complete collection of M/2 when drawing from the set of M/2 objects, then T is distributed approximately like 27"; hence if Mf is large, then P(r < M(log M+ 2)) = PET < M(log M+ 2)) = Pers Keg +2) Ths fo Mr Per ga)n08 me ne Mog M4225 1024, then require m =: 8700. Xp 4 Xa tant Xuan where Xi = 1 and for i = 2,3,....M, Xi is the additional number of trials (after obtaining the (i — 1)st different object] requited to obtain a new object. Then X, has geometric distribution on (1,2,...} with parameter pi = (M~i+1)/M, i= 1,..M, and the {X,} are mutually independent. Hence For example, if M mp Mp MB M Br) = BH LeeLee 1,4 -u(4 varen = Se vencxy=5- (4-4) wt gy) ~ Mloe2 a a = oe, aa and up eee Sota) EG) ap ( at 1 1a a = Ge arp aie)“ eat +H) ~0 86 (og2)) = (1 ~tog2) a = 0. Chapter 3: Review contrast this with the full set of variables X;, X2j.uXae, where the latter variables have very high expectation). Hence T Mlog2 E() Vitt=t053) ~ S007 has'approximately normal (0,1) distribution. In fact, it can be shown that for all =: dha, ? (05 Mog + VNTR) sis. (Fatty =) ot Set ®(2) 9 eer 1.282. 
Then for M large, we have PUT nx Mlog 2 + 1.282V/M(1 — log), For example, if M = 1024, then require n = 730. 87 Chapter 3: Review 88 ©1993 Springer-Verlag New York, Inc. Section 4.1 2 a) The desired probability is the ares under the standard normal density y = Lve-""/? between © and z = 0.001. The density is very nearly constant over this interval, so the desired probability is approximately 0.001 x width of rectangle)(approx height of rectangle = (width of rectangle)(approx height o Te * ve b) Similarly the desired probability is approximately 1 xe = 0.001 x = .000242. 1 To00Vaxe ») Q B(x?) Thus Var(X) = (X?) — (BUX)? = a) Since f is a probability density, f°, f(z)dz = 1, so eff x ~2i¢2 =e($- 2) | gmene b) PUX <4) = fd a(t ~ 2)de = 32? - 22" Remark. Once you note that f(z) is symmetric about £ (draw a picture), the answer is clear without calculation, = fS6z(~ 2dr 232? -[ ©) PKS 4) By the diference rule of probabilities, PUR 1) = 2% 5S aaaede i. ardgede = co (see Example 3} 4d) No, because E(IX!) = J%, aatyed= 6. a) P= P(X $0) = P(A5# <4) = 6 (—8) ee — 8 = — 4303; Ba POX <1) = P(SS# < 158) = (SH) me Se = a303. Subtract: b= 8606 <=> 9 = 1.162. Hence w= 43030 symmetry, x must be located halfway between 0 and 1 If P(X <1) =} then (454) =} exo 15# = 6742, This implies that 1 = .6742 + 4303 = 1.1045, and o = 9054, and = 3896. 5. Or you may easily see that, by 7. Let X be the height of an individual distribution of X is approximately normal (u,27), with icked at random from this population. We kuow that the 0 (inches), and P(X > 72) = 1. That 9 = P(X S12) = P(A (where Z has standard nor that the height of an indivi Bg) = P(Z <3) =#(3) distribution). Hence 2/o lual picked at random exceeds 74 inches is P(X > 14) = P (42% > MSH) = P(Z > $) = P(Z > 2.56) = 1 — (2.56) = 0082, In a group of 100, the number of individuals who are over 74 inches tall therefore has binomial(100, (0052) distribution. By the Poisson approximation, the chance that there are 2 or more such individ ‘als is approximately (with ye = 100 x 0052 = 52) (1 + 52) = 096. 18 from the normal table, and the chance 1-(er peta) ete 90 8. 10. ne 12, aa. Section 4.1 o(2222) -9(B8=2) — sau TT aT Since the Ceatral Limit Theorem says that the average of a large number of measurements will bbe normal, it is not necessary for the measurements themselves to be normal, although if the measurements are extremely skewed then 100 may not be a large enough number. ‘The distribution of S is approximately normal with mean 2 and variance 4 x Pts. 23)e1-0 (aeh) =1- 00.29) =1 9582 = 0418 a) (22% (6.45) — (1.29) = 0.0985 Da oats ©) 8(1.28) = 0.90, 20 the weight = 9.7800 + (1.28 x 0.0031) = 9.7840 gm. a) (0.43) = 3,50 0.43 and o = 0.2325. 029 oa w 0 (-222,) ~ 2 (- 82%) = 2086)~1 = 0102 6) (0.618) = 012s the diameter e015 mips of ow the meas. 1~ (0515 x0.2325) = 0.84 4) Range of X: (-2,2] -2<¢2<2, then Sl e)de = PX € dz) = AGED = 32 —[el)ds, [)/4. Elsewhere f(z) = 0 4) M22 <0, then Se)dz = P(X € de) = Gases Wo ge <1, then flaps = Ming Elsewhere f(z) = ) Range of X :[-1,2} ‘The density willbe linear on (—1,0}, constant on (0,1), linear on (1,2 4(Q2+z)dr ‘Area = 2h ‘To make area = 1, h must satisfy 2h = 1, ork = $ Let X denote the length ofa rod produced. 
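An aside (not part of the original solutions) on the normal-curve height calculation above: the same numbers can be reproduced directly, assuming as in that problem a mean height of 70 inches and P(height <= 72) = 0.9, with 1.28 taken from the normal table.

    from math import erf, exp, sqrt

    z90 = 1.28                                     # normal table value used above
    sigma = 2 / z90                                # (72 - 70)/sigma = 1.28, so sigma is about 1.56
    p_tall = 0.5 * (1 - erf((74 - 70) / sigma / sqrt(2)))   # P(height > 74), about .0052
    mu = 100 * p_tall                              # expected number over 74 inches in a group of 100
    p_two_or_more = 1 - exp(-mu) * (1 + mu)        # Poisson approximation, about .096
    print(round(sigma, 2), round(p_tall, 4), round(p_two_or_more, 3))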
Note that ao the probabiee of ntereat roman the sane under 8 Y Tear change of cele So, without los of ge Seng pene ey ant a ie whe gue a) PUX] > .75) = 1/16 from the figure 0°" 05 00 05 10 91 Section 4.1 b) PUXI < SIX] <.75) If the customer buys n rods, the number N of rods which meet his specifications (2--uming, independence of rod lengths) has binomial (n, .8) distribution. Need m such that P(N > 100) > .95 =» P(N < 100) < 05. Now by the normal approximation to the binomial, Pun < 100)~ 9 (Sestaheh) So solve *(9aR) so 14. (continued from Exercise 13) Again, the probabilities remain the same under a linear change of scale. Let ¥ be the length of a rod produced by the current manufacturing process. To get the mean and standard deviation of ¥, note that the density of X is given by S165 ep n> 134 mare { Ol Bes Var(X) ‘Therefore E(Y) = B(A) = 0 by symmets IQ — 2 )de= b So ¥ has normal (0, 1/6) distribution. Var(¥) 2X7) = f*, 2701 ~ tele a) PUY| > 18) = 2x [1 6(V6 x.75)] b) PUY| <5) =2x (V6 x.5) - therefore P(IY| < 5II¥1< 75) So you should choose the manufacturer of this exercise, since each rod that you buy would have a greater chance of meeting your specifications (although not by much). 15. a) (0.1/2) b) exf(z) = 20( VE) — 1c) 02) = (erl(2/V3) + 1)/2 92 Section 4.2 Section 4.2 1. Let X denote the lifetime ofan atom. Then X has exponential istibution with rate A= log? a) P(X > 5) =e = (1/2)* = 1/32. B) Find ¢ (years) such that Ps yen ee aa oe t= BE ae c) Assuming that the lifetimes of atoms are independent, the number Ny of atoms remaining after {years has binomial (1024, e™) distribution. So find ¢ such that log 1024 zy 4) Njo has binomial (1024, 1/1024) distribution, which is approximately Poisson (1). So by the Poisson approximation, B(N,) = 1 <= 1024e™* ets 10. P(N = 0) & e~! = .3679. 2. a) Let X denote the lifetime of an atom, then X is exponentially distributed with rate A = !982 log per centary. Now we have 10° atoms, and we wish to find the time such that we expect 1 ‘out of 10"* atoms to survive, which we can do by solving P(X >t) = jqrr for ¢. We know that P(X > t) =e and so ~(log 4)(t) = ~18log 10 and finally ¢ = 189832 fog b) This is equivalent to saying that there is about a 50% chance that no atoms are left. Let Ne be the number of atoms left at time t, and we wish to find f such that P(N, = 0) =.5. Note that Ne will be a binomial (n,p) where n = 10%, Note further that this will be approximately. ‘2 Poisson with x = np. Thus we observe that 29.9 s 30 centuries. "rnp? P(N, =0) = SE land so np = log? and finally p = !°82, We also know that p = .S' since p is the probability {hat given atom will survive fort centuries, and #0 log 2 er Tog 3 So after 67 centuries there is about a $0% chance that no atoms are left, 3. Let T be the time until the next earthquake, then we have in general that P(T'< ¢) =1— 4) The probability of an earthquake in the next year is 1 — ¢ ¥) Similarly, P(T <5) ams. <) PIT <2) : 4) P(r <0 6321. =e 9 = 0.99995 4. Let W be the lifetime of a component. Ther W has exponential distribution with rate 2 10. 8) POW > 20) = <2 30.135 b) The median lifetime m satisfies 1/2 = PW > m)= 93 Section 4.2 6 ¢) SPW) =1/=10 4) If X denotes the average lifetime of 100 independent components, then E(X) = 10 and SD(X) = 10/V/100 = 1 so by the sormal approximation P(X > 11) = = P(X $1) 1 9(1) = 1586 e) Let W and We denote the lifetimes of the first and second components respectively. Then P(W, + Wa > 22) = P(N <2) = 67? 42.267? = 35457 where W has Poisson (224 = a) PW. 
£2) <1 PW, > 2) = 1? = 86, b) PUT <5) = PN(0,5] > 4) =1~ P(N(0,5] $3) = 1 e-*(1 +5-425/2 + 128/6) = 73 €) E(Ta) = BW, + Wa + Ws + Wa) = £(Wi) + Ba) + BW) + BW) = tLe Note. Ty has gamma (4, 1) distribution, .2) distribution. Let Nz be the number of hits during the first 2 minutes, and Ny be the number of hits during the first 4 minutes, Then Nz has Poisson (2) distribution and Ny has Poisson (4) distribution, and. POST, <4) = P(T > 2)- PT > 4) "Wa $2) — P(Ne $2) “tent SE) a se? net most To compute the density f of X, argue infinitesimally: Heat = PUK € (tt 4t)) bs)P(X (ide § + fus(O)dt-F J# PC € (41-4 401A = shy) PCA = hs) where fy, is the density of an exponential random variable Yi having rate 1/100, and fy, is the density of an exponential random variable ¥; having rate 1/200. Hence HO) = 5Fl) + Ff (Ot> 0), a) P(X > 200) = $P(% > 200) + $P(¥ > 200) b) BCX) = $E(%) + F(a) = 3 x 100-4 § x 200 6) E(X?) = $E(Y2) + 2B(VP) = $2 x 100? + 2 x 2 x 200? = 6 x 1007. Therefore Var(X) = E(X?) ~ (E(X)P = 29922. fer200/i00 4, 2¢°200/200 = 29 ter JO er tetde = HFK) a) (et 1) = fo eretade = b) Note that £(1) = fo etde = So P.-1) = AP(e) = rl = NEGF = 1) oo HIE) = a ©) BUT) = for he'd = P(n +1) = Var(P) = £(1) (ECP =2-1 =1 = SD(T) = 1 4) POT > u) = PUT > afd) = 6-0 = 6, so AT has exponential (1) distribution, and E(QT)"] = nt =e BT) = nr". SD(AT) = 1 => SD(T) = 1). 94 10. ue 32. Section 4.2 a) Let X = int (T). X takes values in (0,1,2,...) and P(X = 8) = PEST 0 such that pm = 1—e-3/™ ‘e* and consider P(T > 1)); next argue that for each rational 1 > 0 we have e-* finally argue that this last must hold for all real ¢ by using the fact that Bak git hams (Use 'Hopita’s rule, or series expansions). ince Tm ST $ Tm +, we have E(Tm) < E(T) < (Tn) + i. Let m + co to see E(T) Similarly argue that £(7%) ~ } as m— oo, and therefore that £(7?) = 3. a) Want to show P(T > t) =e, Write A = 10~* seconds, and consider t = nA forn = By assumption, 1,2, PUTS (n+ 1)O(T> nd) = AO for all n=0,1,2,...5 equivalently PUL > (n+ 1)AIT > na) AO for all n=0,1,2,... Therefore PIT > 0)=1, PT > A) = PUT > 0)P(T > AIT > 0) = 1x (1-24) = 1-24, PUT > 2A) = P(T > A)P(T > 2AIT > A) = (1 — AA)(A— AA) = (1-4)? In general, P(T > nd) 1 ~ AQ)". Use the approximation 1 ~ AA = e~** to conclude PUT > na) = (e794) = Pur (=n: P(T >t) ze b) PQ 0, s0 fa is maximized at 0. Ifr > 1, then the derivative is zero at 1" = (r ~1)/2, is positive to the left oft", and is negative to its right. So ¢* yields a local maximum for the density. But the density is zero at ¢= 0 and tends to zero as t — 00; hence the density achieves its overall maximum at t* If <1, then the density blows up to oo as ¢ approaches zera. b) For k=0,1,2,... we have erty= [ea 1 re+h) Fa Ta Ty [Cermetacn SMe Hence BT) = 5 95 Section 4.2 1s. 14. 16. a. BCT?) = de et = ope Var(T) = i — (3) = a SD(T) = ¥. 2) Bsifmate 4 = 1/20 = 5% per day b) Na has binomial (10, 000, e~*?*) distribution. Therefore E(Na) = 10, 000e~ 4", $D(Na) = 100/e-7°(1 — e-47™), From this calculate Elio) = 6065: SDM) B(N2o) = 3679; SD(Na0) EUNs0) = 2081; SD(Non) Option b) is correct: The probability that a component fails in its first day of use is 1—e~"/#®, which is approximately 1/20 = 8%, because | —e"* =z as z —+ O; and is less than 5% because ef 31-2 for all x (See Appendix III) Exact value is 4.877...%. 2) 60 days 2) 40 days ©) Peat > 60) = Plat most 3 flares in 60 days) = e-9(-434 8 42) since the number of filres in 60 days has Poisson (.05 x 60) distribution. 13e~* = 64723, Say a total of components will do. 
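A quick check (not part of the original solutions) of the Poisson(3) probabilities used in the flare calculation above: the number of failures in 60 days has mean 0.05 x 60 = 3, and the cumulative probabilities below include P(at most 3 failures) = 13e^{-3}, about 0.647, the value just computed.

    from math import exp, factorial

    mu = 3.0
    for k in range(7):
        cdf = sum(exp(-mu) * mu**j / factorial(j) for j in range(k + 1))
        print(k, round(cdf, 4))                    # k = 3 gives about 0.6472, k = 5 about 0.9161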
Since P(Thetat > 60) = P(Neo <& ~ 1), we require P(Neo Sk ~1) 20.8 By trial and error, we find P(Neo < 5) % 0.91608, so total of six components (five spares) will do, Redoing the satellite problem: 2) 80 days b) 20/7 dave ) Guess the answer to c) should be larger, because now 60 days is more standard deviations below the mean of 80 days. In fact, Tort has the same distribution as the sum of 8 independent exponential (.1) variables, 50 P(Tietat 2 60) = P(Nep < 8) = .T44 where now Neo has Poisson (6) distr Redoing the preceding problem: Since P(Neo < 10) = -9161, it follows that four spare components will do, 96 Section 4.3 Section 4.3 La) P(r $)=1- PT >) =1-G). b) Pla ST <8) = P(T > a) - P(T > 8) = Gla) — G(s). (Since T is continuous, P(T > a) equals, P(T > a) equals G(a).) 2, Suppose T has constant hazard rate: Say A(t) = ¢ for all ¢ > 0. Use (7) to get Gat > 0. ‘Then the density of T is, by (5), t>0, £G() _ 1 = #90 = 0 T has exponential distribution with rate Conversely, if T has exponential distribution with rate A, then for each t > 0: F(t) = eG) = PIT > t) = EAL) = al 4 Qs oo (- fae) =e (— frou") xp(-M*) = G0). (i) = (Gj: differentiate G with respect to t: dna a e-awe -taw = date = #0. (i) = Gi: =e 26 (ii) & () = =Aatt? = (0) 5. Let b> 0. a) E(T*) = fr ets(ayde = JO ratte M dt = f(g) ertde = acMe [> seeds = tenant ») By (a), ary = ater (241); erasers | vence vere =a {r (241) —[r (2 +)]"} 6. We have A(t) = 1/20 0-5 ¢< 10, and A) = 1/10: €> 10 97 Section 4.3 2) PCT > 18) = G(18) = exp (— f° (wee) = exe {-U° 1/2040 + f2afoaw’ 2670. by 1010 then GC) = exp (~fu/20.4) = i> 10 then G(t) = exp {-Lf°(1/20)au + fi,(1/10)du)} = «WCE ED, 10 os 00 — o 3 10 5 20 2 af hei o fo" e“*do = 7. a) Integrate by parts the telation E(T*) = f° Ps(0at b) E(T*) is 400. So SD(T) is /TOT= TOUR = 9.268, ©) UT denotes the average lifetime of 100 components, then E(T) = E(T;) = 17.7245 as (7) = SD(T,)/100 = 0.9265 so by the normal approxims P(T > 20) = 1 ~ (2.456) = 0.007 8. a) Only for a > 0,6 > 0, and either a > 0 orb > 0. ¥) GW) =exp~ (#7 +61) 98 Section 4.3 24) ©) 10) = (at + b)exp— 4) £(D) = JO God = pep (sf +01) ae ee [exp —3 (t+ £)* dt = Pe fo eA ( = VE (1 a(6/ Va) ¢) Compute E(T*) using £ (31? 407) = f° (f+) (ate exp then use Var(T) = E(T*) ~ (£(T)P. for uew"du =; 9. a) Put (5) in (6) to get Frc) penne) aera 0 = GH = aay gee = Heat. 2) fe Aw)du = fff los Glu)du = —1o8 G(0) = wsaty == [epee = ay =e0{- [ rove} 10, a,b) Smaller, since foreach 4,¢ positive: = PUP > ste T>9) PIT> s +e > 9) = SESE _PIT>s+0) “PT > s} nar (- [7 0s) Jorn (- x02) = en(- [xe) =e (- ['xe+ne) sew(- [aere) since ais increasing = PT > 9, €) ICA is decreasing, then the inequality reverses, 99 Section 4.4 Section 4.4 1 2, Exponential (A/c) Use the linear change of variable formula. If T has gamma (r,) distribution, with density fralt) = a then the density of eT (¢ > 0) is Ser(u) = Lfealule So eT has gamma(r, A/e) distribution. Apply this twice to see that T has gamma (r,A) distribution iff XT has gamma (r,1) distribution ‘The range of ¥ = U? is (0,1). The function y = u? is strictly incteasing and has derivative #¢ = 2u for w € (0,1). So by the one to one change of variable formula for densities, the density of Ys fru) = fel) /| ‘The density of X is fx(z) = 1/2 for x € (1,1), so by Example 5 the density of ¥ is Lely) + fxl-vi) 344 0 fro) RO = ES = pve os Note: This distribution is called the beta (1/2, 1) distribution 5. The range of ¥ = X? is (0,4). 
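For hazard-rate problems like the piecewise-hazard exercise above, the survival function G(t) = exp(-L(t)), where L(t) is the integrated hazard up to time t, can be evaluated directly. A small sketch, not part of the original solutions, for the hazard rate equal to 1/20 on [0, 10) and 1/10 thereafter:

    import math

    def cumulative_hazard(t):
        # integral of the piecewise-constant hazard rate from 0 to t
        return t / 20 if t < 10 else 10 / 20 + (t - 10) / 10

    def survival(t):
        return math.exp(-cumulative_hazard(t))

    print(survival(18))   # exp(-(0.5 + 0.8)) = exp(-1.3), about 0.27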
The density of X is fx(=) = 1/3 for = € (1,2), 30 by Example 5 the density of ¥ is 7 Sxl) + Sx(-VO. We 4, Sela) = ify (0.1) ten this imple to 444 = 5 IE (14) the his sims to $82 = gh. forthe heigl 0 6. Notice that ¥ = tan®, Its range is (00,00). The function y = tan is strictly increasing with derivative sec? ¢ for ¢ € (~x/2,/2). So by the one to one change of variable formula for densities, the density of ¥ (8) of ls | 1 we ecm feo ‘The Cauchy distribution is symme ee fr(u) = fry). ‘The expectation of a Cauchy random variable is undefined since I. lyfr(v)]dy ={ =54 ces) 7 ome 2f ai 0 the required integral docs not converge absolutely. 100 Section 4.4 7. IU has uniform (0,1) distribution, then rU has uniform (0, x) distribution, so xU ~/2 has uniform (-.}) distribution. (See Example 1.) Now apply Exercise 6. 8. a) Apply the many-to-one change of variable formula for densities: If z = 9(y) = yyy, then the density f2(2) of Z = o(¥) is given by kea= sn / (waned = > reo f\=29/0-487F1 = | ) 4 ar 1 Vel) Soe=i/r,a=-1/2, 8 = -1/2. b) mesa [a Now if we let # = sin?(u) then ds = 2sin(u)cos(u)du and we have 1 [SRV asin(u) cos(uleu pee ayn) [lade 0 ee 7 a 2sin(u) coa(u)du 7h Vain (a) cos*(u) Poe ae ae Nore epee) Sue store ) . d) ~ 82)= © (wr) 101 Section 4.4 (substitute y = tan8); so Var(Z) = 1/8. 9. Let a> 0. a) The range of ¥ = T* is (0,00). The function y = ¢* is strictly increasing and has derivative # for {> 0. So by the one to one change of variable formula for densities, the density ofY is 10. sein = ta /| | ey > 0. b) By Example 4, X = -A~'log U has exponential (A) distribution. So it suffices to show that XM has Weibull (A, a) distribution. T has range (0,00). The function ¢ = 2'/* is strictly easing, so by the change of variable formula, the density of Tis fl) = won /]|#| mae [Lacie Saat", eo. a) The range of ¥ = [Z| is (0,0). Use the change of variable formula for many to one functions. to get the density of ¥: for y >0 Svs) = YD fel2) = fal) + fa(-y). slaty 269) = Vi b) The range of ¥ = 2? is (0,00). By Example 5, the density of ¥ is, for y > 0, fal Ji) + fev) Wi vi) we Here fz(2) = 6(2) is symmetric, so Sr(v) feo Hagar ¢) The range of Y = 1/2 is decreasing with derivative density of ¥ is 20,0) U(0,c0). If s #0 then the function y = 1/2 is strictly — 2h . So by the one to one change of variable forns.la, the ruoy «sof = oy)? ey #0, Section 4.4 4) The range of ¥ = 1/2? is (0,00). By Example 5, the density of ¥ = (1/2)? is, for y > 0, Suy2l Ji) + fiz 29 fly vo =yerret 11. Duplicate the solution to Problem 1 of Example 7 for a sphere of radius r: vir cosrdd P(@ € d8) = EES 1/2 to x/2 to get “n ee [costes as where A= total area. Integrate from (On: If we accept that the area of a sphere between two parallel planes is proportional to the distance between the planes, then for a sphere of radius r, the surface arca between two parallel planes a distance 4 apart must be 2rrA (take the planes very close together and near the equator). So the total surface area of a sphere of radius r is 2xr + 2r = 4x1? 103 Section 4.5 Section 4.5 Le “oO 1 2 3 4 Multiples of 10, 2 fl | eaeenttee zits Fee) | 1/8 1/2 7/8 1 10 — 00 me 7 ee v0 b) Since P(k) = (1/2)* for k= 1,2,3,..., we haver Ma > 1 then F(z) = Dy<, Pee) = Dy (a/2)* = 1 = (jaye; Ie <1 then F(z) =0, 10> — os — 69 ——___ ee oT a sa SOS 104 Section 4.5 3. a) ¥ has the same distribution as X, s0 Wis otherwise ysol Wwist y>l b) ocr 0, then Faxsaly) = PaX +6 Sy) Ia <0, then Fexqe(s) = P(aX +6 < y) p(x2 #4) = assuming P(e) is «continuous function of = 8. 
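Exercise 9(b) above gives a way to simulate a Weibull variable from a single uniform. A quick check (a sketch only; NumPy assumed, lambda = 2 and alpha = 1.5 chosen arbitrarily, and the Weibull(lambda, alpha) survival function taken to be exp(-lambda * t^alpha) as in the exercise):

    import numpy as np

    rng = np.random.default_rng(2)
    lam, alpha = 2.0, 1.5
    u = rng.random(100_000)

    # X = -log(U)/lam is exponential(lam), so T = X**(1/alpha) should be Weibull(lam, alpha)
    T = (-np.log(u) / lam) ** (1 / alpha)

    t = 0.7
    print(np.mean(T > t))               # simulated P(T > t)
    print(np.exp(-lam * t ** alpha))    # Weibull survival function at t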
fx <0 then Pele) = [2a eMdy = [2 fetay = Jet 220 then Fez) = Fx(0)4 POO 1/2) 21 F(U2) = 7/8 0 <0 a? ogre 0 rBt ©) E(X) = f ef(eyde = fo sete = f23%de = 4/4 4) Let ¥., Ye, Yo be independent uniform (0,1) randam variables. Then for 0 <0 PUSH) 2 OEzS1 1 bt 105 Section 4.5 so if X = maz(Yi, Yo, Ys) , then PIX S2) =P Se SM <2) =(P% <2)? 0 z<0 2 {» 0g =} 1 2d1 = Fle), Te Srluldy = fr(t)dt where ¢ 8) fey) = Ae7"2y, v2 0 b) BY = [PPA dE = Af Oe Made = GD = A = 0.51 when A= 3 jerse c.f. P(Y < y) = P(T <9") = 1-€ y=, fee 8. Let Li denote the lifetime component i, and let L denote the lifetime of the entire system. OY =u, say. Solu =e” so a) The system fails when and only when both components fail, so L = max( 1, a). If¢> 0 then PUL>¢ = P(max(Li, Lz) <1) Pll Stl <0) ~a ey 103), 0s 00 b) Here £ = min(Ly, £2). By Example 3, L has exponential distribution with rate 1/11 + 1/12. So i€t> 0 then PUL > t) = enn tii 106 Section 4.5 ) Here L = max(Lrop,Lastion), where, by Example 3, Lop has exponential distribution with rate (I/m + 1/2) and Lecttom has exponential distribution with rate (1/us + 1/us). Put a= 1a + Iya and b= 1/us + 1/n4- By part a), Z has survival function P(L >t) =1- (1-71 —e™™), > 0. 10 054 00 o H z 3 a 3 é 7 3 4) Here L = min (max(Es, £2), La). It > 0 then P(L > t) = P(max(L;, £2) >t, Ls > 9) = P(max(L:, La) > t) P(Ls >t) = (1-1 ely ety) emt, 10 os- 00 2a) X has the same distibution as F-'(W), fo B(x) = BFW) = [ ores = [ PY a)dy = thaded area (lnvegeate horizontal strips) But the shaded atea can also be represented, by integrating vertical strip, as . . [ov rete = f° ocx > aie by IX is a diserete random variable with values 0,1,2,... then Buy = [2 > nee 107 Section 4.5 (Or: the region above the distribution function of X consists of rectangles of width 1 and height 1- F(o) 1-FO 1- FQ ete. P(X>1), <) If X has exponential distribution with rate A, then X > 0 60 E(x) a P(X > 2) IC X has geometric (p) distribution on (1,2,3,-..} then by b) E(X) = Sew a= 4) X = X(U(X > 0) + 1(X < 0)] = XX > 0) ~ (XIX <0) = Xy ~ X- where Xp = XI(X > 0) and X- = (-X)M(X <0). So E(X) = B(X4) - B(X-). Since X4 > Bx) = [7 PU > nite = [mx > one ()area in diagram Since X- > 0, [fC 20> ue a P(X < -z)de L fl Per -10) 1 = P(X; > -10, Xa > =10, Xs > -10, X4 > -10) 1 [P(X > = 1 (9772) = 0881 since P(X, > 10) = P(Xi/5 > —2) = (2) = .9772. 8) P(X qn > 18) = 1 PX $18) 1 = P(X: $15, Xa $15, Xs $15, Xe S 15) 1 (P(X: < 15)" = 1 ~ (.9986)* = .0086 since P(X: < 15) = PU%/S <3) = 0 ¢) Let f and F denote the common density and distrib probability i function of the Xi’s. The desired 16 PC-M6 $ Xa) $ 1/6) Frey (204, where fx, is the density of the second order statis Sxq)(2) = ojo wera - FaI" = 12f(2)F(2)ft — FP. Over the interval [~1/6,1/6], this density is roughly constant, so the desired probability is approximately (length of interval) x (value of fia) at 0) = Fx (0) (9) F(O}U ~ FCO)? 1 2 aye 2) Oy 5 1 ova Remark. To compute the probability exactly, you may integrate by parts. Or you may argue as follows: Let Ny denote the number of X's which are < 1/6, and let Nz denote the number of X's which are < -1/6. Then Ni has binomial (4, F(1/6)) distribution, and No has binomial (4, F(=1/6)) distribution, Now argue that 3 0399. PUIG $ Xexy $ 1/8) = P(X) $ 1/8) ~ P(X) < -1/6) = P(N: > 2)~ PCa 2 2). 2. b) Let kD 1 integer. If X has beta distribution with integer parameters r,4, then BUX) = 0 2h aghye"H( — 2) de 109 Section 4.6 a) Hence Var(X) = E(X?) ~ (EX)? 
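The series and parallel systems of the two-component exercise can be checked by simulation as well. A sketch (NumPy assumed; the mean lifetimes 10 and 15 below are arbitrary choices, not values from the exercise):

    import numpy as np

    rng = np.random.default_rng(3)
    mu1, mu2, t = 10.0, 15.0, 12.0
    L1 = rng.exponential(mu1, size=200_000)
    L2 = rng.exponential(mu2, size=200_000)

    # parallel system: fails only when both components have failed
    print(np.mean(np.maximum(L1, L2) > t),
          1 - (1 - np.exp(-t / mu1)) * (1 - np.exp(-t / mu2)))

    # series system: min of independent exponentials is exponential with the rates added
    print(np.mean(np.minimum(L1, L2) > t),
          np.exp(-t * (1 / mu1 + 1 / mu2)))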
ranean ~ ( = EPpy aa) =a)" b) (L=2)"=(y— 2)” oy —(y~ 2)" 4) 1= (1-2) =9" 4-2)" ) Ga -0 1) (haat a + eta — 9" tape (y — =) = 4. a) PZ =1) = PZ = 0) = 1/2: B) yes, yes, equally likely, independent of the n order stat ©) the n! possible orders of the m variables are 5, Let F be the common ed. of the X's. 8) Xcayis less than or equal to 2 if and only if at least k of the X's are less than or equal to x (and the remainder are greater than z). Since each X has chance F(z) of being less than or equal to z, and the X's are independent, it follows that the number of X's which are less than or equal to z is a binomial (n, F(2)) random variable, so P(X) <2)= > (jeer = Fay. b) I 4,2 are positive integers: The rth order statistic of r +21 independent uniform (0,1) random variables has beta (r,s) distribution. But by part (a), its ed.f, must be yy (ee 7 ‘Veer ~ Fey" Finally note that the c.d.f of 2 uniform (0, 1) random variable satisfies F(z) = 2,02 <1 €) oi. B(X) = 7/10, SD(X) = v5 /10. *) A ' i 1s A 10 ! | ost as 00 00 00 05 v0 00 05 10 Deasisy Disirbuion function 3. X is the maximum of 3 independent uniform (0,1) variables: 0 ifrco miner sdariisomsamen= (2 Hicser 1 ife>t fxls)= frre { ees stale fopteiee f'eeeed 4a) PK C1) = fh sleds = f° 2petde + fh /a)ertade = 1 1/2)e74 b) EX) =0 by symmeuy: Vor(X) = E(X2) = fx2f(2)de = 2 f° 22 -(1/2)eFdz = Jo" ste ¥d: ©) ¥ =X? has range (0,00). For each y > 0 Fey) = P(X S 9) = PU-VES XS J) =2POS X “ “ 22 seyeena [Cormeen in Chapter 4: Review 3) PUT > 30) = 8) 10 < £20, then P(T > 1) = 2 . 1630 < 4 < 70, then PT > #) = St. c) HO $D(T) ~ 19.81. ©) Let I be the distance from one end of the road to the ambulance station, and T; be the response e. We may assume, without loss of generality, that 1 < 50. Then t<0 o 100-1 After a calculation similar to that of part d), ET) which is minimized when 1 = 50, So the expected response time is minimized when the station ig located at the midpoint of the road. G. T has exponential distribution with rate A= 1/48. os 00 os 00 a) The ed.l. of Tis, for 1 > 0 Fr(t) = PIT <8) 0 00CO}sCi‘SSCiSCaSCSRSC«a SSC b) U is distributed as min(T,48). If ¢> 48 then Fo(l) = Plmin(T48) <0 and if 0. < ¢< 48 then : Fo(t) = Plein(T, 48) <¢ ars It <0 then Fo(t) =0. U is neither discrete nor continuous. ee 20) 90) eee 00 pee 30 meter co retin 0 mee cg ects v0 ee) 112 Chapter 4: Review €) E(T) = 48 by assumption, and £(U) = Ermin(T.48) = [mince an gto = rnenars [” saprioe W8)d0 4 48P(T > 48) 5 48(1 = 2674) 4 48e~ 4) No. The policy of replacement at 48 hours is unnecessary, since, by the memoryless property of the exponential distribution, the remaining time tll failure of components which last beyond 48 hours has the same distribution as the lifetime of brand new components. Tea) La f Ne)de =f, aeP lds = 2a fede b) BUX) = f2f(2)de =0 since fis an even function; Var(X) = BX?) = 8 J. ¢) fy > 0 then P(IX| > 9] 2 Soan8. sted = 3. P(X > y) = f° Be Pds = 1- (fee D0 Oye r<0 4) P(X Sz 8 a) /fMe)dz€) 1/VIRO,1 €)2, 2/3, 4/18) 1/10, 5, 100/12 1)5, 1/5, 1/25 9. a) normal, mean a, variance $, factor = 3 ») normal, mean a, variance 6?/2, factor ) gamma (6,0) 4) bilateral exponential, parameter @ ) beta (8, 10) 9 feb aPde = L0(bbuyPhde, uw = Joo — wiPbdu = 8 f= wld 808? xis e 10. Note that (1/V#)e7* is the density function of a normal (0,1/2) random variable, say X a) fF ds = J, Seeds = Bo = 006. b) fo eds = VEP(O-< X <1) = VF [9(V/2) — 8(0)] = 746. 
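The value E(U) = E[min(T, 48)] = 48(1 - e^{-1}) found in the replacement-policy exercise above can be verified numerically. A sketch (NumPy assumed):

    import numpy as np

    rng = np.random.default_rng(4)
    # T exponential with mean 48 hours; U = min(T, 48)
    T = rng.exponential(48, size=300_000)
    U = np.minimum(T, 48)
    print(U.mean(), 48 * (1 - np.exp(-1)))   # both about 30.3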
6) fed a) fo tet ds = fee Chapter 4: Review Mea) 1/2 b) Compare with the gamma (r,4) density: ifr > 0,4 >0 cae [tens So 100* 30 “(00 — 2yfas =100° fa = wee i [ 12, PL 1) P(T > 3) = P(N, <2) ~ P(Ns <2) where M; denotes the number of hits in the time interval (0, t]. Since N, has Poisson (21) distribution, we have vp eentteogtul gy yy)" 14, a) This is equilavent to finding P(T, > 2) where T; is gamma(4.3), so PIT, > 2)= Soe aot Fe +6 + 18-426) = 0.1512 b) Because of the memoryless property of the Poisson process, Ti, Ts ~ Ti, and 73 ~ Tz are all independent exponentials, so this is just P(T; <1)? = (1)? = 0.858 c) Given that there were 10 arrivals in the first 4 minutes, the times of the actual arrivals are uniform on (0,4), so the distribution of tHe number of arrivals in the first 2 minutes is binomial (19..5). 1G. a) As soon as the fist car comes, we are waiting for two more cars to come which gives us a sgamma(2,3), call it 7. P(r <3) 3.9988 -Ye 149) 1) In a ten-minute interval, the probability of 1S cars is ¢W%3212 = 0.001027. Given that there were 15 ears, the probability of 10 Japanese cars is (!3).640.4® = 0,0007378, assuming that the chance of being Japanese is independent {rom car to car. Further assuming that the number of ‘ears which appear in a given time interval is independent of whether the cars are Japanese, the chance of both events occuring is 0.001027 x 0.0007378 = 7.876 x 10~7. 114 Chapter 4: Review 17. IFT has exponential distribution with rate 2, then for all 0 << 00 we have PIT <3) Hooda [rave Conversely, if T is such that P(T' < for all 0< t < 00, then for 0 Sa H 19. a) (202) log, 10 1b) 20g, 10 ~ log. 2 20. The assumption indicates T = T; + Ts, where T; and T ate independent exponential (A) random variables. Hence T has gamma (2, ) distribution. a 0, ») a He 120 14 Ae 12 0 4) For A= 1, ¢=2 the hazard cate is 8% = 3 per hor, Given survival to ¢ = 2 hours, the probability of failure in the next minute (1/60 hour) is approximately (hazard rate) x (length of interval) we 21. PUR $4) 2) fry) = me" 20 8) exponential (1) 31 22, a) The range of Y is [0,6/2}. 160 < y a/2 then P(Y < y) by Not continuous, since the distribution function is not a continuous function of y ¢) E(Y) = E(min(X,0/2)) = f min(z,0)2)fx(z)dx = [22 Or use E(Y) = J" P(Y > v)dy = fo? - Day 23, a) EM = (M~3)43=5, Ver(M) = Var(M —3) = elt = te, where W = M3. So X > e? and 1 22 ~Hlogtsen* Sale a £2 ) P(M > 4) = P(M -3> 1) =e74 50 the probability is (4) 24. Let ¥ denote the time elapsed in minutes between the instant the lights last turned red and the instant the car attives at the lights. By assumption, Y has uniform (0, 2) distribution. And ley wocycs reo NSE 115 Chapter 4: Review a) The range of X is [0,1]. If 0.<2 <1 then PIX $2) = PIX S206 Y <1) + P(X S21 <¥ <2) =PU-¥ $2,0<¥<1)+POSz,1<¥ <2) =PU-2S¥<1)+Pcy <2) a te=#) 2-1 105 os! b) neither. ) X = 4(¥) where gly) = yil0 Sete) a1 ata 26. 2) Yee on (0.1/2 0) anim on (1 ¢) EY = 1 Var(¥) = 26,9) FUKEY) = BUXIELE) 2x [2x ft eal = Hee) b) E(W?) = E(X?)B(e*”) = $(e — e*). SD(W.) = JF == PTR 27. a) For each n> 2 we have PU, Suand Waa) = Fi" Suand N>n—1)— PW, Suand N>n) PU Se SU Su) Pn $ Unen So Sr Now for each > 1 we have PUn SU Su) = Ps Su, U2 Su, -.-0n $4) since each of the n! orderings of Ui,...,Un is equally likely. ‘Therefore P(r Sw and N =n) = 116 Chapter 4: Review ») PU; < wand N is even) ¢) From a) with u=1, P(N >a) = 1/(a— tail sum formula of Exercise 3.4.20, 20)= aay Lan ! for n > 2, and this is true also for n = |. So by the 28. 
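The conditioning argument in the Poisson-arrivals exercise above (given 10 arrivals in the first 4 minutes, the count in the first 2 minutes is binomial(10, 1/2)) can be checked by simulating the process directly. A sketch (NumPy assumed; rate 3 per minute as in the exercise):

    import numpy as np

    rng = np.random.default_rng(5)
    rate, trials = 3.0, 200_000

    gaps = rng.exponential(1 / rate, size=(trials, 40))   # 40 spacings easily cover 4 minutes
    times = np.cumsum(gaps, axis=1)
    n4 = (times <= 4).sum(axis=1)
    n2 = (times <= 2).sum(axis=1)

    cond = n2[n4 == 10]
    print(cond.mean())              # about 5, the binomial(10, 1/2) mean
    print(np.mean(cond == 5))       # about C(10,5)/2^10 = 0.246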
Let © € (0, x] be the angle subtended by the (shorter) arc between the random point and the arbitrary fixed point. Then by assumption, © has uniform (0,x) distribution. Observe that sin(@/2) = X (or use the Law of Cosines); therefore © = 2 arcsin(X). a) If0 <2 <1 then P(X 0 has been set, and you choose b> 0. Let G be your net gain. Then aftr tx>0 o={'ts HS £(G) = (1 = 68) P(X > 0) + (—1 = eb) P(X <0) (X > 0)— P(X <0)~cb = 2P(X>0)~1-cb = 28(b) eb since P(X > 0) = P (4¢4 > $54) = 46). ‘So the option is advantageous when and only when there exists 5 > 0 such that E(G) > 0. i. and only when there exists b> 0 such that when L+eb 20(b)—1- cb 30 => (4) > Such a b exists when and only when § < 6'(0) [One way to see this: Plot (6) and 142 as functions {0.8} Since #'(0) = 4(0) = 1/V/r, conclude that the option is advantageous to you when € < 8), For such othe expected net gain is maximized (why not minimized?) at bsatsying, 2 BG) = men fate) <0 = owas ve ht side of the last equation is less than 1 (why?), so the equation has a solution ) 30. a) Let X be the diameter of a genetic ball bearing. Then X has normal distribution with mean = 280 and standard deviation ¢ = 001, so the chance that the bearing meets specification Gis PUK ~ nl <2) = 0.6828 We want E(Tie), where Tye is the number of trials until the 16th success in Bernoulli(p = .6828) ls; 20 E(Te) = 16/0.6828 = 23.4, 117 Chapter 4: Review b) Let ¥ be the diameter of a ball bearing that meets specification (i). Then ¥ has expectation E(Y) = E(X|X ~ ul <0) = EloZ + ull <1) =H has standard normal distribution) and second moment B (XX — wl SD(Y) = o EMA T). B(2"1(121 <»)) Pel <=) _ Lele de “PUR <2) ~226(2) + f, dle)de Pal <=) 229(2) Pz < 3)" E(Z*|21 < 2) = hence SD(Y) = oy pag ‘The required probability is 397 x 10-*. P(3.995 < Ya toot Via < 4.005) whee Yio Ya ae independent copies of ¥. By the normal apporinaton, this oP (ia) < SEI) = puzi caste) ~ 98 31. No Solution 118 ©1993 Springer-Verlag New York, Inc. Section 5.1 A. The area of the set of interest is 6. a) P(X <1) = indicated area b) POY < x?) % 2. a) Poin 091 acer 09 = ©) Plime measurements within 00 inchs) = 1 = 2220290 — goare 2 Bot oi oo) us b) p(B —u<2) =e(txeve Section 5.1 shaded region = (¥ 2X, ¥ 2.25} PY > iY > 28) 3 = indicated area/$ 5. a) The percentile rauk X of a student picked at random has uniform distribution on (0,1), so P(X > 0.9) =0.1. b) The percentile ranks X, Y of students picked independently at random are independent uniform (0,1) random variables, 0 PUX = ¥1 > 0.1) = (9/10)? Gps G. a) P(Jack arrives at least two minutes before Jill) = aE 0.16 b) Let F = {first person arrives before 12:05) and £ = {last person arrives after 12:10) Then, P(FL) = 1 ~ P(FL)‘) = 1 = POPUL) 1 = PCF) ~ PUL) + PCF nL) 1-2 ey 4S 0.965 T. a) P(M > 2) = shaded atea = (12). 120 Section 5.1 b) oc <1 then PUM <2) ite > 1 then P(M <2)= 1; itz €0 then P(M <2) P(M > 2) ~G-e So the density of Mis given by tutey= { wi-a) a # and Ua) <9) = Ple < Us, Ua,.--Un <9) = Plz < Uy # 22d Un) <9) "—(y— 2)" 9, Standardize the length of the stick to be 1 (the solution clearly will not depend on the length of the stick), Look at one end of the stick, and let X and Y denote the distances from that end of the stick to the break points, Then X and ¥ ate independent uniform (0,1) random variables. Let L denote the minimum of X and Y, and R the maximum. (L to suggest left, R to suggest right.) Then the lengths of the broken pieces can be expessed as L, R—L,1—R. 
To form a triangle, the maximum of these three should be less than the sum of the rest. That is, triangle <= max(L,R—L.1— R) <1 —max{L,R— L.1—R) ee max{L, R= L1- 8) < 1/2 =p Lc iftand R-L< fd and /2 E> 0) (apeyn PE (E sia 5( (|) a] 3 axne “ASIN A= A uaa song (9 eh th OS soa TnomPer “AVE ZT [-» vecanen J- (ein 5 ad | super sey 22a ayy “AoqesouaH Jo sso} IMO“ wummupen “unuspura (2 cy @ sole (2-00 f= ze | e901 J = php (2 ~ fos 7 J =(xe< Ad © we woTDag Section 5.2 10, Foro w) =1- PW wde= ['1-vtten$ PV > 0) =(1-0)* EW) -{ PU Dw) u-oter=! 21a) = 50) - BV) =2 b Fe ocvewel PU € dv, W € dw) = P(no U; in (0,0), 1 in de, 3 in (v,w), 1 im dw) OP (Us € dv, v < U2.Us,0s < w.Us € dw) ° P(R> 0) = Lf[~ 20(w — vy dedw -0 f [a =! wt = (0.5)'dw wos) { Section 5.2 15. a) F(b.d)~ Fla,d)~ Fle) + Flee) b) Flew) = So, foe Mle, vue. 9 Mew) = SSF ew) a) Flay) = Fx(e)Fr(y)- ©) F(z,y) = Plmin <= and max < y) = P(max = and max Wat RIC ta Fs) AT. a) Note that -R<&X ott +8 Be (Proof: Rearrange (a 5)? > 0 to get 2(a +62) > (a +6)?, then take 5 are roots. Ox: Let W bbe a random variable taking values a, with probability 1/2. Then #3 (24571) Apply the above inequality to X and ¥, take expectations to get BW?) > (EW) = 5m) = EVE > B (AGE) = ASN 9 Bea VE In fact, the inequality is strict, since P(VEF FY? = 24%) = P(X = ¥) = 0, hence R= /XFEY? > 24% with probability 1, and so B(R) > BOSE). For the second inequality, note that > 0 implies (BR) < BUR?) = BUX? + ¥?) = 2B?) = } 4 5¢a)= [Sears [20 (5 wee!) or ab feet [rea [Ae F arceos ar HEL eaena [ce ante ace) To evaluate the second integral, use integration by parts: fond wsarn [9.dae Hence E(V3)" — E [VI4 logv2+1)] } 3 [V8+ log( v2 + 1)] = 0.7651987 xt (v2-+1)] You can check that /F < 265 < /F. 2) B(Deorn a. 165, E( Deemer) = ElDeorncr)/2 = 3806. 8) Let the two points be (Xi, ¥i) and (a. ¥4). &(D*) E(X: ~ Xe) +(¥i - %2)?] B{(s)"] — 4B X2) (by linearity and symmetry) [e460 6.%) (repent “i-avbee €) Since Var(D) = £(D?) —(E(D)P > 0, we have B(D) < YE(D¥) = 1/V5 = 0.577 128 Section 5.2 4) Ifo? = Var(D) were known, then a 95% confidence interval for £(D), based on n independent observations of D, would be . ve where D = average of m observations. Here n = 10,000, D = is (£2) aa ORF = 0.2088, So an approximate 95% confidence interval for £(D) is D£(1.96) x 5197, and we estimate « by (2.96) x (0.2468) 0197 & AD, = 0.5197 + 0.0048, 129 Section 5.3 Section 5.3 11. Using the same notation as in Example 1, ae HP oe sts. a) P(R < 1/2) = Fa(1/2) = b) $PC.< 8 <2) =} (Fa(2) ~ Fatay}= § [Qe ”)- G4) vn ©) BUY) = fll ge Pdy = 2 [O° vege Pay = 4) PUX| <1) = 2P(0.< X < VBRRD) = 2 x (881 — 5 ¢) By independence of X and ¥, P(X ay -8)= Pex av 45>0)=1-9( soma 3 a)1- 805) W122 95 av 4. a) By symmetry, this must be 0.5. b) PIV =.2< X <¥) = P(-.2< X~Y <0) and X—Y is normal with mean 0 and variance .4, )-e( 5. Say ¥ has standard deviation g. Then X has normal (0,1) distribution, ¥ has normal (1,*) distr bution, and X — ¥ has normal (—1,1 407) distribution; hence 2 a Peacx-vco=0( P(X SY) = P(X-¥ <0) 43 and o = (41/437 =1)'? 24 Ga) 3X +2Y has the normal(0, 13) distribution, rosa sn 07 (G2 >) vis Vi3, = (1.29) = 0.0823 b) By using inclusion-exclusion, P(min( XY) < 1) = (XK 1) 4 P(-1.< ¥ < Land X > 1) = P(-1< X cand ¥ > -1) 4 P(-1<.X <1and ¥<-1) =P(-1< X <1) = (1) - &(-1) 6826 4) By using the rotational symmetry of the standard bivatiate normal distribution, P(min(X,¥) > max(X,¥) = 1 = P(mas(X,¥) ~ min(X,Y) <1) = P(X -¥1-<1) 1 1 we(-Lexet ( va <*< x) = 6(071) - @(-0.71) = osz22 1. 
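The 1/4 answer for the broken-stick problem worked above is easy to confirm by simulation. A sketch (NumPy assumed):

    import numpy as np

    rng = np.random.default_rng(6)
    x, y = rng.random(200_000), rng.random(200_000)
    left, right = np.minimum(x, y), np.maximum(x, y)
    pieces = np.stack([left, right - left, 1 - right], axis=1)

    # a triangle can be formed iff the longest piece is shorter than 1/2
    print(np.mean(pieces.max(axis=1) < 0.5))   # about 0.25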
Let X be the (actual) arial time ofthe bus, measured in minutes relative to 8 AM (so X = 1 means ‘at the bas arrived at 801 AM), Let ¥ be ny arival me, also measured in minutes eclative to 8 AM. Thea X has normal distribution with mean 10 and standard deviation 2/3, and Y has normal distribution with mean 9 and standard deviation 1/2. a) PY <10) =P (at < ae) (2) ) Assume X and Y are independent. Then ¥ — X has normal distribution with mean —1 and variance (2/3)? + (1/2)? = 25/36, and 972 rey cxy= ry -x<0)ep (Ett zum) 8/57) = 90.2) = 08800. €) PIX <91X € (9,12) = ae SEs = HOS Atksey = 9795 since P(X <9) = P (4738 < 9532) = 0-3) = 0668 and P(X > 12) = P (Aft > Uglt) = 1 - 8(9) = 014 8, Let X be Peter's arrival time and Y be Paul’s arrival time, both measured in minutes after noon. a) Since X —¥ has the normal(~2, 34) distribution, X-¥42_ 2 P(X < ¥) = PATE c Fe (0.34) = 0.8331 b) By independence, P(-3< X <3and —3<¥ <3) = P(-3< X <3)P(-3<¥ <3) -&@)-*(@k@-*G) 2626 PUX -Y¥i<3)= csex-¥ cayno (248) -0(=g2) =o) -0(-017) saan 2) Pla i) > 16 51 [P(X < 76)" 1 — {@(3)]'°° = 1 — (.9986)'% = .1307. 131 P(max(X1,.--,Xr00) $76) Section 5.3 10. n. 12, 0) White ¥ = 2433640, We want P(Y > 108). Now £{Y) = E(X2) = 70, SD(Y) = 224 3) = 02, and Y's normal distribution since itis a Uneat combination of independent normal random variables. Hence the desired probability is y-10 0 a> aE) P(Y > 70.8) = (2.5) = 0.9938 = .0062. €) The answer to (b) will be approximately the same as before, since ¥ still has expected value 70 and SD 0.2; we assume the normal approximation is good. On the other hand, the answer to (a) will be exactly 1 ~ [P(X1 < 76)}!, but now there is no guarantee that P(X; < 76) is still 9986; hence raising the actual probability to the 100th power could make the answer very different from before a) Let X; and Xa be the two incomes in dollars. p (Bee > 1,000) (gee 60,000 > sta caon 10,000/V3 ” ~ 10,000/V2 ‘8(0.1) = 0.2389 bb) Let X and ¥ be the younger and older persons income respectively. X ~Y has the normal(~$20, 000, $14,142) distribution. rea v= P( = e041 X=¥-420,000 . _20,000 Vit V4, 143 0.0793 ©) By independence, P(mia(X, ¥) > 80,000) = P(X > 50,000) P(Y > 50,000) 1-40) = 4-1) 1395 2) Say that X_ has normal distribution with mean m(i) and variance o(0)- Clearly Xo = 0, 20 sm(0) = 0(0) = 0. Next argue that m(s) + m(t) = m(s-+¢) and o(s) + o(@) = o(s+ ) for all 4,1 > 0; conclude that m(t) = 0 for all ¢ > 0 and that v(t) = to? for all t > 0. Therefore X, has normal distribution with mean 0 and variance to, 'b) X« and ¥; are independent normal (0,02) random variables, co R= 2)= P(R>2) = Var(D) = E(D*) - [E(D)P = 4. 13. We may assume # = 1 by considering X* = X/o and Y* = ¥/a. 4. 2) Refine the argument used to derive the density of : ‘The event (2 € dr, © € dé) corresponds to (X,Y) falling in an infinitesimal region having area #2 x 2xrdr = rdrd®. (Argue that (X,Y) rast fall in an annulus around the origin of infinitesimal width dr and radius r, but the angle subtended by (X,Y) is restricted to be between @ and 6 +d9,) The joint density of (X,Y) has neatly constant value e"#"” over this region, 0 Pine dro a) =e}? wed = Le ¥a, Condes (0) bas int deity eA > 0,0<0< 2K. Snol.9) = 3 Integrate out to get the marginal densities of and O, sce that R has Rayleigh distribution, © has uniform(0, 2x) distribution, and R and © are independent ( fr,0(r, 9) = fa(r)fo(6) J. b) Let_X and ¥ be independent normal (0,0*) random variables. Define R and © by R = VRTTY? and © = arctan(¥/X). 
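The bus calculation in part (b) above reduces to a statement about the difference of two independent normals, which simulation confirms. A sketch (NumPy and SciPy assumed; bus time X normal with mean 10 and SD 2/3, your arrival Y normal with mean 9 and SD 1/2, as in the exercise):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    X = rng.normal(10, 2 / 3, size=200_000)
    Y = rng.normal(9, 1 / 2, size=200_000)

    # Y - X is normal with mean -1 and SD sqrt((2/3)^2 + (1/2)^2) = 5/6
    print(np.mean(Y < X))                    # simulated P(you arrive before the bus)
    print(stats.norm.cdf(1 / (5 / 6)))       # Phi(1.2), about 0.885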
By part (a), we know R and © ate independent, and R/o thas Rayleigh distribution, © has uniform(0, 2x) distribution, Since X and Y are independent normal (0,07) variables, and since the joint distribution of X and Y is determined by the joint distribution of R and © (namely, via the transformation X = Reos©, ¥ = Rsin@), the same conclusion (that X and Y are indep normal (0, 0%) variables) holds whenever R and © have this Joint distribution. ©) Using the results of (b), it suffices to find h,k such that A(U) has Rayleigh distribution and E(V) has uniform(0, 2x) distribution. Clearly we may take K(v) = 2xv. As for h, note that the distribation function of a random variable having Rayleigh distribution is, Fo) a The inverse of F is F-'(u) = //—2log(T—w) - By the result given in Section 4.5 on inverse distribution functions and simulation of random variables, F~"(U) is a random variable having distribution function F. In other words, F~*(U) has Rayleigh distribution. So it suffices to take 7 that is, A(u) = /—2log(t forr > 0. *). a) Since the standard bivariate normal density is rotationally symmetric, @ is distributed uniform(0, 2). ‘This implies that 20 mod 2x is also distributed uniform(0, 2). b) Since cos(a mod 2x) = cos(a) and sin(a, mod 2x) = sin(a), Exercise 13 part b) implies that, Roos 20 and Rein 26 are independent standard normals. ©) Since R= VXTFY?, 2XY_ _ (Recs )(Rsin®) _ p Soe, - Ein) = Rsin20 VRTHVE R x cos? - Risin? Roog ViTePE ‘They ate independent standard normals a) Let ¥ = 2, Clearly f(y) = 0 for y <0. For y > 0, use Example § of Section 4.4 Joly) = BLDG = 2A py Nga = S(up2}layberol® hati, ¥ has gamma(1/2, 1/2) distribution. The fact that [(1/2) = VF follows from f° fr(y}y = 1 133 Section 5.3 b) Write n = 2k —1, and use induction on & = 1,2, ... , then Is true for & = 1 by part (a). If true for 2k _2k=1 Va(2k 2)! 33 therefore true for K+ 1. ¢) (X/o)? has gamma (1/2, 1/2) distribution by (a). That is, ¥ = X?/o? has density Sey) = Fee? y > 0. So by a linear change of variable, X? = o7Y has density a yiPag—we?, Jxa(w) = defr( BS) = Fels) hat is, X? has gamma (1/2, zl) disteibution, 4) 23....,22 are independent gamma (1/2,1/2) variables, by (a). Since 2?-+-+-+Z2 has gamma, (n/2.172) distribution. ¢) Foreach j, ¥; has gamma (K;/2,1/2) distribution (by definition of x*). ‘The ¥j's are independent, Yi tos You has gamma (ky +--+ bn)/2,1/2) distribution, i.., has x? distribution with ky ++ ky degrees of freedom. 1G. Since Rijy has the gamma(m,1/2) distribution, it has the same distribution as the time of the smith arrival of a Poisson process with rate 1/2. Let Tm be the time of the mth arrival of a Poisson process with rate 1/2. Por z > 0, Ri has edt F(z) = P(Rm <2) = P(Tm <2) 1 = P(Tm > 2) = 1 ~ Plat most m —1 arrivals in (0,)) bb) Since P(Rim $2) = P(Rim 2"), Ram has edt. P(Rim $2)=1- ) Form PUR < x)= 08 roe P(Ra <2) 7 T z 7 7 3 PUR, S7) || 0.0907 | 0.5040 | 0.9389 | 0.9970 | 0.9999 17, a) Skew-normal approximations: 0.1377, 0.5940, 0.9196, 0.9998, 1.0000 ‘Compare to the exact values: 0.0902, 0.5940, 0.9389, 0.9970, 1.0000 b) 0.441, 0.499. Skew-normal is better. 134 Section 5.3 18. The X's are independent and identically disteibuted uniform [-1/2, 1/2] random variables, so £(X:) Oand Ver(X:) = 1/12. Hence £(X) =0 and Var(X) = 1/12n . Since X is the average of id random variables having finite variance, it follows that for large m, X has approximately normal dist with mean 0 and variance 1/12n, and VT2nX has approxi argument applies to Y and Z, too. 
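The construction in Exercise 13(c) above is the Box-Muller method for generating normal variables from uniforms, and it is straightforward to test. A sketch (NumPy assumed):

    import numpy as np

    rng = np.random.default_rng(8)
    U, V = rng.random(100_000), rng.random(100_000)

    # R = sqrt(-2 log(1 - U)) is Rayleigh, Theta = 2*pi*V is uniform on (0, 2*pi),
    # so R*cos(Theta) and R*sin(Theta) should be independent standard normals.
    R = np.sqrt(-2 * np.log(1 - U))
    Theta = 2 * np.pi * V
    X, Y = R * np.cos(Theta), R * np.sin(Theta)

    print(X.mean(), X.std())             # about 0 and 1
    print(np.mean(X <= 1))               # about 0.8413
    print(np.corrcoef(X, Y)[0, 1])       # about 0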
Since X,Y, and Z a ration ately standard normal distribution. This, jependent, we then have that (Jima x}? + (ViRWY)? + (VTRZ)? = 120(X? + 7 + 2?) = 12m has approximately chi-squated distribution with 3 degrees of freedom. Hence 098 = Pita! <782) = P (es /®) . He So Ris 98% sure to be smaller than 135 Section 4 Section 5.4 1.) Brom the pete: P(X 4X <2) = pada aen = 2 b) See the pictures below: If0.<2 <1, P(X: + X2 € de) = CAMettP atalahe ignoring (dz)? as negligible in comparison to az); Wisesz, P(X: + Xa € ds) = $ (ates of « parallelogram); W2sess, P(X; + Xa € de) = CA=eP Gp4o—ende? =. O=NMs (again ignoring (dz)? ). ‘Therefore 2p o3 | a) fitder fA 0-oee= 178 Or. P(S; £18) = 1- P(S: $05) b) 1/2, by symmetry of the density of Ss. e) fi? edt 2" (1-22-08 — He ay) at 4) Approximately fs,(1) x 0.001 = (1/2) x 0.00 3. Let X be the waiting time in the frst queue, Y the waiting time in the second. Then X and Y are independent exponential random variables with rates @ and f respectively. a) From the convolution formula, Sxav(e) =f, fxte)fvls ~ 2)de = ff ae"** pede, 2 > 0. the integral becomes acm f'dz = a? 2e-** (density of gamma (2,0). Ia B, the integral becomes fie +), Sketch of the density in BR dem a8 feds m Bho —e oot oO b) B+) ©) Var(X + ¥) = Var(X) + Varl¥) = ob + fe = SD(X +Y) 137 Section 5.4 4, a) Let Ty be the time to the fist failure, Ty the time till both fail. ‘Then Tj has constant hazard rate 2A, so has exponential distribution with rate 2A. Also, T; ~T; is independent of T; with the same distribution. So T; =T; + (Tz ~T)) has gamma (2.V) distribution. b) Use the fact that 7, is the sum of two independent exponential (24) random variables. (Ts) =2 x Var(Ts) =2 x obs = ah Por each ¢> 0: PUD St) = fi(2nytte™at = [2™ zed We want {such that 1= (14 2M)e™ = 9 eae OM = 10(1 +240). ‘The equation e¥ = 10(1 +y) has one positive solution, namely y* © 3.88972, so the desired t is tae = (14 2a 5. a) The range of X +Y is the interval [1,7]. If1 <¢<7 then PUXAY € dt) = P(X = int). int) < ¥ <= int(t) + dt) = Bat So X + ¥ has uniform distribution over (1,1] and 10(X + ¥) has uniform distribution over (00, 70, b) HEH = 0.483 6, gamma (ry + r2-4++-4+ ra,A). Use the result for two variables given in Example 2, and induction on a) Write Z= XY. Let 2 > 0: ovr Uf one \ 0 pevae ‘The event (X € dz, Z € dz) is represented by the shaded region. This region is approximately a parallelogram, with left side having length #, and distance between the vertical sides ‘=, s0 the area of the parallelogram is approximately 2jdzdz. The joint density of X and Y ow this small parallelogram has nearly constant value fx,y(z, £) - Conclude: P(X € dz, XY € dz) Sxav(e, ‘This argument works for + <0 too. Integrate out £ to get teve= [” . 138 jPxvtas Ede. Section 5.4 b) Write Z =X -¥. Let z be arbitrary. From diagram, we get P(X € d2,X —~¥ € dz) = fay(z.2—2)dede, whence frovted= [ fevlece~ nie 6) Let Z =X 42Y. For arbitrary = we have: From diagram, obtain P(X € dz, X 4 2Y € de) = fxr(c, seve = | and 139 Section 5.4 5 L f. Sole) frlg)duds = [own (02) a Iz <0 then Fxyy(z)=0 9 Sx(z) 10. log) (O<2<1) 2 o1 ° otherwise LA, uniform (0,1) 12, X = —log(U(1 ~ V)} = —log U ~ log(1 ~ V). Since 1 — V is also uniform on (0, 1], we just need to find the distribution of —logU. For = > 0, P(=logU <2) = PU Be .e X is the sum of two independent mn. Its density is f(z) = ze"* (z > 0). 20 ~logU is an exponential random variable with rate 1. ‘exponentials with rate 1, it has the gamma(2,1) distribu E(X) =? and Var(X) 13. 
f2(2) eH 14, Ifthe joint distr of (X,Y) is symmetric under rotations, then the probability that (X,Y) falls in a sector centered at (0,0) subtending an angle 8 is 0/2. So let c € R. We have p(B ce) omy Sek 90) 202k X 0 Case <0 From diagram, the corresponding region in the plane is a pair of sectors centered at (0,0), each sector subtending an angle of ¥ +arctanc, where arctanc is that unique angle in (~$, $] having tangent c. Hence P (64 c) = 2P ( (X.Y) falls in a sector subtending an angle of § + arctan ¢) Ap Harctane) 1 1 = Meta) 14 Lactane So ¥/X is a continuous random variable having density 15. 16. a. 18. Section 5.4 a) Let AY, Then X1 and ¥; are independent exponential (1) variables, and 2 = min( Xa, ¥)/ maxX, ¥i)- by PZ <2) =28/(1+2) /( + 2) for 0< Ec. a) Fier and t FUd,0) = PL St) = POT T’, we have Feat PIT St) = PIT + T" St) < PIT St) = Fr.) a) Let 0.¢ 1€ 1. The plane that cuts a (1/3,1/8,4/3) has equation x+y 4 £ = t, The plane that cuts at ((C44)/3, (£4 A)/3,(€-4+ K)/9), A small postive has equation z-+y-+ == 14h, These two planes determine a “slab” of uniform thickness h/V3. The desired cross-sectional area is volume of slab, volume((rip.2):tSetytz Stth) Bm Thickness of sab ~ 2% ava = tim PUSX#V4ZSt4h) fy ahi = Vifxsvez(0 where X,Y.Z are independent uniform (0,1] random variables. Hence the area of the cross- section at (t/3,1/3,t/3) is $e octca Val-P 40-3) ete? (3-97 2? then Inlt) = S540 (2) =f fo-aloypngte = wide = [tated = Facile) ~ Fraale = &) Claim: For each m > 1; For all (1) On the terval (= 1.3), FE) 8 & polynomial having leading term SEIS! (S=H)a°-¥; (1; On the interval @~ 1,2), Faz) i polynomial having leading term SU" (*=!o 141 Section 5.4 Proof: Induction. True for n= 1. If holds for n (D): For each i Searle) where & > 1, then ok +1: On the interval (i — 1,4) we have Fz) - az -1) B= (Fj )te =m tems of degee at mont & -1 2 pe 4+ terms of degsee at most = 1 ‘ Me te of degree at most k — (2) semesters wa - ee (ait (eei-2) = & polynomial having leading term = Note that this argument works for the “boundary” cases i= 1 and i= k-+1. (U1): Por each i= 1,...,k + 1: On the interval (i ~ 1, i) we have Fenle) naoars [ folie (<2) mest ass Byte most {pte ene of dg me = polynomial having leading term [0 So claim holds for n =k +1, ¢) Claim: For each n= 1,2,... we have: If 0. <2 <1 then & (i) Pale) = 5. Prooft Holds for n = 1. If holds for n = k, where k > 1, then for z € (0,1) (fale) ey : and . Fanle) aa so claim holds for n = £41 4) Claims For each wehave: In = I< 1, then for x € (E,4-+ 1) fools) = Re) = Renta 1 hleni=t— (1 = GEUE) - wt at (k41-2)*t! Fags(z) = P(Su4i $ 2) za c Sroi(de = 20 dlaitn holds for n= E+ 1 Section 5.4 €) POS Se <1) = Full) =F by (e). Puss PUS: £2) - P(Se £1) 2) POS <5. <2)= f°, falt)at. Nowit1 0. Bxla) = Gye tM 0, and fry) = Fy” So the joint density of X/(X + ¥) and X+¥ is Ixpexavyxay(e.2) = [elf (we (1 > w)e) = #fx(we)fr((1- w)=) BR are ere ee = Fp tesy Fal — wel i. ED tg yt A reigns = rare On Ry if 0 0; the density is 2er0 otherwise. Integrate out to see that peartye tee eo (we knew this already); and that Sanayi) -w)"0 cue: and that fricxeyy.xsy(w2) = fxyxs yyw): fxer(e). Thus X/(X + ¥) has beta distribution with parameters 7,3, and is independent of X + ¥. 143 Chapter 5: Review Chapter 5: Review 1. Draw a picture of the unit equare; then compute 3. POY > 1/2,¥ > X?) PY > X?) PUY 2 aly 2x3) = since numerator = f',, vidy = 3(1- 2) a) 3/4. 
b) Density of Z= 1X4] is fol) = 2/2, 0<¢2< 2.80 axarnicea) [sees ['e-Zyaee a) Assume center of coin lands uniformly on dise of radius 2.5" ° 1) GF Pllands in ase of adas 3") = Zar » Center lands uniformly in square of side 4.5, #028) _ o.osszs... sy oy 1 = et sguate with ie US $V) _ 6 apgg 2) P(X? + ¥? <+°) is one fourth the area ofa circle of radius inside a square with the same center and side length 2. For r > V2, this is 1. For r <1, this is "2. For 1 1/2) = shaded area b) Let the coordinates of the point be (X,Y), where without loss of generality X has uniform distribution on (0,1), and ¥ has uniform distribution on (1/2, 1/2). Take the midpoint of the tiven side to be (0,0). Then D? = X? + ¥7 by Pythagoras’ theorem, and 1 1 a0) = 2X) + 60%) = fae fay = 1. lo an 6, The charge C per call in dollars is e-f} uT< = (ito x6r-i)=4e6r TDI where T is the call duration in minutes. So ners [snot [7 esos +6 [0- vow where J is the density of T. a) Here f() =e, 50 J, (C= YF (ae = by Here f(t) = (1/2)e", so J, ©) Here f(0) = ate, 20 f +, So E(C) =1+.6e~ ys (oae (€= Ws (iydt = 26°, So E(C) = 14 1.267 2. 20-8, So E(C) = 1+ 122M? = 1.73, 1. X has normal dsteibution with mean wand SD 0.1, so PUR = ul 228) = 2(1 ~ 0(.25/.1)) = 2(1 ~ 0(2.5)) = 211 — 9938) = 0124. 8. By the central limit theorem, the distribution of J is opprozimately normal, with the same mean and variance. So the required probability is now approximately 0.0124 . [nw is 100, which is pretty large. So the normal approximation should be good] + yA oz) ntre{ 2 <0, PUZ > 2) = PIX > BY > 2) = PIX > PY > a) =e nce PZ $2) OH and faz) = + werOM 2 >0 Manette 7 May = xia 145
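The normal-approximation step in the measurement exercises above (the average of 100 measurements with SD(Xbar) = 0.1 deviates from its mean by 0.25 or more with probability about 2(1 - Phi(2.5)) = 0.0124) holds approximately even for non-normal measurements, which is the point of the central limit theorem argument. A sketch (NumPy and SciPy assumed; the uniform measurement distribution below is chosen only for illustration and is matched to the exercise only in its SD):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    n = 100
    half_width = np.sqrt(3.0)    # uniform(-h, h) has SD h/sqrt(3) = 1, so SD(Xbar) = 0.1
    xbar = rng.uniform(-half_width, half_width, size=(100_000, n)).mean(axis=1)

    print(np.mean(np.abs(xbar) >= 0.25))         # simulated
    print(2 * (1 - stats.norm.cdf(2.5)))         # normal approximation, 0.0124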
