
18.440 PS2

Author: Eric Emer
March 1, 2013

Chapter 1: Problem 5

Collaborators: No collaborators. We can choose 5 women from the group of 10 women in $\binom{10}{5} = 252$ ways. We choose 5 men from the group of 12 in $\binom{12}{5} = 792$ ways. Once we have selected the two groups of five people, we must pair them. There are $5!$ pairings. Therefore we are left with:

$$\binom{10}{5} \binom{12}{5} \, 5! = 252 \cdot 792 \cdot 120 = 23{,}950{,}080 \text{ possible results}$$
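As a quick sanity check (my addition, not part of the original solution), the count can be verified with the standard library's math.comb:

```python
from math import comb, factorial

# choose 5 of the 10 women, 5 of the 12 men, then pair off the two groups of five
print(comb(10, 5) * comb(12, 5) * factorial(5))  # 23950080
```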

Chapter 2, Theoretical Exercise 14

Collaborators: No collaborators. Consulted the book Introduction to Probability with R by Kenneth Baclawski.

Proposition 2.1.

$$P\!\left(\bigcup_{i=1}^{n} E_i\right) = \sum_{r=1}^{n} (-1)^{r+1} \sum_{i_1 < \cdots < i_r} P(E_{i_1} \cdots E_{i_r})$$
We seek to prove the above proposition. The proposition shows a method for computing the probability of an element from the sets $E_1$ to $E_n$ being selected without overcounting the intersections of these sets. We see that the right-hand side of the equation adds the probability of the intersection $E_{i_1} \cdots E_{i_r}$ for odd values of $r$, and subtracts the probability of the intersection $E_{i_1} \cdots E_{i_r}$ for even values of $r$. This is iterated from $r = 1$ to $r = n$. This way, an intersection of every possible size is involved in the computation. While we might understand the methodology, it is difficult to put into English in a clear manner other than to state that it counts each part of the union $E_1 \cup \cdots \cup E_n$ once and only once.

Proof. We will perform this proof by converting the proposition from a proposition about probabilities to an equivalent expression in terms of the expectations of a variety of indicator variables. Firstly, we can expand both the left-hand side and right-hand side expressions:

$$P(E_1 \cup E_2 \cup \cdots \cup E_n) = \sum_i P(E_i) - \sum_{i<j} P(E_i E_j) + \sum_{i<j<k} P(E_i E_j E_k) - \cdots + (-1)^{n+1} P(E_1 E_2 \cdots E_n)$$
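For concreteness, the smallest case of this expansion ($n = 2$, added here as an illustration; it is not part of the original exercise) is the familiar identity

$$P(E_1 \cup E_2) = P(E_1) + P(E_2) - P(E_1 E_2),$$

where subtracting $P(E_1 E_2)$ corrects for the overlap being counted twice.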

This form is easier to work with, and also easier to understand. We notice that it is the expansion of the proposition: the equation shows that we add or subtract the probability of each intersection of $r$ of the sets $E_1, \ldots, E_n$, as $r$ is iterated over the domain $[1, n]$. We add for odd values of $r$, and we subtract for even values of $r$. By inspection we confirm that this is equivalent to the initial proposition. In order to prove that this proposition holds, we must first define a random variable, as demonstrated in Lecture 8. Suppose for an element $x$ we have the binary indicator $I_E$, telling us whether or not $x$ is in $E$:

$$I_E(x) = \begin{cases} 1 & \text{if } x \in E \\ 0 & \text{if } x \notin E \end{cases}$$

We can also easily compute the probability distribution for our indicator variable:

$$p_n = \begin{cases} P(E^c) & \text{if } n = 0 \\ P(E) & \text{if } n = 1 \\ 0 & \text{otherwise} \end{cases}$$

Unsurprisingly, we see that the probabilities can be reduced to taking expectations. For instance, the expectation of our random variable $I_E$ is:

$$E(I_E) = 0 \cdot P(E^c) + 1 \cdot P(E) = P(E)$$

Now that we have shown the most fundamental indicator variable in this formula, we will expand our list of defined indicator variables. We also wish to define a random variable $I_{E_1 \cup E_2 \cup \cdots \cup E_n}$. We do this because we know inherently that:

$$E(I_{E_1 \cup E_2 \cup \cdots \cup E_n}) = P\!\left(\bigcup_{i=1}^{n} E_i\right)$$

We expect to be able to manipulate the expansion of $I_{E_1 \cup E_2 \cup \cdots \cup E_n}$ in such a way that we can prove the original proposition. To do this, we apply De Morgan's laws, as discussed in Lecture 3, to $E_1 \cup E_2 \cup \cdots \cup E_n$:

$$I_{E_1 \cup E_2 \cup \cdots \cup E_n} = 1 - I_{E_1^c E_2^c \cdots E_n^c}$$

Since $I_{A^c} = 1 - I_A$ for any event $A$, and the indicator of an intersection is the product of the indicators, this becomes:

$$I_{E_1 \cup E_2 \cup \cdots \cup E_n} = 1 - (1 - I_{E_1})(1 - I_{E_2}) \cdots (1 - I_{E_n})$$

giving us a random variable for every set in $E_1, \ldots, E_n$. Each one is a binary indicator telling us whether or not $x$ is in that set, and they all have the same form as the one explicitly shown above for $I_E$. Expanding the above equation by multiplying out the product, we get:

$$1 - (1 - I_{E_1})(1 - I_{E_2}) \cdots (1 - I_{E_n}) = 1 - \Big(1 - \sum_i I_{E_i} + \sum_{i<j} I_{E_i} I_{E_j} - \cdots + (-1)^n I_{E_1} I_{E_2} \cdots I_{E_n}\Big) \tag{1}$$

$$= \sum_i I_{E_i} - \sum_{i<j} I_{E_i} I_{E_j} + \cdots + (-1)^{n+1} I_{E_1} I_{E_2} \cdots I_{E_n}$$

where the final product $I_{E_1} I_{E_2} \cdots I_{E_n}$ is exactly $I_{E_1 E_2 \cdots E_n}$, the indicator of the intersection.

Taking the expectation of both sides of (1), and using linearity of expectation together with the fact that $E(I_A) = P(A)$, we get:

$$P(E_1 \cup E_2 \cup \cdots \cup E_n) = \sum_i P(E_i) - \sum_{i<j} P(E_i E_j) + \sum_{i<j<k} P(E_i E_j E_k) - \cdots + (-1)^{n+1} P(E_1 E_2 \cdots E_n)$$

And finally, we have reverted back to the familiar expanded form of the proposition. This can be condensed once more to the form:

$$P\!\left(\bigcup_{i=1}^{n} E_i\right) = \sum_{r=1}^{n} (-1)^{r+1} \sum_{i_1 < \cdots < i_r} P(E_{i_1} \cdots E_{i_r})$$

thus proving the initial proposition.
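As an illustrative numeric check (my addition, not part of the original proof), the identity can be verified by brute force on small random events over a finite uniform sample space:

```python
import random
from itertools import combinations

random.seed(0)
omega = range(20)  # uniform sample space of 20 outcomes
events = [set(random.sample(omega, random.randint(1, 15))) for _ in range(4)]

# Left-hand side: P(union of the events)
lhs = len(set.union(*events)) / len(omega)

# Right-hand side: inclusion-exclusion sum over all intersection sizes r
rhs = 0.0
for r in range(1, len(events) + 1):
    for subset in combinations(events, r):
        rhs += (-1) ** (r + 1) * len(set.intersection(*subset)) / len(omega)

print(lhs, rhs)  # the two values agree
```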

Chapter 3, Problem 32

Collaborators: No collaborators.

3.1

Let us define events $N_1, N_2, N_3, N_4$, where $N_j$ denotes the event that the family has $j$ children. We let the event $E$ represent choosing the eldest child from the family. We seek to calculate:

$$P(N_1 \mid E) = \frac{P(E \mid N_1)\,P(N_1)}{P(E)}$$

Some aspects of this probability are straightforward to calculate and others are not. We see that $P(E \mid N_1) = 1$ because there is only one child in the family, so the child selected must be the eldest. In addition, $P(N_1) = p_1 = 0.1$. Lastly, we must calculate $P(E)$. This probability can be found by examining:

$$P(E) = P(E \mid N_1)P(N_1) + P(E \mid N_2)P(N_2) + P(E \mid N_3)P(N_3) + P(E \mid N_4)P(N_4)$$

In this case, we know $P(N_j) = p_j$. Also, we can easily deduce that $P(E \mid N_j) = 1/j$. For instance, there is obviously a $1/2$ probability of selecting the eldest child from a family of $j = 2$ children. Therefore, we evaluate and conclude:

$$P(E) = 1(0.1) + (1/2)(0.25) + (1/3)(0.35) + (1/4)(0.3) = 0.4167$$

$$P(N_1 \mid E) = \frac{P(E \mid N_1)P(N_1)}{P(E)} = \frac{(1)(0.1)}{0.4167} = 0.24$$

3.2

We use the same events that we defined in the previous part of the problem. We seek to calculate:

$$P(N_4 \mid E) = \frac{P(E \mid N_4)\,P(N_4)}{P(E)}$$

We can use $P(E) = 0.4167$, which we calculated in the previous part of the problem. We know from the problem statement that $P(E \mid N_4) = 1/4$ and $P(N_4) = p_4 = 0.3$. Therefore, we evaluate and conclude:

$$P(N_4 \mid E) = \frac{(0.25)(0.3)}{0.4167} = 0.18$$
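A short numeric check of both posteriors (my addition; the priors $p_j$ and likelihoods $1/j$ are taken from the solution above):

```python
# priors P(N_j) and likelihoods P(E | N_j) = 1/j from the problem
priors = {1: 0.1, 2: 0.25, 3: 0.35, 4: 0.3}

# law of total probability: P(E) = sum over j of (1/j) * p_j
p_e = sum(p / j for j, p in priors.items())

# Bayes' rule for each posterior P(N_j | E)
posterior = {j: (p / j) / p_e for j, p in priors.items()}
print(round(p_e, 4))           # 0.4167
print(round(posterior[1], 2))  # 0.24
print(round(posterior[4], 2))  # 0.18
```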

Chapter 3, Theoretical Exercise 8

Collaborators: No collaborators.

4.1

For this problem we apply our understanding of conditional probability. We see that our set of relevant events is $A, B, C$. We observe that by conditioning on $C$ and $C^c$, we get:

$$P(A) = P(A \mid C)P(C) + P(A \mid C^c)P(C^c)$$
$$P(B) = P(B \mid C)P(C) + P(B \mid C^c)P(C^c)$$

From the conditions given in the problem, we know that $P(A \mid C) > P(B \mid C)$ and $P(A \mid C^c) > P(B \mid C^c)$. Multiplying these by $P(C)$ and $P(C^c)$ respectively (both strictly positive, since the conditional probabilities are assumed to be defined) and adding, we see:

$$P(A \mid C)P(C) > P(B \mid C)P(C)$$
$$P(A \mid C^c)P(C^c) > P(B \mid C^c)P(C^c)$$
$$P(A \mid C)P(C) + P(A \mid C^c)P(C^c) > P(B \mid C)P(C) + P(B \mid C^c)P(C^c)$$

Thus, we can conclude that $P(A) > P(B)$.

4.2

The presence of the hint in the problem intuitively leads me to search for a counterexample. We will follow the hint provided, assigning the given definitions to events $A, B, C$. Recall:

A := the first die shows a 6
B := the second die shows a 6
C := the dice sum to 10

Computing, we get that $P(A) = 1/6$, $P(B) = 1/6$, and $P(C) = 3/36$. Let us examine the sample space of $C$. Using ordered pairs to represent the faces of the dice, we see that $C = \{(6,4), (5,5), (4,6)\}$. It follows that:

$$P(A \mid C) = 1/3 \qquad P(B \mid C) = 1/3$$

and,

$$P(A) = P(A \mid C)P(C) + P(A \mid C^c)P(C^c)$$
$$1/6 = (1/3)(3/36) + P(A \mid C^c)(33/36)$$
$$P(A \mid C^c) = 5/33$$

Without rigorous computation, we can also determine from the symmetry of the problem/events that $P(B \mid C^c) = P(A \mid C^c) = 5/33$. Thus $C$ makes both $A$ and $B$ more likely: $P(A \mid C) = 1/3 > 5/33 = P(A \mid C^c)$, and likewise for $B$. The problem asks whether it must then follow that $P(AB \mid C) > P(AB \mid C^c)$. We observe that, given our event definitions and computed probabilities, $P(AB \mid C) = 0$, because if both dice showed 6 the sum would be 12. We also observe that $P(AB \mid C^c) = (1/36)/(33/36) = 1/33$, the probability that both dice show 6 given that the sum is not 10. Because in this case $P(AB \mid C) < P(AB \mid C^c)$, this is a counterexample.
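A brute-force enumeration of the 36 equally likely outcomes confirms these values (my addition, not part of the original solution):

```python
from fractions import Fraction

outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(pred, given):
    """Exact conditional probability P(pred | given) over the uniform space."""
    cond = [o for o in outcomes if given(o)]
    return Fraction(sum(1 for o in cond if pred(o)), len(cond))

A = lambda o: o[0] == 6
B = lambda o: o[1] == 6
C = lambda o: sum(o) == 10
notC = lambda o: sum(o) != 10
AB = lambda o: A(o) and B(o)

print(prob(A, C), prob(A, notC))    # 1/3 5/33
print(prob(AB, C), prob(AB, notC))  # 0 1/33
```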

Problem 5

Collaborators: Ariel Schvartzman, Steven Fine.

5.1

First let us define the events and notation:

V := team wins
L := team loses
W := player's card is white
B := player's card is black
$C_1, C_2, C_3$ := Player 1's card, Player 2's card, and Player 3's card, respectively

Claim 5.1. Because each card is assigned by an independent fair coin flip, observing the teammates' cards gives no information about one's own:

$$P(C_1 = X_1 \mid C_2 = X_2, C_3 = X_3) = P(C_1 = X_1) = \frac{1}{2} \qquad \text{for all } X_1, X_2, X_3 \in \{B, W\}$$

5.2

Suppose that Player 1 always guesses black and the other two players always abstain. We examine the set of possible card distributions, $S$, where each distribution is denoted by the ordered triple $(C_1, C_2, C_3)$:

$$S = \{(B,B,B), (W,B,B), (B,W,B), (B,B,W), (W,W,B), (W,B,W), (B,W,W), (W,W,W)\}$$

where $|S| = 8$. Next, we examine the subset of $S$ for which this strategy wins. Because only Player 1's guess matters, this is the subset of distributions where Player 1's card is black. We get:

$$V = \{(B,B,B), (B,W,B), (B,B,W), (B,W,W)\}$$

where $|V| = 4$, and so we see that $P(V \mid \text{strategy}) = 4/8 = 0.5$.

5.3

We claim that if such a strategy existed, it would win with probability 3/4. There are 8 possible distributions of cards and 4 possible team actions. We suppose that each of the team actions occurs in two cases. We know that all of the distributions of cards have equal probabilities of occurring. We show a table to illustrate what the strategy might look like:

Table 1: Strategy Showing Potential Events and Outcomes

Event Index   Player 1 Guess   Player 2 Guess   Player 3 Guess   Result
iv            White            White            White            Lose
i             White            Abstain          Abstain          Win
ii            Abstain          White            Abstain          Win
iii           Abstain          Abstain          White            Win
iii           Abstain          Abstain          Black            Win
ii            Abstain          Black            Abstain          Win
i             Black            Abstain          Abstain          Win
iv            Black            Black            Black            Lose

The table shows us that of the 8 total outcomes, 6 result in a win and 2 result in a loss. Therefore, we conclude that such a strategy would succeed with probability 3/4. We can show this mathematically as well, by the law of total probability. We calculate the theoretical $P(V)$, the probability of winning, where the events are labeled $E_1, E_2, E_3, E_4$:

$$P(V) = 0.25\,P(V \mid E_1) + 0.25\,P(V \mid E_2) + 0.25\,P(V \mid E_3) + 0.25\,P(V \mid E_4) = 0.25 \cdot 1 + 0.25 \cdot 1 + 0.25 \cdot 1 + 0.25 \cdot 0 = 0.75 = \frac{3}{4}$$

We confirm by calculating $P(L)$, which we expect to be $\frac{1}{4}$:

$$P(L) = 0.25\,P(L \mid E_1) + 0.25\,P(L \mid E_2) + 0.25\,P(L \mid E_3) + 0.25\,P(L \mid E_4) = 0.25 \cdot 0 + 0.25 \cdot 0 + 0.25 \cdot 0 + 0.25 \cdot 1 = 0.25 = \frac{1}{4}$$

5.4

Each player observes his two teammates' cards. Therefore there are $2^2 = 4$ possible observations that he can make. These are:

$$M = \{(W,W), (B,W), (W,B), (B,B)\}$$

In each one of these outcomes, the player will have an assigned action. Let us say that if a player observes $(B,B)$ or $(W,W)$ he will guess, and if he sees $(B,W)$ or $(W,B)$ he will abstain. We can guarantee that at least one of the three players will see either $(W,W)$ or $(B,B)$ by the pigeonhole principle: because there are three cards and only two colors, some two of the cards must match, and the remaining player sees that matching pair. Therefore, we can guarantee that someone will guess on every outcome. However, when the case of either $(W,W,W)$ or $(B,B,B)$ arises, all three players will guess. Let us say that if a player observes $(B,B)$ he will guess White, and if a player observes $(W,W)$ he will guess Black. Now, I will explicitly show how this strategy plays out in terms of events i-iv as they are given in the problem statement:

Table 2: Strategy Showing Card Distributions and Resulting Action of Team

Card Distribution   Event Index   Player 1 Guess   Player 2 Guess   Player 3 Guess   Result
(B,B,B)             iv            White            White            White            Lose
(W,B,B)             i             White            Abstain          Abstain          Win
(B,W,B)             ii            Abstain          White            Abstain          Win
(B,B,W)             iii           Abstain          Abstain          White            Win
(W,W,B)             iii           Abstain          Abstain          Black            Win
(W,B,W)             ii            Abstain          Black            Abstain          Win
(B,W,W)             i             Black            Abstain          Abstain          Win
(W,W,W)             iv            Black            Black            Black            Lose

The table confirms that the strategy meets the conditions set in the problem. We see that:

$$P(\text{win}) = 6/8 = 3/4$$
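The strategy is also easy to verify exhaustively; the following sketch (my own, implementing the guessing rule described above) enumerates all eight equally likely distributions:

```python
from itertools import product

def play(cards):
    """Team wins iff at least one player guesses and every guess is correct."""
    results = []
    for i, own in enumerate(cards):
        others = [c for j, c in enumerate(cards) if j != i]
        if others[0] == others[1]:
            # seeing a matching pair, guess the opposite color
            guess = 'B' if others[0] == 'W' else 'W'
            results.append(guess == own)
        # otherwise the player abstains
    return bool(results) and all(results)

wins = sum(play(c) for c in product('BW', repeat=3))
print(wins, '/ 8')  # 6 / 8, i.e. probability 3/4
```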

5.5

The intuition is wrong because it claims that no player can do better than a 50% chance of guessing his own card correctly, which seems to imply that there is no strategy that wins the game more than half the time, leading to the naive strategy given in (b). However, it turns out that one can in fact use a combination of event-outcomes to create a superior strategy that wins 3/4 of the time. We can leverage the 50% chance by distributing it among the three players: each player is still correct on only half of the occasions when he guesses, but each player guesses in only half of the situations. The naive strategy does not account for the fact that the players can bundle their losses and maximize their wins by guessing incorrectly together and guessing correctly alone.
