
Cheat Sheet

Some Theorems Involving Sets:

(1)  If A ⊂ B and B ⊂ C, then A ⊂ C
(2)  A ∪ B = B ∪ A
(3)  A ∪ (B ∪ C) = (A ∪ B) ∪ C = A ∪ B ∪ C
(4)  A ∩ B = B ∩ A
(5)  A ∩ (B ∩ C) = (A ∩ B) ∩ C = A ∩ B ∩ C
(6)  A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
(7)  A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
(8)  A − B = A ∩ B′
(9)  If A ⊂ B, then A′ ⊃ B′ (or B′ ⊂ A′)
(10) A ∪ ∅ = A, A ∩ ∅ = ∅
(11) A ∪ U = U, A ∩ U = A
(12a) (A ∪ B)′ = A′ ∩ B′
(12b) (A ∩ B)′ = A′ ∪ B′
(13) A = (A ∩ B) ∪ (A ∩ B′)
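A few of these identities can be spot-checked mechanically with Python's built-in set operations. This is only an illustrative sketch: the universal set U below is a small hypothetical choice, and `complement` is a helper defined for this example.

```python
# Spot-checking some of the set identities above with Python sets.
# U is a small hypothetical universal set chosen for illustration.
U = set(range(10))
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def complement(S):
    """Complement S' relative to the universal set U."""
    return U - S

# De Morgan's laws: (A ∪ B)' = A' ∩ B'  and  (A ∩ B)' = A' ∪ B'
assert complement(A | B) == complement(A) & complement(B)
assert complement(A & B) == complement(A) | complement(B)
# Difference as intersection with a complement: A − B = A ∩ B'
assert A - B == A & complement(B)
# Decomposition of A: A = (A ∩ B) ∪ (A ∩ B')
assert A == (A & B) | (A & complement(B))
print("all identities hold")
```

The checks hold for any choice of A, B ⊂ U, since they are instances of the general theorems.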

Permutations and Combinations:

Fundamental principle of counting: If an operation consists of k steps, of which the first can be done in n1 ways, for each of these the second step can be done in n2 ways, for each of the first two the third step can be done in n3 ways, and so forth, then the whole operation can be done in n1 · n2 · · · nk ways.
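The multiplication principle can be made concrete by enumerating a small hypothetical 3-step choice and counting the outcomes:

```python
# Sketch of the multiplication principle: a 3-step operation with
# n1 = 2, n2 = 3, n3 = 4 options per step yields 2 * 3 * 4 = 24 outcomes.
from itertools import product

step1 = ["a", "b"]            # n1 = 2 ways
step2 = ["x", "y", "z"]       # n2 = 3 ways
step3 = [1, 2, 3, 4]          # n3 = 4 ways

outcomes = list(product(step1, step2, step3))
assert len(outcomes) == 2 * 3 * 4
print(len(outcomes))  # 24
```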

The number of permutations of n distinct objects taken r at a time is nPr = n!/(n − r)! for r = 0, 1, 2, …, n. The number of permutations of n objects of which n1 are of one kind, n2 are of a second kind, …, nk are of a k-th kind, where n = n1 + n2 + … + nk, is nPn1,n2,…,nk = n!/(n1! n2! · · · nk!).

The number of combinations of n distinct objects taken r at a time is C(n, r) = n!/(r! (n − r)!) for all r = 0, 1, 2, …, n.

Binomial coefficient: (x + y)^n = Σ_{r=0}^{n} C(n, r) x^{n−r} y^r.
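These formulas can be checked directly against the standard library's `math.perm` and `math.comb` (Python 3.8+); the numeric values below are a small hypothetical case, not part of the original sheet.

```python
# Checking the counting formulas above with math.perm / math.comb,
# plus one numeric instance of the binomial theorem.
import math

n, r = 5, 2
# nPr = n! / (n - r)!
assert math.perm(n, r) == math.factorial(n) // math.factorial(n - r)      # 20
# C(n, r) = n! / (r! (n - r)!)
assert math.comb(n, r) == math.factorial(n) // (math.factorial(r) * math.factorial(n - r))  # 10

# Permutations of objects of several kinds, n!/(n1! n2! ... nk!):
# e.g. 4 objects A, A, B, C with multiplicities 2, 1, 1 give 12 arrangements.
assert math.factorial(4) // (math.factorial(2) * math.factorial(1) * math.factorial(1)) == 12

# Binomial theorem: (x + y)^n = sum_{r=0}^{n} C(n, r) x^(n-r) y^r
x, y, n = 2, 3, 4
assert sum(math.comb(n, r) * x ** (n - r) * y ** r for r in range(n + 1)) == (x + y) ** 4  # 625
```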
Some Important Theorems on Probability: Let Ai (i = 1, 2, 3, …, n) be events of the sample space S; then

(1) If A1 ⊂ A2, then P(A1) ≤ P(A2).

(2) For every event A, 0 ≤ P(A) ≤ 1.

(3) If A′ is the complement of A, then P(A′) = 1 − P(A).

(4) P(∅) = 0.

(5) If A and B are any two events, then P(A ∪ B) = P(A) + P(B) − P(A ∩ B). More generally, if A1, A2, A3 are any three events, then P(A1 ∪ A2 ∪ A3) = P(A1) + P(A2) + P(A3) − P(A1 ∩ A2) − P(A2 ∩ A3) − P(A3 ∩ A1) + P(A1 ∩ A2 ∩ A3).
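For a finite equally-likely sample space these theorems reduce to counting, so they can be verified exactly with `fractions.Fraction`. The die and events below are a hypothetical example chosen for illustration.

```python
# Numeric check of the probability theorems on one roll of a fair die.
from fractions import Fraction

S = set(range(1, 7))                  # sample space of a fair die
def P(E):
    """Probability of event E ⊂ S under equally likely outcomes."""
    return Fraction(len(E), len(S))

A = {2, 4, 6}                         # "even"
B = {4, 5, 6}                         # "at least 4"
# Inclusion-exclusion (theorem 5): P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)   # 4/6 = 3/6 + 3/6 − 2/6
# Complement rule (theorem 3): P(A') = 1 − P(A)
assert P(S - A) == 1 - P(A)
# Impossible event (theorem 4): P(∅) = 0
assert P(set()) == 0
```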

Conditional probability of B given A: P(B|A) = P(A ∩ B)/P(A), or P(A ∩ B) = P(A)P(B|A). For any three events A1, A2, A3, we have P(A1 ∩ A2 ∩ A3) = P(A1)P(A2|A1)P(A3|A1 ∩ A2).

Independent Events: Events A and B are independent if and only if P(A ∩ B) = P(A)P(B), or in other words P(B|A) = P(B). If A and B are independent, then A and B′ are also independent.
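Both definitions can be exercised on the same fair-die model; the specific events below are a hypothetical choice that happens to be independent.

```python
# Sketch: conditional probability and independence on one fair-die roll.
from fractions import Fraction

S = set(range(1, 7))
def P(E):
    return Fraction(len(E), len(S))

A = {2, 4, 6}           # "even"
B = {1, 2}              # "at most 2"
# P(B|A) = P(A ∩ B) / P(A)
P_B_given_A = P(A & B) / P(A)
assert P_B_given_A == Fraction(1, 3)
# Here P(B|A) = P(B) = 1/3, so A and B are independent,
# and indeed P(A ∩ B) = P(A) P(B).
assert P_B_given_A == P(B)
assert P(A & B) == P(A) * P(B)
```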

Bayes' Theorem: If B1, B2, …, Bk constitute a partition of a sample space S and P(Bi) ≠ 0 for i = 1, 2, …, k, then for any event A in S such that P(A) ≠ 0,

P(Br|A) = P(Br)P(A|Br) / Σ_{i=1}^{k} P(Bi)P(A|Bi)   for r = 1, 2, …, k.
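A standard way to exercise the formula is a two-machine quality-control setup; the machines, mix, and defect rates below are hypothetical numbers invented for this sketch.

```python
# Sketch of Bayes' theorem: machine B1 makes 60% of all items with a 1%
# defect rate, B2 makes 40% with a 3% defect rate; A = "item is defective".
from fractions import Fraction

P_B = {"B1": Fraction(60, 100), "B2": Fraction(40, 100)}        # partition of S
P_A_given_B = {"B1": Fraction(1, 100), "B2": Fraction(3, 100)}

# P(Br|A) = P(Br) P(A|Br) / sum_i P(Bi) P(A|Bi)
total = sum(P_B[b] * P_A_given_B[b] for b in P_B)
posterior = {b: P_B[b] * P_A_given_B[b] / total for b in P_B}

assert posterior["B1"] == Fraction(1, 3)   # 0.006 / 0.018
assert posterior["B2"] == Fraction(2, 3)   # 0.012 / 0.018
assert sum(posterior.values()) == 1        # posteriors over a partition sum to 1
```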

Probability distributions:
I. Discrete case.
Discrete probability distributions (or probability functions): If X is a discrete random variable, the function given by f(x) = P(X = x) for each x within the range of X is called the probability distribution of X. f(x) is a probability distribution if and only if (1) f(x) ≥ 0, and (2) Σ_x f(x) = 1.

Distribution functions for discrete random variables: If X is a discrete random variable, the function given by F(x) = P(X ≤ x) = Σ_{t≤x} f(t) for −∞ < x < ∞, where f(t) is the value of the probability distribution of X at t, is called the distribution function of X. If F(x) is a distribution function, then (1) F(−∞) = 0 and F(∞) = 1, and (2) if a ≤ b, then F(a) ≤ F(b) for any real numbers a and b.

If the range of a random variable X consists of the values x1 < x2 < x3 < … < xn, then f(x1) = F(x1) and f(xi) = F(xi) − F(xi−1) for i = 2, 3, …, n.
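The defining properties of f and F, and the difference relation f(xi) = F(xi) − F(xi−1), can all be checked on a concrete distribution; a fair die is used here as an illustrative choice.

```python
# Sketch: pmf and distribution function of a fair die, illustrating
# f(x) >= 0, sum f(x) = 1, and f(x_i) = F(x_i) - F(x_{i-1}).
from fractions import Fraction

f = {x: Fraction(1, 6) for x in range(1, 7)}     # probability distribution
assert all(p >= 0 for p in f.values())           # (1) f(x) >= 0
assert sum(f.values()) == 1                      # (2) sum over x of f(x) = 1

def F(x):
    """Distribution function F(x) = P(X <= x)."""
    return sum(p for t, p in f.items() if t <= x)

assert F(0) == 0 and F(6) == 1                   # limits of F
assert F(3) == Fraction(1, 2)
xs = sorted(f)
assert f[xs[0]] == F(xs[0])
assert all(f[xs[i]] == F(xs[i]) - F(xs[i - 1]) for i in range(1, len(xs)))
```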

II. Continuous case.
Continuous probability density functions: A function with values f(x), defined over the set of all real numbers, is called a probability density function of the continuous random variable X if and only if P(a < X < b) = ∫_a^b f(x) dx, where f(x) has the properties (1) f(x) ≥ 0, and (2) ∫_{−∞}^{∞} f(x) dx = 1.

Distribution functions for continuous random variables: If X is a continuous random variable and the value of its probability density at t is f(t), then the function given by F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt is called the distribution function of X. If F(x) is a distribution function, then (1) F(−∞) = 0 and F(∞) = 1, and (2) if a ≤ b, then F(a) ≤ F(b) for any real numbers a and b.

If f(x) and F(x) are the values of the probability density and the distribution function of the continuous random variable X at x, then P(a < X < b) = F(b) − F(a) for any real a and b with a ≤ b, and f(x) = dF(x)/dx where the derivative exists.
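As a concrete illustration (the density f(x) = 2x on [0, 1] is a hypothetical choice, not from the sheet), the relation P(a < X < b) = F(b) − F(a) can be compared against a numerical approximation of the integral of f:

```python
# Sketch: the density f(x) = 2x on [0, 1] (0 elsewhere) has distribution
# function F(x) = x^2 there; P(a < X < b) = F(b) - F(a) should match
# a Riemann-sum approximation of the integral of f over (a, b).
def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def F(x):
    if x < 0.0:
        return 0.0
    if x > 1.0:
        return 1.0
    return x * x

a, b, n = 0.2, 0.7, 100_000
width = (b - a) / n
riemann = sum(f(a + (i + 0.5) * width) * width for i in range(n))  # midpoint rule
assert abs(riemann - (F(b) - F(a))) < 1e-9   # both equal 0.45
assert F(-10.0) == 0.0 and F(10.0) == 1.0    # F(-inf) = 0, F(inf) = 1
```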

Joint distributions:
Joint probability distributions: If X and Y are discrete random variables, the function given by f(x, y) = P(X = x, Y = y) for each pair of values (x, y) within the range of X and Y is called the joint probability distribution of X and Y. f(x, y) is a joint probability distribution if and only if (1) f(x, y) ≥ 0 and (2) Σ_x Σ_y f(x, y) = 1.

Joint distribution functions: If X and Y are discrete random variables, the function given by F(x, y) = P(X ≤ x, Y ≤ y) = Σ_{s≤x} Σ_{t≤y} f(s, t) for −∞ < x < ∞ and −∞ < y < ∞, where f(s, t) is the value of the joint probability distribution of X and Y at (s, t), is called the joint distribution function of X and Y. If F(x, y) is the value of the joint distribution function of two discrete random variables X and Y at (x, y), then (1) F(−∞, −∞) = 0, (2) F(∞, ∞) = 1, and (3) if a < b and c < d, then F(a, c) ≤ F(b, d).

Marginal probability functions: If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), the function given by g(x) = Σ_y f(x, y) for each x within the range of X is called the marginal distribution of X. Correspondingly, the function given by h(y) = Σ_x f(x, y) for each y within the range of Y is called the marginal distribution of Y.

Conditional distribution: If f(x, y) is the value of the joint probability distribution of the discrete random variables X and Y at (x, y), and h(y) is the value of the marginal distribution of Y at y, the function given by f(x|y) = f(x, y)/h(y) (h(y) ≠ 0) for each x within the range of X is called the conditional distribution of X given Y = y. Correspondingly, if g(x) is the value of the marginal distribution of X at x, the function w(y|x) = f(x, y)/g(x) (g(x) ≠ 0) for each y within the range of Y is called the conditional distribution of Y given X = x.

Independent random variables: Suppose that X and Y are discrete random variables. If the events X = x and Y = y are independent for all x and y, then we say that X and Y are independent random variables. In that case, P(X = x, Y = y) = P(X = x) · P(Y = y), or equivalently f(x, y) = f(x|y) · h(y) = g(x) · h(y). Conversely, if for all x and y the joint probability function f(x, y) can be expressed as the product of a function of x alone and a function of y alone (which are then the marginal probability functions of X and Y), then X and Y are independent. If, however, f(x, y) cannot be so expressed, then X and Y are dependent.
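All of the joint-distribution machinery fits in a few lines for a minimal hypothetical example: two independent fair coins coded 0/1, with the marginals, a conditional distribution, and the factorization test for independence computed from the joint pmf.

```python
# Sketch: joint pmf of two independent fair coins coded 0/1,
# with marginals g, h, conditional f(x|y), and an independence check.
from fractions import Fraction

f = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}  # joint pmf
assert sum(f.values()) == 1                                   # (2) sums to 1

def g(x):
    """Marginal distribution of X: g(x) = sum over y of f(x, y)."""
    return sum(p for (s, t), p in f.items() if s == x)

def h(y):
    """Marginal distribution of Y: h(y) = sum over x of f(x, y)."""
    return sum(p for (s, t), p in f.items() if t == y)

def f_x_given_y(x, y):
    """Conditional distribution f(x|y) = f(x, y) / h(y), h(y) != 0."""
    return f[(x, y)] / h(y)

assert g(0) == Fraction(1, 2) and h(1) == Fraction(1, 2)
assert f_x_given_y(0, 1) == Fraction(1, 2)
# Independence: the joint pmf factors as g(x) * h(y) for every pair.
assert all(f[(x, y)] == g(x) * h(y) for (x, y) in f)
```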

Mathematical expectation:
I. Discrete case. Let f(x) be the probability distribution of X; then the expected value of X is E(X) = Σ_x x f(x). Let X be a discrete random variable with probability distribution f(x); the expected value of g(X) is given by E[g(X)] = Σ_x g(x) f(x).

II. Continuous case. Let f(x) be the probability density function of X; then the expected value of X is E(X) = ∫_{−∞}^{∞} x f(x) dx. Let X be a continuous random variable with probability density f(x); the expected value of g(X) is given by E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx.
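The discrete formulas translate directly into sums; the fair die below is a hypothetical example, with g(x) = x² as the transformed variable.

```python
# Sketch: expectation for one fair-die roll, E(X) = sum x f(x) = 7/2,
# and E[g(X)] for g(x) = x^2, computed exactly with Fractions.
from fractions import Fraction

f = {x: Fraction(1, 6) for x in range(1, 7)}      # pmf of a fair die
E_X = sum(x * p for x, p in f.items())
assert E_X == Fraction(7, 2)                      # 3.5
E_X2 = sum(x * x * p for x, p in f.items())       # E[g(X)] with g(x) = x^2
assert E_X2 == Fraction(91, 6)                    # (1+4+9+16+25+36)/6
```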
