
Detection and Estimation Theory

(H/W-1)

Jan. 12, 2018

Q1: Write the axioms of a sigma algebra.

Q2: Write the axioms of probability measure.

Q3: Write the definition of a random variable.

Q4: Let (Ω, F, P) be a probability space. Let (Γ, G) be a measurable space. Let X be a random variable X : Ω → Γ. Assume that Γ = R and G = B(R). Define the cumulative distribution function (c.d.f) FX(x) of X.

Q5: Write the properties of the c.d.f FX(x) of a random variable X defined as in Q4.

Q6: Show that the c.d.f FX(x) is a nondecreasing function of x, i.e., ∀ x2 ≥ x1, FX(x2) ≥ FX(x1).

Q7: Let X be a random variable defined as in Q4. Show that

P({ω ∈ Ω : x1 < X(ω) ≤ x2}) = FX(x2) − FX(x1).

Q8: Let X be a random variable defined as in Q4. We say that the function pX(x) is the probability density function (p.d.f) of X if the c.d.f FX(x) can be written as

FX(x) = ∫_{−∞}^{x} pX(t) dt.

We say that pX(x) is the derivative of FX(x), i.e., pX(x) = ∂FX(x)/∂x.

(a) Show that pX(x) is a nonnegative function, i.e., pX(x) ≥ 0,


(b) Show that

P({ω ∈ Ω : x1 < X(ω) ≤ x2}) = FX(x2) − FX(x1) = ∫_{x1}^{x2} pX(x) dx.
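As a quick numerical illustration of Q8(b), and not part of the assignment: the sketch below takes X ∼ N(0, 1) as a purely hypothetical example and checks that the difference of c.d.f values matches the integral of the p.d.f over (x1, x2]. It assumes Python with scipy installed.

# Sanity check of Q8(b) for the hypothetical choice X ~ N(0, 1).
from scipy.stats import norm
from scipy.integrate import quad

x1, x2 = -0.5, 1.3
cdf_diff = norm.cdf(x2) - norm.cdf(x1)     # FX(x2) - FX(x1)
pdf_integral, _ = quad(norm.pdf, x1, x2)   # integral of pX(x) from x1 to x2
print(cdf_diff, pdf_integral)              # the two numbers agree up to quadrature error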

Definition: Let (Ω, F, P) be a probability space. Let (Γ, G) be a measurable space. Assume that Γ = {a1, a2, · · · , aN}, i.e., a finite set, and G = 2^Γ (the power set of Γ). Let X be a discrete random variable X : Ω → Γ. The probability mass function (p.m.f) pX(x) of X is defined as

pX(an) = P({ω ∈ Ω : X(ω) = an}) ∀ n = 1, 2, · · · , N.
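A minimal sketch of this definition (the alphabet and probability values below are made up for illustration): a p.m.f on a finite Γ is just a nonnegative table that sums to one.

# Hypothetical p.m.f on the finite alphabet Γ = {a1, a2, a3}; values are illustrative only.
pmf = {"a1": 0.5, "a2": 0.3, "a3": 0.2}
assert all(p >= 0 for p in pmf.values())       # nonnegativity
assert abs(sum(pmf.values()) - 1.0) < 1e-12    # probabilities sum to 1
print(pmf["a2"])                               # P({ω ∈ Ω : X(ω) = a2}) = 0.3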

Q9: Let (Ω, F, P) be a probability space. Let (Γ, G) be a measurable space.


Assume that Γ = R2 and G = B(R2 ). Let Z = [X, Y ]T be a vector random
variable Z : Ω → Γ.
(a) Define the joint cumulative distribution function (c.d.f) FXY (x, y) of X
and Y ,
(b) Define the joint probability density function (p.d.f) pXY (x, y) as in Q8.

Q10: Let FXY(x, y) and pXY(x, y) be the joint c.d.f and joint p.d.f, respectively, defined as in Q9.
(a) Show that FXY(x, +∞) = FX(x) and FXY(+∞, y) = FY(y),
(b) Show that

pX(x) = ∫_{−∞}^{+∞} pXY(x, y) dy,
pY(y) = ∫_{−∞}^{+∞} pXY(x, y) dx.

Note: Q10(b) also holds when X and Y are discrete random variables and pXY(x, y) is the joint p.m.f, with the integrals replaced by sums.
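A minimal numerical sketch of the marginalization in Q10(b) for the discrete case mentioned in the note; the joint p.m.f values below are hypothetical and numpy is assumed to be available.

import numpy as np

# Hypothetical joint p.m.f pXY on a 2x2 alphabet (rows index x, columns index y).
p_xy = np.array([[0.10, 0.30],
                 [0.25, 0.35]])
p_x = p_xy.sum(axis=1)   # pX(x): sum the joint p.m.f over y
p_y = p_xy.sum(axis=0)   # pY(y): sum the joint p.m.f over x
print(p_x, p_y)          # [0.4 0.6] and [0.35 0.65]; each marginal sums to 1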

Definition: Let pXY(x, y) be defined as in Q9. The conditional probability density functions are defined as

pY/X(y/x) = pXY(x, y) / pX(x)  when pX(x) > 0,
pX/Y(x/y) = pXY(x, y) / pY(y)  when pY(y) > 0.
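Continuing the hypothetical 2x2 joint p.m.f from the sketch above, the discrete analogue of this definition divides each row of the joint p.m.f by the corresponding marginal:

import numpy as np

p_xy = np.array([[0.10, 0.30],
                 [0.25, 0.35]])
p_x = p_xy.sum(axis=1, keepdims=True)   # pX(x) as a column vector
p_y_given_x = p_xy / p_x                # pY/X(y/x) = pXY(x, y) / pX(x); pX(x) > 0 here
print(p_y_given_x.sum(axis=1))          # each conditional distribution sums to 1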

Definition: Let X and Y be random variables defined as in Q9. We say that X and Y are independent if

FXY(x, y) = FX(x)FY(y) for all (x, y);

an equivalent definition is

pXY(x, y) = pX(x)pY(y) for all (x, y).
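In the discrete analogue, independence means the joint p.m.f is the outer product of its marginals; the check below reuses the hypothetical joint p.m.f from the earlier sketches (which turns out not to factor, so X and Y are dependent there).

import numpy as np

p_xy = np.array([[0.10, 0.30],
                 [0.25, 0.35]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)
print(np.allclose(p_xy, np.outer(p_x, p_y)))   # False: this pXY does not factor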

Q11: Let (Ω, F, P) be a probability space. Let X be a real-valued discrete random variable X : Ω → {−1, 1}. Let pX(−1) = P({ω ∈ Ω : X(ω) = −1}) = 1/2 and pX(+1) = P({ω ∈ Ω : X(ω) = +1}) = 1/2. Let Y be a real-valued Gaussian (normal) random variable (i.e., Y ∼ N(0, 1)) defined on the same probability space (Ω, F, P). Assume that X and Y are independent. Let Z = X + Y be a random variable.
(a) Find the conditional cumulative distribution functions (c.d.f) FZ/X(z/{ω ∈ Ω : X(ω) = −1}) and FZ/X(z/{ω ∈ Ω : X(ω) = +1}),
(b) Find the cumulative distribution function (c.d.f) FZ(z),
(c) Find the conditional probability density functions (p.d.f) pZ/X(z/{ω ∈ Ω : X(ω) = −1}) and pZ/X(z/{ω ∈ Ω : X(ω) = +1}),
(d) Find the probability density function (p.d.f) pZ(z).
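A simulation sketch of the setup in Q11 (not a solution): it draws Z = X + Y and empirically checks the total-probability relation FZ(z) = (1/2) FZ/X(z/X = −1) + (1/2) FZ/X(z/X = +1), against which the closed-form answers to (a)–(b) can be compared. It assumes numpy; the evaluation point z0 is arbitrary.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.choice([-1.0, 1.0], size=n)   # X uniform on {-1, +1}
y = rng.standard_normal(n)            # Y ~ N(0, 1), independent of X
z = x + y                             # Z = X + Y

z0 = 0.5                              # arbitrary evaluation point
print(np.mean(z <= z0))               # empirical FZ(z0)
print(0.5 * np.mean(z[x == -1.0] <= z0) + 0.5 * np.mean(z[x == 1.0] <= z0))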

Q12: Let X and Y be binary valued (discrete) random variables defined on the same probability space (Ω, F, P). Let pX(0) = P({ω ∈ Ω : X(ω) = 0}) = 1/2 and pX(1) = P({ω ∈ Ω : X(ω) = 1}) = 1/2. Let pY(0) = P({ω ∈ Ω : Y(ω) = 0}) = (1 − α) and pY(1) = P({ω ∈ Ω : Y(ω) = 1}) = α, where 0 ≤ α ≤ 1. Assume that X and Y are independent. Let Z = X XOR Y.
(a) Find pZ/X(0/0) = P({ω ∈ Ω : Z(ω) = 0}/{ω ∈ Ω : X(ω) = 0}), pZ/X(1/0), pZ/X(0/1), and pZ/X(1/1),
(b) Find P({ω ∈ Ω : Z(ω) = 0}) and P({ω ∈ Ω : Z(ω) = 1}).
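A simulation sketch of Q12 (not a solution), with a hypothetical value α = 0.1 and assuming numpy: it estimates the conditional p.m.f asked for in (a) and the marginal probabilities in (b) empirically, so the derived answers can be checked against it.

import numpy as np

rng = np.random.default_rng(0)
alpha = 0.1                               # hypothetical value of α
n = 200_000
x = rng.integers(0, 2, size=n)            # X uniform on {0, 1}
y = (rng.random(n) < alpha).astype(int)   # Y = 1 with probability α, independent of X
z = x ^ y                                 # Z = X XOR Y

print(np.mean(z[x == 0] == 1))            # empirical pZ/X(1/0); compare with (a)
print(np.mean(z == 1))                    # empirical P({ω : Z(ω) = 1}); compare with (b)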
