
Coding Theory (EEL710): Tutorial 1

Note: Submit the homework on 12.08.13 at 11 AM in the classroom. Late
submissions are not allowed.

1) Consider repeatedly tossing a coin. Let X be the random variable denoting the
number of tosses required until the first head appears. Find
i) the entropy H(X) for a fair coin. (3)
ii) the entropy H(X) for a biased coin with probability p of a head. (4)
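As a numerical sanity check, X here is geometrically distributed, and the closed form H(X) = H_b(p)/p (with H_b the binary entropy function) can be compared against a truncated direct summation. The sketch below is illustrative only; the function names are my own.

```python
import math

def geometric_entropy(p, terms=10_000):
    """Entropy (bits) of X = number of tosses until the first head,
    P(X = k) = (1 - p)**(k - 1) * p, summed over the first `terms` terms."""
    h = 0.0
    for k in range(1, terms + 1):
        pk = (1 - p) ** (k - 1) * p
        if pk > 0:
            h -= pk * math.log2(pk)
    return h

def binary_entropy(p):
    """H_b(p) = -p log2(p) - (1 - p) log2(1 - p)."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Closed form: H(X) = H_b(p) / p; a fair coin gives exactly 2 bits.
print(geometric_entropy(0.5))        # ~2.0
print(binary_entropy(0.3) / 0.3)     # matches geometric_entropy(0.3)
```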

2) Let X be a random variable with probability mass function
p(x) = 1/4 for x = 0, 1,
p(x) = 1/2 for x = 2.

Y and Z are two random variables generated as follows: when X = 0, Y = Z = 0;
when X = 1, Y = 1 and Z = 0; when X = 2, Z = 1 while Y is chosen uniformly at
random from {0, 1}. Find the values of the following quantities: H(X), H(Y),
H(Z), H(Y|X), H(X, Y), H(X|Y), H(X, Z), H(X|Z), H(Y, Z), H(Z|Y). (20)

3) Let X, Y, and Z be jointly distributed random variables. Prove the following
inequalities and find the conditions for equality.
i) H(X, Y | Z) ≥ H(X | Z). (2)
ii) I(X, Y; Z) ≥ I(X; Z). (2)
iii) H(X, Y, Z) − H(X, Y) ≤ H(X, Z) − H(X). (2)
iv) I(X; Z | Y) ≥ I(Z; Y | X) − I(Z; Y) + I(X; Z). (2)
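Before attempting the proofs, it can be reassuring to confirm the four inequalities numerically on a concrete distribution, expanding every conditional entropy and mutual information in terms of joint entropies. The test distribution below is an arbitrary choice of mine, not part of the problem:

```python
import math
import itertools

# A fixed, strictly positive joint pmf on {0,1}^3 (an arbitrary test case).
w = [2, 1, 1, 3, 1, 2, 3, 3]
p = {xyz: wi / sum(w) for xyz, wi in zip(itertools.product((0, 1), repeat=3), w)}

def H(idx):
    """Entropy (bits) of the marginal on the coordinates in idx."""
    m = {}
    for xyz, q in p.items():
        key = tuple(xyz[i] for i in idx)
        m[key] = m.get(key, 0.0) + q
    return -sum(q * math.log2(q) for q in m.values() if q > 0)

X, Y, Z = 0, 1, 2
# i)  H(X,Y|Z) >= H(X|Z), i.e. H(X,Y,Z) >= H(X,Z)
assert H((X, Y, Z)) - H((Z,)) >= H((X, Z)) - H((Z,)) - 1e-12
# ii) I(X,Y;Z) >= I(X;Z)
I_XY_Z = H((X, Y)) + H((Z,)) - H((X, Y, Z))
I_X_Z = H((X,)) + H((Z,)) - H((X, Z))
assert I_XY_Z >= I_X_Z - 1e-12
# iii) H(X,Y,Z) - H(X,Y) <= H(X,Z) - H(X), i.e. H(Z|X,Y) <= H(Z|X)
assert H((X, Y, Z)) - H((X, Y)) <= H((X, Z)) - H((X,)) + 1e-12
# iv) I(X;Z|Y) >= I(Z;Y|X) - I(Z;Y) + I(X;Z)
lhs = H((X, Y)) + H((Y, Z)) - H((Y,)) - H((X, Y, Z))
I_ZY_X = H((X, Z)) + H((X, Y)) - H((X,)) - H((X, Y, Z))
I_ZY = H((Z,)) + H((Y,)) - H((Y, Z))
assert lhs >= I_ZY_X - I_ZY + I_X_Z - 1e-12
print("all four inequalities hold on this distribution")
```

A useful hint for part (iv): expanding both sides in joint entropies shows they are in fact always equal, which a numerical check like this one also suggests.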

4) A discrete memoryless source (DMS) outputs eight symbols {x1, x2, x3, x4, x5,
x6, x7, x8} with respective probabilities {0.35, 0.26, 0.19, 0.07, 0.04, 0.04,
0.03, 0.02}. Determine the Huffman code by combining
i) two symbols at a time. (4)
ii) three symbols at a time. (4)
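For part (i), the binary Huffman procedure (repeatedly merging the two least probable nodes) can be checked with a short script; the code below is a sketch of my own and reports codeword lengths rather than the codewords themselves, since tie-breaking can change the bits but not the optimal average length.

```python
import heapq
import itertools

def huffman_lengths(probs):
    """Binary Huffman: repeatedly merge the two least-probable nodes.
    Returns the codeword length of each symbol, in the order of `probs`."""
    counter = itertools.count()  # tie-breaker so heapq never compares lists
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for i in a + b:          # every symbol under the merge gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), a + b))
    return lengths

probs = [0.35, 0.26, 0.19, 0.07, 0.04, 0.04, 0.03, 0.02]
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
print(L, avg)    # average length 2.45 bits/symbol
```

For part (ii), a ternary code merges three nodes at a time; with eight symbols, note that one dummy symbol of probability zero must be added so that each merge step leaves a valid number of nodes.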

5) The figure below shows a non-symmetric binary channel. Prove that in this
case

I(X; Y) = Ω[q + α(1 − p − q)] − α Ω(p) − (1 − α) Ω(q),

where α = P(X = 0) and the function Ω(u) = u log2(1/u) + (1 − u) log2(1/(1 − u)). (7)

[Figure not reproduced: binary channel with crossover probabilities p and q.]
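Since the channel figure is missing here, the sketch below assumes the usual non-symmetric binary channel with P(Y=1|X=0) = p, P(Y=0|X=1) = q, and input distribution P(X=0) = α; under that assumption it compares the claimed closed form against I(X;Y) computed directly from the joint distribution.

```python
import math

def Omega(u):
    """Binary entropy function: Omega(u) = u log2(1/u) + (1-u) log2(1/(1-u))."""
    if u in (0.0, 1.0):
        return 0.0
    return -u * math.log2(u) - (1 - u) * math.log2(1 - u)

def mutual_info_direct(alpha, p, q):
    """I(X;Y) from the joint pmf of the assumed non-symmetric binary channel."""
    joint = {
        (0, 0): alpha * (1 - p), (0, 1): alpha * p,
        (1, 0): (1 - alpha) * q, (1, 1): (1 - alpha) * (1 - q),
    }
    py = {y: sum(v for (x, yy), v in joint.items() if yy == y) for y in (0, 1)}
    px = {0: alpha, 1: 1 - alpha}
    return sum(v * math.log2(v / (px[x] * py[y]))
               for (x, y), v in joint.items() if v > 0)

def mutual_info_formula(alpha, p, q):
    """The closed form to be proved: note P(Y=0) = q + alpha*(1 - p - q)."""
    return Omega(q + alpha * (1 - p - q)) - alpha * Omega(p) - (1 - alpha) * Omega(q)

for alpha, p, q in [(0.5, 0.1, 0.3), (0.3, 0.25, 0.05)]:
    assert abs(mutual_info_direct(alpha, p, q) - mutual_info_formula(alpha, p, q)) < 1e-12
```

The proof itself follows the same decomposition the code uses: I(X;Y) = H(Y) − H(Y|X), with H(Y) = Ω(P(Y=0)) and H(Y|X) = αΩ(p) + (1 − α)Ω(q).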
