INFORMATION THEORY
Answer the quiz questions
5. If M messages are equally likely and independent, the probability of occurrence of each message will be (points: 1)
(a) 1 + M
(b) 1 = M
(c) 1/M
(d) 1 > M
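As a quick illustration of question 5, the minimal Python sketch below (M = 8 is an assumed example value) shows that M equally likely messages each occur with probability 1/M:

```python
# For M equally likely, independent messages, each message
# has probability 1/M (M = 8 is an assumed example value).
M = 8
p = 1 / M
print(p)                              # 0.125
print(sum(1 / M for _ in range(M)))  # probabilities sum to 1.0
```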
6. If there are M equally likely and independent messages, where M = 2^N, then the amount of information carried by each message will be (points: 1)
(a) N bits
(b) N + 1 bits
(c) N - 1 bits
(d) None
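A sketch for question 6, using the convention stated there that M = 2^N, so each message carries I = log2(M) = N bits (N = 4 is an assumed example value):

```python
import math

# With M = 2**N equally likely messages, the information carried
# by each message is I = log2(M) = N bits (N = 4 assumed here).
N = 4
M = 2 ** N
I = math.log2(M)
print(I)  # 4.0, i.e. N bits per message
```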
9. Information rate is defined as (points: 1)
(a) Information per unit time
(b) Average number of bits of information per second
(c) rH
(d) All of the above
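To make question 9 concrete: the information rate is R = rH, where r is the message rate in messages per second and H is the entropy in bits per message. A minimal sketch with assumed source probabilities and an assumed message rate:

```python
import math

# Information rate R = r * H: average bits of information per second.
# r = message rate (messages/second), H = entropy (bits/message).
# The source probabilities and rate below are assumed example values.
probs = [0.5, 0.25, 0.25]                  # assumed source probabilities
H = -sum(p * math.log2(p) for p in probs)  # entropy, bits/message
r = 1000                                   # assumed 1000 messages/second
R = r * H
print(H, R)  # 1.5 bits/message, 1500.0 bits/second
```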
11. Calculate the entropy of a certain message and of a most rare message (points: 1)
(a) 1, 0
(b) 0, 0
(c) 1, 1
(d) None of the above
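For question 11, the entropy contribution of a message with probability p is -p log2 p, which is 0 for a certain message (p = 1) and also tends to 0 for a most rare message (p -> 0). A small sketch with assumed example probabilities:

```python
import math

def h_term(p):
    # Entropy contribution -p*log2(p); defined as 0 at p = 0 and p = 1.
    return -p * math.log2(p) if 0.0 < p < 1.0 else 0.0

# Certain message (p = 1): contributes 0 bits of entropy.
print(h_term(1.0))  # 0.0
# Very rare message (p -> 0): the contribution also tends to 0.
for p in (1e-3, 1e-6, 1e-9):  # assumed example probabilities
    print(p, h_term(p))
```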
14. The relation between entropy and mutual information is (points: 1)
(a) I(X;Y) = H(X) - H(Y)
(b) I(X;Y) = H(X/Y) - H(Y/X)
(c) I(X;Y) = H(X) - H(X/Y)
(d) I(X;Y) = H(Y) - H(X)
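Question 14 can be checked numerically: with H(X/Y) denoting the conditional entropy H(X|Y), the identity I(X;Y) = H(X) - H(X/Y) holds for any joint distribution. A sketch using an assumed toy joint distribution:

```python
import math

# Assumed toy joint distribution p(x, y) over two binary variables.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
px = {x: sum(p for (xx, _), p in pxy.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in pxy.items() if yy == y) for y in (0, 1)}

# H(X|Y) = H(X,Y) - H(Y), so I(X;Y) = H(X) - H(X|Y).
H_x_given_y = H(pxy) - H(py)
I = H(px) - H_x_given_y
print(I)  # mutual information in bits (~0.278 for this distribution)
```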
16. The mutual information expressed in terms of the joint entropy H(X,Y) is (points: 1)
(a) H(X,Y)
(b) H(X) + H(Y) - H(X,Y)
(c) H(X) + H(Y) + H(X,Y)
(d) None
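Similarly for question 16, the same assumed toy distribution verifies that I(X;Y) = H(X) + H(Y) - H(X,Y) gives the same value as the sketch above:

```python
import math

# Assumed toy joint distribution (same as the previous sketch).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

px = {x: sum(p for (xx, _), p in pxy.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in pxy.items() if yy == y) for y in (0, 1)}

# I(X;Y) = H(X) + H(Y) - H(X,Y): the information shared by X and Y.
I = H(px) + H(py) - H(pxy)
print(I)  # matches H(X) - H(X|Y) from the previous sketch
```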
19. Entropy, H = (points: 1)
(a) Total information / Number of messages
(b) Total information > Number of messages
(c) Total information < Number of messages
(d) Total information = Number of messages
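For question 19, when messages are equally likely each carries log2(M) bits, so the total information of M messages is M log2(M) bits and H = total information / number of messages. A sketch with an assumed M:

```python
import math

# For M equally likely messages (M = 8 assumed), each carries
# log2(M) bits, so the total information of M messages is
# M * log2(M) bits, and H = total information / number of messages.
M = 8
total_information = M * math.log2(M)  # bits across all M messages
H = total_information / M
print(H, math.log2(M))  # both 3.0 bits/message
```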