
Advanced Digital Communication

Information and Entropy


1. Let p denote the probability of some event. Plot the amount of information gained by the occurrence of this event for 0 ≤ p ≤ 1. [9.1 Haykin]

2. A source emits one of four symbols s_0, s_1, s_2, and s_3 with probabilities 1/3, 1/6, 1/4, and 1/4, respectively. The successive symbols emitted by the source are statistically independent. Calculate the entropy of the source. [9.3 Haykin]
3. The sample function of a Gaussian process of zero mean and unit variance is
uniformly sampled and then applied to a uniform quantizer having the input-output
amplitude characteristic shown in Fig.1. Calculate the entropy of the quantizer output.
[9.5 Haykin]
[Fig. 1: Staircase input-output characteristic of the uniform quantizer, with output (representation) levels ±0.5 and ±1.5 and input decision thresholds at -1, 0, and 1.]
Given: (1/√(2π)) ∫_x^∞ exp(-y²/2) dy = 0.5 for x = 0, and 0.1611 for x = 1.

4. Consider a discrete memoryless source with source alphabet S = {s_1, s_2, s_3} and source statistics {0.7, 0.15, 0.15}.


(a) Calculate the entropy of the source.
(b) Calculate the entropy of the second-order extension of the source. [9.7 Haykin]

Mutual Information and Channel Capacity


5. A nonsymmetric binary channel is shown in Figure 1.
(a) Find P(Y = 0) and P(Y = 1) when P(X = 0) = 1/4, P(X = 1) = 3/4, P(Y = 0 | X = 0) = 0.75, and P(Y = 1 | X = 1) = 0.9.
(b) Find H(X), H(Y), H(X|Y), and H(Y|X).
(c) Find the rate of information transmission over the channel.
[Figure 1: Nonsymmetric binary channel with input and output alphabets {0, 1}; symbol rate r_s = 1000 symbols/sec.]
6. Find the rate of information transmission of the discrete channel shown in Figure 2.
[Fig. 2: Discrete channel with input X ∈ {1, 2, 3} and output Y ∈ {1, 2, 3}; P(Y = i | X = i) = 0.8 for each i, with the remaining transition probabilities of 0.1 and 0.2 on the other outputs as shown; P(X = 1) = P(X = 2) = P(X = 3) = 1/3; symbol rate r_s = 1000 symbols/sec.]
7. Two binary symmetric channels are connected in cascade. Find the overall channel
capacity of the cascaded connection, assuming that both channels have the same
transition probability as shown below. [9.22 Haykin]

8. An analog signal has a 4 kHz bandwidth. The signal is sampled at 2.5 times the
Nyquist rate and each sample is quantized into one of 256 equally likely levels.
Assume that the successive samples are statistically independent.
(a) What is the information rate of this source?

(b) Can the output of this source be transmitted without errors over a Gaussian
channel with a bandwidth of 50 kHz and S/N ratio of 23 dB?
(c) Can the output of this source still be transmitted without errors if the S/N ratio drops to 10 dB?
9. A black-and-white television picture may be viewed as consisting of approximately 3 × 10^5 elements, each of which may occupy one of 10 distinct brightness levels with
equal probability. Assume that (1) the rate of transmission is 30 picture frames per
second, and (2) the signal-to-noise ratio is 30 dB. Using the information capacity
theorem, calculate the minimum bandwidth required to support the transmission of
the resulting video signal.
(Note: As a matter of interest, commercial television transmissions actually employ a bandwidth of 4.2 MHz, which fits into an allocated bandwidth of 6 MHz.)
[9.31 Haykin]

Solution
1. I = -log2(p) bits. The information gained grows without bound as p → 0 and decreases monotonically to 0 at p = 1.
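A minimal Python sketch of this plot (assuming numpy and matplotlib are available):

```python
import numpy as np
import matplotlib.pyplot as plt

# Information gained by an event of probability p: I(p) = -log2(p).
p = np.linspace(0.01, 1.0, 500)   # avoid p = 0, where I(p) -> infinity
I = -np.log2(p)

plt.plot(p, I)
plt.xlabel("p")
plt.ylabel("I(p)  [bits]")
plt.title("Information gained vs. probability of the event")
plt.grid(True)
plt.show()
```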

2. H = -Σ_{i=0}^{3} p_i log2 p_i = (1/3) log2 3 + (1/6) log2 6 + (1/4) log2 4 + (1/4) log2 4 = 1.959 bits
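A quick Python check of this value:

```python
from math import log2

# Entropy of the four-symbol source s0..s3 (problem 2).
probs = [1/3, 1/6, 1/4, 1/4]
H = -sum(p * log2(p) for p in probs)
print(f"H = {H:.3f} bits")   # ~1.959 bits
```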

3. H = -Σ_{i=0}^{3} p(x_i) log2 p(x_i), where x_i denotes a representation level of the quantizer.


p(x_0) = p(x_3) = P(1 < input < ∞) = (1/√(2π)) ∫_1^∞ exp(-y²/2) dy = 0.1611

p(x_1) = p(x_2) = P(0 < input < 1) = (1/√(2π)) ∫_0^1 exp(-y²/2) dy = 0.5 - 0.1611 = 0.3389

Therefore, H = 1.91 bits
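A short Python check, reusing the tail probability 0.1611 given in the problem statement rather than recomputing the Gaussian integral:

```python
from math import log2

# Quantizer output entropy (problem 3), using P(input > 1) = 0.1611
# from the "Given" data above.
p_outer = 0.1611            # P(output = +1.5) = P(output = -1.5)
p_inner = 0.5 - p_outer     # P(output = +0.5) = P(output = -0.5)

probs = [p_outer, p_inner, p_inner, p_outer]
H = -sum(p * log2(p) for p in probs)
print(f"H = {H:.2f} bits")  # ~1.91 bits
```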


4. (a) H(S) = -Σ_{i=0}^{2} p_i log2 p_i = -(0.7 log2 0.7 + 0.15 log2 0.15 + 0.15 log2 0.15) ≈ 1.181 bits/symbol

(b) H(S²) = 2 H(S) ≈ 2.363 bits/symbol
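A Python sketch that computes H(S) directly and verifies the second-order extension by enumerating symbol pairs:

```python
from math import log2
from itertools import product

# Entropy of the source S (problem 4) and of its second-order extension S^2.
probs = [0.7, 0.15, 0.15]

H = -sum(p * log2(p) for p in probs)

# Second-order extension: pairs of statistically independent symbols.
H2 = -sum(p1 * p2 * log2(p1 * p2) for p1, p2 in product(probs, repeat=2))

print(f"H(S)   = {H:.3f} bits/symbol")    # ~1.181
print(f"H(S^2) = {H2:.3f} bits/symbol")   # ~2.363 = 2*H(S)
```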

5 (a)

P (Y = 0) = P(Y = 0 | X = 0) P( X = 0) + P(Y = 0 | X = 1) P( X = 1) = 0.2625


P (Y = 1) = 0.7375
(b) H(X) = -Σ_{i=0}^{1} P(X = i) log2 P(X = i) = 0.811 bits/symbol

H(Y) = -Σ_{i=0}^{1} P(Y = i) log2 P(Y = i) = 0.831 bits/symbol

H(Y|X) = Σ_{i=0}^{1} P(X = i) H(Y | X = i) = 0.555 bits/symbol

H(X|Y) = Σ_{i=0}^{1} P(Y = i) H(X | Y = i) = 0.536 bits/symbol

(c) I(X;Y) = H(X) - H(X|Y) = 0.276 bits/symbol
or I(X;Y) = H(Y) - H(Y|X) = 0.277 bits/symbol
Therefore, information rate = I(X;Y) · r_s = 276 bits/second
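A Python sketch that reproduces these channel quantities from the transition probabilities used in part (a):

```python
from math import log2

def H(probs):
    """Entropy of a discrete distribution, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Nonsymmetric binary channel of problem 5.
pX = [0.25, 0.75]
# P(Y = j | X = i): rows are inputs, columns are outputs.
pY_given_X = [[0.75, 0.25],
              [0.10, 0.90]]

# Output distribution P(Y = j) = sum_i P(X = i) P(Y = j | X = i).
pY = [sum(pX[i] * pY_given_X[i][j] for i in range(2)) for j in range(2)]

HY_given_X = sum(pX[i] * H(pY_given_X[i]) for i in range(2))
I = H(pY) - HY_given_X          # mutual information, bits/symbol
HX_given_Y = H(pX) - I          # H(X|Y) = H(X) - I(X;Y)

print(pY)                                        # [0.2625, 0.7375]
print(H(pX), H(pY), HY_given_X, HX_given_Y, I)   # 0.811 0.831 0.555 0.536 0.276
print(1000 * I, "bits/second")                   # r_s = 1000 symbols/sec -> ~276
```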

6.
P(Y = 1) = P(Y = 1 | X = 1) P(X = 1) + P(Y = 1 | X = 2) P(X = 2) + P(Y = 1 | X = 3) P(X = 3) = ...
P(Y = 2) = P(Y = 2 | X = 1) P(X = 1) + P(Y = 2 | X = 2) P(X = 2) + P(Y = 2 | X = 3) P(X = 3) = ...
P(Y = 3) = P(Y = 3 | X = 1) P(X = 1) + P(Y = 3 | X = 2) P(X = 2) + P(Y = 3 | X = 3) P(X = 3) = ...

H(Y) = -Σ_{i=1}^{3} P(Y = i) log2 P(Y = i) = ...

H(Y|X) = Σ_{i=1}^{3} P(X = i) H(Y | X = i) = ...

I(X;Y) = H(Y) - H(Y|X) = ... bits/symbol
Information rate = r_s · I(X;Y) = ... bits/second
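A Python sketch for this calculation; the transition matrix below is only an assumed reading of Fig. 2 (diagonal entries 0.8, with the remaining 0.2 and 0.1 placed on neighbouring outputs) and should be checked against the figure before relying on the numbers:

```python
from math import log2

def H(probs):
    """Entropy of a discrete distribution, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Discrete channel of problem 6; pY_given_X is an assumption read off Fig. 2.
pX = [1/3, 1/3, 1/3]
pY_given_X = [[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]]

pY = [sum(pX[i] * pY_given_X[i][j] for i in range(3)) for j in range(3)]
HY_given_X = sum(pX[i] * H(pY_given_X[i]) for i in range(3))
I = H(pY) - HY_given_X

print(f"I(X;Y) = {I:.3f} bits/symbol")
print(f"Rate   = {1000 * I:.0f} bits/second")   # r_s = 1000 symbols/sec
```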

7.

The cascade of two BSCs, each with transition probability p, is equivalent to a single BSC with transition probability

p_eq = p(1 - p) + (1 - p)p = 2p(1 - p).

Using the capacity result for the BSC, C = 1 - H(p_eq) = 1 + p_eq log2 p_eq + (1 - p_eq) log2(1 - p_eq) bits per channel use.
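A small Python helper for this result; the value p = 0.1 in the usage line is only an illustrative choice, since the figure giving p is not reproduced here:

```python
from math import log2

def Hb(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def cascaded_bsc_capacity(p):
    # Two cascaded BSCs with crossover p act as one BSC with
    # crossover p_eq = p(1-p) + (1-p)p = 2p(1-p).
    p_eq = 2 * p * (1 - p)
    return 1 - Hb(p_eq)

# Illustrative value only; take p from the problem's figure.
print(cascaded_bsc_capacity(0.1))   # capacity in bits per channel use
```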

8. (a) Sampling rate f_s = 2.5 × (2 × 4 kHz) = 20 kHz

Source entropy H(X) = -Σ_{i=1}^{256} p_i log2 p_i with p_i = 1/256, so H(X) = log2 256 = 8 bits/symbol
Information rate = f_s · H(X) = 160 kbits/second

(b) Channel capacity C = B log2(1 + S/N) = 50k × log2(1 + 199.5) ≈ 382 kbps
As the channel capacity exceeds the information rate, the output of this source can be transmitted without errors.

(c) For an S/N ratio of 10 dB, S/N = 10 and C = 50k × log2(1 + 10) ≈ 173 kbps, which still exceeds the 160 kbps information rate, so error-free transmission remains possible.
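A Python sketch comparing the information rate with the channel capacity at both S/N values:

```python
from math import log2

# Problem 8: information rate vs. channel capacity.
fs = 2.5 * 2 * 4e3          # sampling rate: 2.5 x Nyquist rate (8 kHz) = 20 kHz
H = log2(256)               # 8 bits/sample (256 equally likely levels)
info_rate = fs * H          # 160 kbits/s

B = 50e3                    # channel bandwidth, Hz
for snr_db in (23, 10):
    snr = 10 ** (snr_db / 10)
    C = B * log2(1 + snr)
    print(f"S/N = {snr_db} dB: C = {C/1e3:.0f} kbps, "
          f"error-free possible: {C > info_rate}")
```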
9. Source entropy H = -Σ_{i=1}^{10} p_i log2 p_i = log2 10 = 3.32 bits/symbol

Information rate = 3 × 10^5 × H × 30 ≈ 29.9 Mbps

Setting the channel capacity C equal to the information rate:
29.9 × 10^6 = B log2(1 + 1000)
B ≈ 3 MHz
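A Python check of the bandwidth calculation:

```python
from math import log2

# Problem 9: minimum bandwidth for the TV signal from the capacity theorem.
pixels_per_frame = 3e5
levels = 10
frames_per_sec = 30
snr = 10 ** (30 / 10)                   # 30 dB -> S/N = 1000

info_rate = pixels_per_frame * log2(levels) * frames_per_sec   # ~29.9 Mbps
B_min = info_rate / log2(1 + snr)       # from C = B log2(1 + S/N)

print(f"Information rate  = {info_rate/1e6:.1f} Mbps")
print(f"Minimum bandwidth = {B_min/1e6:.2f} MHz")              # ~3 MHz
```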
