
QUESTION BANK UNIT 1

1. A source is emitting symbols x1, x2 and x3 with probabilities 0.6, 0.3 and 0.1 respectively. What is the entropy of the source? (See the worked sketch after question 2.)

2. Explain the following:
a) What is entropy?
b) What is a discrete memoryless source?
c) What is the amount of information?
d) What is meant by one bit?
e) What is the information rate?
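A minimal Python sketch for question 1, assuming base-2 logarithms so the entropy comes out in bits/symbol:

    import math

    def entropy(probs):
        """Shannon entropy in bits/symbol: H = -sum(p * log2(p))."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.6, 0.3, 0.1]))   # ~1.2955 bits/symbol (question 1)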

3. An event has six possible outcomes with given probabilities. Find the entropy of the system.

4. A discrete memoryless source (DMS) X has four symbols x1, x2, x3, x4 with probabilities P(x1) = 0.4, P(x2) = 0.3, P(x3) = 0.2, P(x4) = 0.1.
i. Calculate H(X).
ii. Find the amount of information contained in the messages x1 x2 x1 x3 and x4 x3 x3 x2, and compare with the H(X) obtained in part (i).
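A sketch for question 4: the information in a specific message is the sum of the self-informations of its symbols, -log2 P(symbol):

    import math

    P = {"x1": 0.4, "x2": 0.3, "x3": 0.2, "x4": 0.1}

    def message_info(msg):
        """Total self-information of a message in bits."""
        return sum(-math.log2(P[s]) for s in msg)

    H = -sum(p * math.log2(p) for p in P.values())
    print(H)                                        # ~1.8464 bits/symbol
    print(message_info(["x1", "x2", "x1", "x3"]))   # ~6.70 bits vs 4H ~ 7.39
    print(message_info(["x4", "x3", "x3", "x2"]))   # ~9.70 bits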

5. A high-resolution black and white TV picture consists of about 2x10^6 picture elements and 16 different brightness levels. Pictures are repeated at the rate of 32 per second. All the picture elements are assumed to be independent, and all levels have equal likelihood of occurrence. Calculate the average rate of information conveyed by this TV picture source. (See the worked sketch after question 8.)

6. The probabilities of the five possible outcomes of an experiment are given as P(x1) = 1/2, P(x2) = 1/4, P(x3) = 1/8, P(x4) = P(x5) = 1/16. Determine the entropy and information rate if there are 16 outcomes per second.

7. An analog signal band-limited to 10 kHz is quantized into 8 levels of a PCM system with probabilities of 1/4, 1/5, 1/5, 1/10, 1/10, 1/20, 1/20 and 1/20 respectively. Find the entropy and rate of information.

8. An analog signal is band-limited to B Hz, sampled at the Nyquist rate, and the samples are quantized into 4 levels. The quantization levels Q1, Q2, Q3 and Q4 (messages) are assumed independent and occur with probabilities P1 = P2 = 1/8 and P3 = P4 = 3/8. Find the information rate of the source.
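A worked sketch for question 5; with 16 equally likely levels, each picture element carries log2(16) = 4 bits:

    import math

    elements = 2e6                                   # picture elements per frame
    levels = 16
    frames_per_sec = 32
    bits_per_element = math.log2(levels)             # 4 bits
    rate = elements * bits_per_element * frames_per_sec
    print(rate)                                      # 2.56e8 b/s = 256 Mb/s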

9. The model shown is an example of a source where the probability of occurrence of a symbol depends not only on the particular symbol in question, but also on the two symbols preceding it.
[State diagram for a two-symbol (A, B) second-order source; the extraction preserved only the transition probabilities 1/4, 3/4, 1/8, 7/8 and the state probabilities P1(1) = 2/18, P2(1) = 7/18, P3(1) = 7/18, P4(1) = 2/18.]
A) Draw the tree diagram for the above model.
B) Calculate the probability of getting one and two symbols.
C) Calculate the entropy of each state.
D) Find the total entropy H.
E) Find G1, G2.

10. Consider an information source modeled by a discrete stationary Markoff random process shown in the figure. Find the source entropy H and the average information content per symbol in messages containing one, two and three symbols.
[State diagram over symbols A, B, C; the extraction preserved only transition probabilities 1/4 and 3/4 and the state-probability labels p1, p2.]
A) Draw the tree diagram for the above model.
B) Calculate the probability of getting one, two and three symbols.
C) Calculate the entropy of each state.
D) Find the total entropy H.
E) Find G1, G2, G3.

11. For the Markoff source shown, calculate the information rate.

[State diagram for question 11 with three states S1, S2, S3 emitting symbols L, S, R; the extraction preserved only a transition probability of 1/2 and the state-probability labels p1, p2, p3.]

12. The state diagram of a stationary Markoff source is shown below, with P(state 1) = P(state 2) = P(state 3) = 1/3 and symbols A, B, C on the transitions. Find (i) the entropy of each state, (ii) the entropy of the source, (iii) G1, G2, and verify that G1 >= G2 >= H, the entropy of the source. (A sketch for questions 10-12 follows below.)
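A hedged Python sketch for questions 10-12. The transition matrix below is an assumption for illustration only, since the actual figures did not survive extraction; the method is the general one: H = sum over states of p_i * H_i, where H_i is the entropy of state i's outgoing transition probabilities.

    import math

    # Hypothetical 3-state transition matrix (rows sum to 1); substitute the
    # probabilities from the actual state diagram.
    P = [[0.0, 3/4, 1/4],
         [1/4, 0.0, 3/4],
         [3/4, 1/4, 0.0]]
    pi = [1/3, 1/3, 1/3]     # stationary distribution, as given in question 12

    def row_entropy(row):
        return -sum(p * math.log2(p) for p in row if p > 0)

    H = sum(p_i * row_entropy(row) for p_i, row in zip(pi, P))
    print([row_entropy(row) for row in P], H)   # each state ~0.811; H ~0.811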

13. Calculate the average information content in the English language, assuming that each of the 26 characters in the alphabet occurs with equal probability.

14. Consider a source X that produces five symbols with probabilities 1/2, 1/4, 1/8, 1/16, 1/16. Find the entropy.

15. The international Morse code uses a sequence of dots and dashes to transmit letters of the English alphabet. The dash is represented by a current pulse of duration 3 units and the dot by one of duration 1 unit. The probability of occurrence of a dot is given.
a) Calculate the information content of a dot and a dash.
b) Calculate the average information in the dot-dash code.
c) Assume that the dot lasts 1 msec, which is the same time interval as the pause between symbols. Find the average rate of information transmission.

16. Define information.
17. Define coding.
18. Find the entropy for the messages A, B, C, D, E, F, G, H.
19. Define conditional probability.
20. What is the difference between information content and average information content?

QUESTION BANK UNIT II

1. What is meant by source encoding?
2. Write the source encoding theorem.
3. Name the two source coding techniques.
4. When is the average information delivered by a source of alphabet size 2 maximum?
5. What is meant by a prefix code?
6. Write about data compaction.
7. What is channel redundancy?
8. What is mutual information?
9. Write about channel capacity.
10. Define a lossless channel.
11. Define a deterministic channel.
12. Define a noiseless channel.
13. Explain Shannon-Fano coding.
14. State the properties of mutual information.
15. What is source coding and entropy coding?
16. What is information theory?
17. What is the channel capacity of a BSC and a BEC? (See the sketch after this list.)
18. What happens when the number of coding alphabets increases?
19. What is a channel diagram and a channel matrix?
20. Write the expression for code efficiency.
21. Define bandwidth efficiency.
22. Is the transinformation of a continuous system non-negative? If so, why?
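A minimal sketch for question 17 of Unit II, using the standard results C_BSC = 1 - H(p) for crossover probability p and C_BEC = 1 - e for erasure probability e:

    import math

    def binary_entropy(p):
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):          # binary symmetric channel
        return 1 - binary_entropy(p)

    def bec_capacity(e):          # binary erasure channel
        return 1 - e

    print(bsc_capacity(0.1))      # ~0.531 bits/channel use
    print(bec_capacity(0.1))      # 0.9 bits/channel use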

23. State the Shannon-Hartley theorem (or the channel capacity theorem) for a continuous channel.
24. What is the channel capacity of a channel having infinite bandwidth?
25. Define the rate of information transmission across the channel.
26. State the channel coding theorem for a discrete memoryless channel.
27-28. A discrete memoryless source is capable of transmitting three distinct symbols m0, m1 and m2. Their respective probabilities are given. Calculate the source entropy.
29. Prove that the mutual information of a channel is symmetric, that is, I(X; Y) = I(Y; X).
30. Prove the following: I(X; Y) = H(X) + H(Y) - H(X, Y).
31. Define mutual information and write the properties of mutual information.
32. Verify the following expression: H(X, Y) = H(X|Y) + H(Y).
33. Define channel capacity and write the channel capacity for different special channels.
34. Write notes on the binary symmetric channel, discrete channel capacity, continuous channel capacity and the channel capacity theorem.
35. What is a prefix?
36. Explain prefix codes with the help of an example.
37. Derive Shannon's channel capacity theorem.
38. Give the significance of Shannon's channel capacity theorem.

39. Consider the BSC with input probability P(x1) as shown in the figure. Show that the mutual information is I(X; Y) = H(Y) + p log2 p + (1-p) log2(1-p).
[BSC diagram with crossover probability p: direct transitions with probability 1-p, crossovers with probability p.]

40. A zero-memory source emits messages x1, x2, x3 with probabilities 0.9, 0.02 and 0.08 respectively. Find the optimum binary code for this source as well as for its 2nd-order extension (i.e. N = 2). Determine the code efficiency in each case.

41. Interpret the following entropies: H(X), H(Y), H(X|Y), H(Y|X) and H(X, Y).

42. Find the channel capacity of the BSC shown in the figure.
[BSC diagram: X1 and X2 map to Y1 and Y2 with error probability Pe on the cross transitions and 1 - Pe on the direct ones.]

43. Consider the binary input/output channel shown below.
[Channel diagram over inputs X1, X2, X3 and outputs Y1, Y2, Y3; the extraction preserved only the transition probabilities 0.8, 0.2, 1, 0.3 and 0.7.]

44. Find H(X), H(Y), H(X|Y), H(Y|X) and H(X, Y).

45. A discrete memoryless source has an alphabet of seven symbols with probabilities 0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625 respectively. Compute the Huffman code for this source; calculate the entropy, average codeword length and variance of this code.
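A hedged Python sketch for questions 45 and 47: Huffman coding built with a min-heap. The symbol names s1...s7 are placeholders, not from the question.

    import heapq

    def huffman(probs):
        """Return {symbol: codeword} for a dict of symbol probabilities."""
        # Heap entries: (probability, unique tiebreaker, {symbol: partial code}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)      # two least probable groups
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c0.items()}
            merged.update({s: "1" + c for s, c in c1.items()})
            heapq.heappush(heap, (p0 + p1, count, merged))
            count += 1
        return heap[0][2]

    probs = {"s1": 0.25, "s2": 0.25, "s3": 0.125, "s4": 0.125,
             "s5": 0.125, "s6": 0.0625, "s7": 0.0625}        # question 45
    code = huffman(probs)
    avg_len = sum(probs[s] * len(code[s]) for s in probs)
    print(code, avg_len)   # avg_len = 2.625 = H here, since the probs are dyadic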

46. Explain the source coding theorem.

47. A discrete memoryless source has an alphabet of seven symbols with probabilities 0.3, 0.20, 0.16, 0.14, 0.10, 0.02, 0.08 respectively. Compute the Huffman code for this source; calculate the entropy, average codeword length and variance of this code.

48. Derive the expression for the condition of maximum entropy.

49. A zero-memory source emits messages x1, x2 with probabilities 0.8, 0.2 respectively. Find the optimum binary code for this source as well as for its 2nd- and 3rd-order extensions (i.e. N = 2 and N = 3). Determine the code efficiency in each case.

50. A zero-memory source emits messages x1, x2, x3 with probabilities 0.45, 0.35 and 0.20 respectively. Find the optimum (Huffman) binary code for this source as well as for its 2nd-order extension (i.e. N = 2). Determine the code efficiency in each case.

Unit-3

1. The generator matrix for a (6,3) block code is given below:
G = { 1 0 0 | 0 1 1
      0 1 0 | 1 0 1
      0 0 1 | 1 1 0 }
Find the code words, the weights of the code words and the minimum distance of the code.
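A sketch for question 1 of Unit-3: enumerate all 2^3 data words, encode with G, and read off the weights and the minimum distance (which for a linear code equals the minimum nonzero weight).

    from itertools import product

    G = [[1, 0, 0, 0, 1, 1],      # G = [I | P] from question 1
         [0, 1, 0, 1, 0, 1],
         [0, 0, 1, 1, 1, 0]]

    def encode(d):
        return [sum(d[i] * G[i][j] for i in range(3)) % 2 for j in range(6)]

    codewords = [encode(list(d)) for d in product([0, 1], repeat=3)]
    weights = [sum(c) for c in codewords]
    dmin = min(w for w in weights if w > 0)
    print(codewords)
    print(weights, dmin)          # dmin = 3 for this G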

2. Design a linear block code with a minimum distance of three and a message block size of eight bits.

3. A (6,3) linear block code is generated according to the generator matrix
G = { 1 0 0 | 1 0 1
      0 1 0 | 0 1 1
      0 0 1 | 1 1 0 }
For a particular code word transmitted, the received code word is 100011. Find the corresponding data word transmitted.

4. A) What are the different types of codes?
B) Explain the linear block code in detail.

5. Explain the error detection and error correction capabilities of linear block codes.

6. A system has a bandwidth of 3 kHz and an S/N ratio of 29 dB at the input to the receiver. Find:
a) the information carrying capacity;
b) the capacity of the channel if its bandwidth is doubled while the transmitted signal power remains constant.
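A worked sketch for question 6, assuming that doubling the bandwidth doubles the in-band noise power (fixed noise PSD), so the S/N ratio halves:

    import math

    B = 3000.0                           # Hz
    snr = 10 ** (29 / 10)                # 29 dB -> ~794.3
    C = B * math.log2(1 + snr)
    print(C)                             # ~28.9 kb/s, part (a)

    C2 = (2 * B) * math.log2(1 + snr / 2)
    print(C2)                            # ~51.8 kb/s, part (b)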

7. For a system, the bandwidth is 4 kHz and the S/N ratio is 14. If the bandwidth is increased to 5 kHz, find the required S/N ratio to have the same channel capacity, and find the percentage change in signal power.

8. Consider an AWGN channel with 4 kHz bandwidth and noise PSD η/2 = 10^-12 W/Hz. The signal power required at the receiver is 0.1 mW. Find the capacity of the channel.

9. Write the expression for the differential entropy of a random variable X. Find the probability density function f(x) for which H(X) is maximum.
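For reference on question 9: the differential entropy is h(X) = -∫ f(x) log2 f(x) dx. Among all densities with a fixed variance σ², the Gaussian maximizes h(X), giving h(X) = (1/2) log2(2πeσ²) bits; with a fixed finite support instead, the uniform density is the maximizer.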

10. Show that the channel capacity of an ideal AWGN channel with infinite bandwidth is given by C = 1.44 S/η b/s, where S is the average signal power and η/2 is the power spectral density of the white Gaussian noise.
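A derivation sketch for question 10: with noise power N = ηB, the capacity is C = B log2(1 + S/(ηB)). Letting B → ∞ and using (1 + x/n)^n → e^x gives C∞ = (S/η) log2 e ≈ 1.44 S/η b/s.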

11. An analog signal having 4 kHz bandwidth is sampled at 1.25 times the Nyquist rate and each sample is quantized into one of 256 equally likely levels. Assume that successive samples are statistically independent.
a) What is the information rate of this source?
b) Can the output of this source be transmitted without error over an AWGN channel with a bandwidth of 10 kHz and an S/N ratio of 20 dB?
c) Find the S/N ratio required for error-free transmission for part (b).
d) Find the bandwidth required for an AWGN channel for error-free transmission of the output of this source if the S/N ratio is 20 dB.

12. A) Show that C = {000, 111} is a linear code.
B) Show that C = {000, 001, 101} is not a linear code.
C) Show that the Hamming distance measure has the following properties:
i) d(a,b) >= 0, with equality if and only if a = b;
ii) d(a,b) = d(b,a).

13. Prove that the minimum distance of a linear block code is the smallest Hamming weight of the nonzero code vectors in the code.

14. For a (6,3) systematic linear block code, the three parity-check bits c4, c5 and c6 are formed from the following equations (where + denotes modulo-2 addition):
c4 = d1 + d3; c5 = d1 + d2 + d3; c6 = d1 + d2
a) Write down the generator matrix G.
b) Construct all possible code words.
c) Suppose that the received word is 010111. Decode this received word by finding the location of the error and the transmitted data bits. (See the sketch after question 16.)

15. A parity-check code has the parity-check matrix
H = { 1 0 1 | 1 0 0
      1 1 0 | 0 1 0
      0 1 1 | 0 0 1 }
a) Determine the generator matrix G.
b) Find the code word that begins with 101.
c) Suppose that the received word is 110110. Decode this received word.

16. Show that the syndrome S is the sum (modulo 2) of those rows of matrix H^T corresponding to the error locations in the error pattern.
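A hedged sketch for questions 14 and 16: the parity-check matrix below is rebuilt from question 14's own parity equations, and the syndrome s = rH^T picks out the erroneous bit.

    # H rows come from c4 = d1+d3, c5 = d1+d2+d3, c6 = d1+d2 (modulo 2),
    # with codewords ordered as (d1, d2, d3, c4, c5, c6).
    H = [[1, 0, 1, 1, 0, 0],
         [1, 1, 1, 0, 1, 0],
         [1, 1, 0, 0, 0, 1]]

    def syndrome(r):
        return [sum(r[j] * row[j] for j in range(6)) % 2 for row in H]

    r = [0, 1, 0, 1, 1, 1]       # received word from question 14(c)
    print(syndrome(r))           # [1, 0, 0] = 4th column of H -> error in bit 4,
                                 # corrected word 010011, data bits 010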

17. Show that all error vectors that differ by a code vector have the same syndrome.

18. A friend of yours says that he can design a system for transmitting the output of a minicomputer to a line printer operating at a speed of 30 lines/minute over a voice grade telephone line with a bandwidth of 3.5 kHz, and S/N =30 dB. Assume that the line printer needs 8 bits of data per character and prints out 80 characters per line. Would you believe him? 19. Consider an (n,k) linear block code with generator matrix G and parity-check matrix H. The (n, n-k) code generated by H is called the dual code of the (n,k) code. Show that matrix G is the parity-check matrix for the dual code.
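A quick feasibility check for question 18 (worked numbers): the printer needs 30 lines/min x 80 characters/line x 8 bits/character = 19200 bits/min = 320 b/s, while the telephone channel offers roughly C = 3500 log2(1 + 1000) ≈ 34.9 kb/s at S/N = 30 dB, far above the required rate, so the claim is believable.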

20. A code consists of code words 1101000, 0111001, 0011010, 1001011, 1011100 and 0001101. If 1101011 is received, what is the decoded code word?

Unit-4:

1. Show that the code C = {0000, 0101, 1010, 1111} is a linear cyclic code.
2. Show that the code C = {000, 100, 011, 111} is not a cyclic code.
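A small sketch for questions 1 and 2 of Unit-4: test closure under modulo-2 addition (linearity) and under cyclic shifts, with code words written as bit tuples.

    def is_linear(C):
        return all(tuple(a ^ b for a, b in zip(u, v)) in C for u in C for v in C)

    def is_cyclic(C):
        return all(w[-1:] + w[:-1] in C for w in C)

    C1 = {(0,0,0,0), (0,1,0,1), (1,0,1,0), (1,1,1,1)}       # question 1
    C2 = {(0,0,0), (1,0,0), (0,1,1), (1,1,1)}               # question 2
    print(is_linear(C1), is_cyclic(C1))   # True True -> linear cyclic code
    print(is_cyclic(C2))                  # False: shift of 100 is 010, not in C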

3. Consider the given cyclic code C = {0000, 0101, 1010, 1111}. Find the generator polynomial g(x) for C and show that every code polynomial is a multiple of g(x).
4. Show that the nonzero code polynomial of minimum degree in a cyclic code C is unique.

5. Find a generator polynomial g(x) for a (7, 4) cyclic code.
6. Consider a (7, 4) cyclic code with g(x) = 1 + x + x^3.
a) Let the data word be d = (1010). Find the corresponding code word. (See the sketch after question 7.)
b) Let the code word be c = (1100101). Find the corresponding data word.
7. Construct the syndrome evaluator table for the (7, 4) cyclic code with g(x) = 1 + x + x^3.
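A hedged sketch for question 6(a): systematic encoding by GF(2) polynomial division. Bit lists are written LSB first (coefficient of x^0 first); this ordering is a convention chosen here, not dictated by the question.

    def poly_mod(dividend, divisor):
        """Remainder of GF(2) polynomial division (bit lists, LSB first)."""
        r = dividend[:]
        for i in range(len(r) - 1, len(divisor) - 2, -1):
            if r[i]:                          # cancel the leading term at x^i
                for j, b in enumerate(divisor):
                    r[i - len(divisor) + 1 + j] ^= b
        return r[:len(divisor) - 1]

    g = [1, 1, 0, 1]               # g(x) = 1 + x + x^3
    d = [1, 0, 1, 0]               # data word d = (1010), i.e. d(x) = 1 + x^2
    shifted = [0, 0, 0] + d        # x^(n-k) * d(x)
    parity = poly_mod(shifted, g)  # remainder -> parity bits
    print(parity + d)              # [0, 0, 1, 1, 0, 1, 0]: parity then data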

8. Verify the following theorem: let C be a cyclic code with minimum distance dmin. Every error polynomial of weight less than dmin/2 has a unique syndrome polynomial.
9. Consider a (7, 4) cyclic code with g(x) = 1 + x + x^3. Let the sequence (1110011) be received. Find the data word sent.

10. Let g(x) be the generator polynomial of a cyclic code C. Find a scheme for encoding the data sequence (d0, d1, ..., dk-1) into an (n, k) systematic code C.
11. Consider a (7, 4) cyclic code with generator polynomial g(x) = 1 + x + x^3. Let the data word be d = (1010). Find the corresponding systematic code word.

12. Consider a (7, 4) cyclic code with g(x) = 1 + x + x^3. Find a generator matrix G for C and find the code word for d = (1010).

13. Consider an (n, k) cyclic code C with generator polynomial g(x). Find a procedure for constructing the (n-k) x n parity-check matrix H.

14. Find the parity-check polynomial h(x) and the parity-check matrix H for the (7, 4) cyclic code C with g(x) = 1 + x + x^3.

15. A (15, 5) linear cyclic code has a generator polynomial g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10.
a) Draw block diagrams of an encoder and a syndrome calculator for this code.
b) Find the code polynomial for the message polynomial D(x) = 1 + x^2 + x^4 (in systematic form).
c) Is V(x) = 1 + x^4 + x^6 + x^8 + x^14 a code polynomial? If not, find the syndrome of V(x).

16. The generator polynomial for a (15, 7) cyclic code is g(x) = 1 + x^4 + x^6 + x^7 + x^8.
a) Find the code vector (in systematic form) for the message polynomial D(x) = x^2 + x^3 + x^4.
b) Assume that the first and last bits of the code vector V(x) for D(x) = x^2 + x^3 + x^4 suffer transmission errors. Find the syndrome of V(x).

17. Consider a (15, 9) cyclic code generated by g(x) = 1 + x^3 + x^4 + x^5 + x^6. This code has a burst-error-correcting ability q = 3. Find the burst-correcting efficiency of this code.

18. Consider a (7, 3) cyclic code generated by g(x) = 1 + x^2 + x^3 + x^4.
a) Find the burst-error-correcting ability of this code.
b) Find the burst-error-correcting efficiency of this code.
c) Construct an interlaced code derived from the (7, 3) code given above that can correct bursts of length up to 10, using a message block size of 15 bits.

Unit-5:

1. Explain the distance properties of a convolutional encoder.
2. Explain the sequential decoding of convolutional codes.

3. Explain the decoding of Reed-Solomon codes.
4. Explain the operation of a convolutional code in both the time domain and the transform domain, with an example.

5. Explain what is meant by an interleaved convolutional code.
6. A rate-1/2, constraint length 3, binary convolutional encoder is generated by the following sequences: g(1) = (111), g(2) = (101).
i) Draw the encoder circuit.
ii) Draw the state diagram and trellis diagram. (See the sketch below.)
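A hedged sketch for question 6: a shift-register model of the rate-1/2, K = 3 encoder with g(1) = (111), g(2) = (101).

    def conv_encode(bits):
        state = [0, 0]                        # two memory elements (K = 3)
        out = []
        for b in bits:
            window = [b] + state              # current input plus past two inputs
            out.append(sum(window) % 2)                 # g(1) = (111): all taps
            out.append((window[0] + window[2]) % 2)     # g(2) = (101)
            state = [b, state[0]]             # shift-register update
        return out

    print(conv_encode([1, 0, 1, 1]))          # [1,1, 1,0, 0,0, 0,1]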

7. Write brief notes on shortened cyclic codes.
8. A (2, 1, 2) convolutional encoder is shown in the following figure [encoder figure not reproduced in this extraction]. Represent the state transition table for the given convolutional encoder.

9. Draw the state diagram and the augmented state diagram for computing the complete path enumerator function for the encoder shown in the above problem.
10. Suppose the code word r = (01, 10, 10, 11, 01, 01, 11) from the encoder shown in question 8 is received through a BSC. Show the path traced using the Viterbi algorithm and find the estimate of the transmitted code word. (See the sketch after question 15.)
11. Write short notes on: a) R-S codes; b) Golay codes.
12. Write short notes on the following: burst error correcting codes.
13. For the encoder shown [figure not reproduced in this extraction]:
i) Find the impulse response of the above encoder.
ii) Using the impulse response, determine how to get the output code word for any arbitrary input data.
14. Find the generator matrix G for the convolutional encoder shown in the above problem.
15. Determine whether the code is catastrophic or non-catastrophic for the following convolutional encoder [figure not reproduced in this extraction].
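A hedged sketch for question 10. The encoder of question 8 survives only as a missing figure, so this sketch assumes the rate-1/2, g(1) = (111), g(2) = (101) encoder from question 6 of this unit, with hard-decision Viterbi decoding under the Hamming metric.

    def viterbi(pairs):
        INF = float("inf")
        states = [(0, 0), (0, 1), (1, 0), (1, 1)]
        metric = {s: 0 if s == (0, 0) else INF for s in states}
        path = {s: [] for s in states}
        for r1, r2 in pairs:
            new_metric = {s: INF for s in states}
            new_path = {s: [] for s in states}
            for s0, s1 in states:             # candidate next state (s0, s1)
                for x in (0, 1):              # oldest bit of the previous state
                    prev = (s1, x)
                    v1 = (s0 + s1 + x) % 2    # branch output for g(1) = (111)
                    v2 = (s0 + x) % 2         # branch output for g(2) = (101)
                    m = metric[prev] + (v1 != r1) + (v2 != r2)
                    if m < new_metric[(s0, s1)]:
                        new_metric[(s0, s1)] = m
                        new_path[(s0, s1)] = path[prev] + [s0]
            metric, path = new_metric, new_path
        best = min(states, key=lambda s: metric[s])
        return path[best]                     # most likely input bit sequence

    r = [(0,1), (1,0), (1,0), (1,1), (0,1), (0,1), (1,1)]   # question 10
    print(viterbi(r))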
