
KL UNIVERSITY, ECE DEPARTMENT
SUBJECT: INFORMATION THEORY AND CODING (ELECTIVE) 09EC340
YEAR: III YEAR, I SEMESTER
QUESTION BANK

UNIT 1

Objective questions:
1. A source is emitting symbols x1, x2 and x3 with probabilities 0.6, 0.3 and 0.1 respectively. What is the entropy of the source?
2. Define entropy and information rate.
3. An event has six possible outcomes with equal probabilities. Find the entropy of the system.
4. A discrete memoryless source (DMS) X has four symbols x1, x2, x3, x4 with probabilities P(x1) = 0.4, P(x2) = 0.3, P(x3) = 0.2, P(x4) = 0.1.
   i. Calculate H(X).
   ii. Find the amount of information contained in the messages x1x2x1x3 and x4x3x3x2, and compare with the H(X) obtained in part (i).
5. A high-resolution black and white TV picture consists of about 2x10^6 picture elements and 16 different brightness levels. Pictures are repeated at the rate of 32 per second. All the picture elements are assumed to be independent, and all levels have equal likelihood of occurrence. Calculate the average rate of information conveyed by this TV picture source.
6. The probabilities of the five possible outcomes of an experiment are P(x1) = 1/2, P(x2) = 1/4, P(x3) = 1/8, P(x4) = P(x5) = 1/16. Determine the entropy and the information rate if there are 16 outcomes per second.
7. An analog signal band-limited to 10 kHz is quantized into 8 levels of a PCM system with probabilities 1/4, 1/5, 1/5, 1/10, 1/10, 1/20, 1/20 and 1/20 respectively. Find the entropy and the rate of information.
8. Calculate the average information content of the English language assuming that each of the 26 characters in the alphabet occurs with equal probability.
9. A binary source emits an independent sequence of 0's and 1's with probabilities p and (1-p) respectively. Plot the entropy of this source versus p (0 < p < 1).
10. John either drives or takes the train to work. If he drives to work, then the next day he takes the train with probability 0.2. On the other hand, if he takes the train to work, then the next day he drives with probability 0.3. Find out how often, in the long run, he drives to work.
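Several of the questions above (1, 4 and 6 in particular) reduce to evaluating H(X) = -sum(p_i log2 p_i) and the information rate R = r * H(X). A minimal sketch in Python; the function names are illustrative, not part of the syllabus:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_rate(probs, symbols_per_second):
    """Information rate R = r * H(X), in bits per second."""
    return symbols_per_second * entropy(probs)

# Question 1: entropy([0.6, 0.3, 0.1]) is about 1.295 bits/symbol.
# Question 6: 16 outcomes/s with p = (1/2, 1/4, 1/8, 1/16, 1/16)
# gives H = 1.875 bits/outcome and R = 30 bits/s.
```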

Subjective questions:
1. The international Morse code uses a sequence of dots and dashes to transmit letters of the English alphabet. The dash is represented by a current pulse of duration 3 units and the dot by a pulse of duration 1 unit. The probability of occurrence of a dot.
   a) Calculate the information content of a dot and a dash.
   b) Calculate the average information in the dot-dash code.
   c) Assume that a dot lasts 1 ms, which is the same time interval as the pause between symbols. Find the average rate of information transmission.
2. A card is drawn from a deck of playing cards.
   a) You are informed that the card you drew is a spade. How much information did you receive (in bits)?
   b) How much information do you receive if you are told that the card you drew is an ace?
   c) How much information do you receive if you are told that the card you drew is the ace of spades? Is the information content of the message "ace of spades" the sum of the information contents of the messages "spade" and "ace"?
3. For a source emitting symbols in independent sequences, show that the source entropy is maximum when the symbols occur with equal probabilities.

4. The probabilities of occurrence of the various letters of the English alphabet are given below:
   a-0.081, b-0.016, c-0.032, d-0.037, e-0.124, f-0.023, g-0.016, h-0.051, i-0.072, j-0.001, k-0.005, l-0.040, m-0.022, n-0.072, o-0.079, p-0.023, q-0.002, r-0.060, s-0.066, t-0.096, u-0.031, v-0.009, w-0.020, x-0.002, y-0.019, z-0.001
   a) Which letter conveys the maximum amount of information?
   b) Which letter conveys the minimum amount of information?
   c) What is the entropy of English text if you assume that letters are chosen independently to form words and sentences (not a realistic assumption!)?
   d) Calculate the information of all letters.

5. Use the following data to answer questions 1 to 3 below. This is a case of manipulating conditional probabilities. Assume that the probability of being male is p(M) = 0.5, and likewise for being female, p(F) = 0.5. Suppose that 20% of males are T (i.e. tall): p(T|M) = 0.2; and that 6% of females are tall: p(T|F) = 0.06. We can calculate the probability that if somebody is tall (meaning taller than 6 ft or whatever), that person is male, i.e. p(M|T).
   1. If you know that somebody is male, how much information do you gain (in bits) by learning that he is also tall?
   2. How much do you gain by learning that a female is tall?
   3. Finally, how much information do you gain from learning that a tall person is female?

6. Case: A psychologist makes the following assumptions concerning the behavior of mice subjected to a particular feeding schedule. For any particular trial, 80 percent of the mice that went right on the previous experiment will go right on this trial, and 60 percent of those mice that went left on the previous experiment will go right on this trial. Suppose 50 percent of the mice went right on the first trial. The states of the system are R (right) and L (left).
   a) Draw the stationary Markov source model for the given case. Assign letters to the transitions between states. Draw the tree diagram up to N = 3.
   b) Find the psychologist's prediction for the next two trials.
   c) When will the process stabilize?
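The mice case above and question 10 of the objective set (John's commute) are both two-state Markov chains, whose long-run behavior follows from solving pi = pi P. A small sketch under that standard model; the function name is illustrative:

```python
def two_state_stationary(p_ab, p_ba):
    """Stationary distribution (pi_A, pi_B) of a two-state Markov chain,
    where p_ab = P(next = B | now = A) and p_ba = P(next = A | now = B).
    Solves pi_A * p_ab = pi_B * p_ba together with pi_A + pi_B = 1."""
    pi_a = p_ba / (p_ab + p_ba)
    return pi_a, 1.0 - pi_a

# John: drive -> train with 0.2, train -> drive with 0.3, so in the
# long run he drives 0.3 / (0.2 + 0.3) = 60% of days.
# Mice: right -> left with 0.2, left -> right with 0.6, so the process
# stabilizes at 75% of the mice going right.
```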

7. The state diagram of a stationary Markov source is shown in the following figure.
   a) Find the entropy Hi of each state.
   b) Find the entropy of the source.
   c) Find G1 and G2 and verify that G1 ≥ G2 ≥ H.
   [State diagram: three states 1, 2, 3 with transition labels A, B, C; P1(1) = P2(1) = P3(1) = 1/3.]

8. The state diagram of a stationary Markov source is shown in the following figure.
   a) Find the entropy Hi of each state.
   b) Find the entropy of the source.
   c) Find G1 and G2 and verify that G1 ≥ G2 ≥ H.
   [State diagram: four states 1-4, labeled (AA), (AB), (BB), emitting symbols A and B with transition probabilities 1/8, 1/4, 3/4 and 7/8; stationary probabilities P1(1), P2(1), P3(1), P4(1) = 2/18, 7/18, 7/18, 2/18.]

9. The state diagram of a stationary Markov source is shown in the following figure.
   a) Find the entropy Hi of each state.
   b) Find the entropy of the source.
   c) Find G1 and G2 and verify that G1 ≥ G2 ≥ H.
   [State diagram: three states SL, S2 and SR with transition probabilities including 1/2, and stationary probabilities p1, p2, p3.]

10. Consider a DMS X with symbols xi and corresponding probabilities P(xi) = pi, i = 1, 2, ...
    a) Prove that H(X) ≤ L.
    b) Show that a code constructed in agreement with the relation log2(1/pi) ≤ ni < log2(1/pi) + 1 will satisfy H(X) ≤ L < H(X) + 1.

UNIT II

Objective questions:
1. What is meant by source encoding?
2. State the source encoding theorem.
3. What is the physical significance of mutual information?
4. Define a deterministic channel.
5. Define a noiseless channel.
6. State the properties of mutual information.
7. Draw the channel diagrams of a BSC and a BEC.
8. What is the significance of the channel matrix, and how can you obtain the channel diagram from the channel matrix?
9. Define the rate of information transmission across the channel.
10. Prove that the mutual information of a channel is symmetric, i.e. I(X; Y) = I(Y; X).

Subjective questions:

1. Prove the following theorem on mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y).
2. Verify the following expressions: H(X, Y) = H(X|Y) + H(Y) and H(X, Y) = H(Y|X) + H(X).
3. Define channel capacity and write the channel capacity expressions for different special channels.
4. Consider the BSC and show that the mutual information is I(X; Y) = H(Y) + p log p + (1 - p) log(1 - p).
5. A discrete memoryless source has an alphabet of seven symbols with probabilities 0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625 respectively. Compute the Huffman code for this source; calculate the entropy, average codeword length and variance of this code.
6. A zero-memory source emits messages x1, x2 with probabilities 0.8, 0.2 respectively. Find the optimum binary code for this source as well as for its 2nd- and 3rd-order extensions (i.e. N = 2 and N = 3). Determine the code efficiency in each case.
7. A computer executes 4 instructions that are designated by the code words {00, 01, 10, 11}. Assuming that the instructions are used independently with probabilities {, , , }, calculate the percentage by which the number of bits used for the instructions may be reduced by the use of an optimum source code. Construct a Huffman code to realize the reduction.
8. One way of generating binary code words for messages consists of arranging the messages in decreasing order of probability and dividing code words as

follows: the code word for the first message is 0. The code word for the i-th message consists of (i - 1) 1s followed by a 0. The code word for the last message consists of all 1s; the number of bits in the code word for the last message is equal to the total number of messages to be encoded.
   a) Find the code words and the average number of bits per message used if the source emits one of five messages with probabilities , , 1/8, 1/16 and 1/16. What is the coding efficiency?
   b) Is this code uniquely decipherable? That is, for every possible sequence of bits, is there only one way of interpreting the messages?
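For the Huffman questions above (5-7), the average codeword length can be checked mechanically: repeatedly merge the two least probable groups, and every symbol in a merged group gains one bit of codeword length. A sketch of this length-only computation, offered as a checking aid rather than a prescribed solution method:

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths produced by binary Huffman coding."""
    lengths = [0] * len(probs)
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Merge the two least probable groups; each member symbol's
        # codeword grows by one bit.
        p1, ids1 = heapq.heappop(heap)
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

def average_length(probs):
    """Average codeword length L = sum(p_i * n_i), in bits/symbol."""
    return sum(p * n for p, n in zip(probs, huffman_lengths(probs)))

# Question 5's source (0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625)
# is dyadic, so the average length equals the entropy: 2.625 bits.
```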

9. Consider a discrete memoryless source whose alphabet consists of K equiprobable symbols.
   a) Explain why the use of a fixed-length code for the representation of such a source is about as efficient as any code can be.
   b) What conditions have to be satisfied by K and the code-word length for the coding efficiency to be 100 percent?
10. A discrete memoryless source has an alphabet of seven symbols with probabilities 0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625 respectively. Compute the Shannon-Fano code for this source. What is its efficiency?
11. The technique used in constructing a source encoder consists of arranging the messages in decreasing order of probability and dividing the messages into two almost equally probable groups. The messages in the first group are given the bit 0 and the messages in the second group are given the bit 1. The procedure is then applied again to each group separately, and continued until no further division is possible. Using this algorithm, find the code words for six messages occurring with probabilities 1/3, 1/3, 1/6, 1/12, 1/24, 1/24.
12. Consider a discrete memoryless source with alphabet {s0, s1, s2} and statistics {0.7, 0.15, 0.15} for its output.
    a) Apply the Huffman algorithm to this source and show that the average codeword length of the Huffman code equals 1.3 bits/symbol.
    b) Let the source be extended to order two. Apply the Huffman algorithm to the resulting extended source, and show that the average codeword length of the new code equals 1.1975 bits/symbol.
    c) Compare the average codeword length calculated in part (b) with the entropy of the original source.
13. The joint probability matrix for a channel is given below. Compute H(X), H(Y), H(X,Y), H(X|Y) and H(Y|X).
    P(X,Y) = { 0  0.2  0.05 0.05
               0  0.1  0.1  0
               0  0    0.2  0.1
               0  0.1  0.05 0.05 }
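Question 13 asks for the full set of entropies from a joint probability matrix; all of them follow from the marginals, the joint distribution, and the chain rule H(X|Y) = H(X,Y) - H(Y). A sketch (the function names are illustrative):

```python
import math

def H(dist):
    """Entropy in bits of a probability vector, ignoring zero entries."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def channel_entropies(P):
    """H(X), H(Y), H(X,Y), H(X|Y), H(Y|X) from a joint matrix
    P[i][j] = p(x_i, y_j)."""
    px = [sum(row) for row in P]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*P)]      # marginal of Y (column sums)
    hxy = H([p for row in P for p in row])  # joint entropy
    # Chain rule: H(X|Y) = H(X,Y) - H(Y); H(Y|X) = H(X,Y) - H(X).
    return H(px), H(py), hxy, hxy - H(py), hxy - H(px)
```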

14. Consider the channel represented by the statistical model shown in the following figure. Write the channel matrix and compute H(Y|X).
    [Channel diagram: inputs X1, X2; outputs Y1, Y2, Y3, Y4; transition probabilities 1/3 and 1/6.]

Unit-3
Objective questions:

1. A code consists of the code words 1101000, 0111001, 0011010, 1001011, 1011100 and 0001101. If 1101011 is received, what is the decoded code word?
2. Describe how to obtain the dual code of an (n, k) linear block code.
3. Show that all error vectors that differ by a code vector have the same syndrome.
4. Show that the syndrome S is the sum (modulo 2) of those rows of the matrix H^T corresponding to the error locations in the error pattern.
5. Show that C = {000, 111} is a linear code.
6. Show that C = {000, 001, 101} is not a linear code.
7. Show that the Hamming distance measure has the following properties:
   i) d(a, b) ≥ 0, with equality if and only if a = b
   ii) d(a, b) = d(b, a)
8. Consider an AWGN channel with 4 kHz bandwidth and noise PSD η/2 = 10^-12 W/Hz. The signal power required at the receiver is 0.1 mW. Find the capacity of the channel.
9. State the Shannon-Hartley theorem.
10. Write two differences between block codes and convolutional codes.

Subjective questions:
1. The generator matrix for a (6,3) block code is given below:
   G = { 100|011
         010|101
         001|110 }
   Find the code words, the weights of the code words and the minimum distance of the code.
2. Design a linear block code with a minimum distance of three and a message block size of eight bits.
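For subjective question 1 above, the eight codewords and the minimum distance can be enumerated directly from G; since the code is linear, d_min equals the minimum weight of a nonzero codeword. A sketch, assuming the G of question 1:

```python
from itertools import product

def codewords(G):
    """All codewords of the binary linear code with generator matrix G
    (a list of rows, each a list of 0/1 bits): c = m G over GF(2)."""
    n = len(G[0])
    words = []
    for msg in product([0, 1], repeat=len(G)):
        c = [0] * n
        for bit, row in zip(msg, G):
            if bit:
                c = [a ^ b for a, b in zip(c, row)]  # mod-2 row addition
        words.append(tuple(c))
    return words

def min_distance(G):
    """For a linear code, d_min = minimum weight of a nonzero codeword."""
    return min(sum(w) for w in codewords(G) if any(w))

# (6,3) code of question 1: G = [100|011, 010|101, 001|110]
G = [[1, 0, 0, 0, 1, 1],
     [0, 1, 0, 1, 0, 1],
     [0, 0, 1, 1, 1, 0]]
# 8 codewords, d_min = 3, so the code corrects any single error.
```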

3. A (6,3) linear block code is generated according to the generator matrix
   G = { 100|101
         010|011
         001|110 }
   For a particular code word transmitted, the received code word is 100011. Find the corresponding data word transmitted.
4. Explain the error detection and error correction capabilities of linear block codes.
5. A system has a bandwidth of 3 kHz and an S/N ratio of 29 dB at the input to the receiver. Find:
   a) the information-carrying capacity;
   b) the capacity of the channel if its bandwidth is doubled while the transmitted signal power remains constant.
6. For a system, the bandwidth is 4 kHz and the S/N ratio is 14. If the bandwidth is increased to 5 kHz, find the S/N ratio required to maintain the same channel capacity, and find the percentage change in signal power.

7. Write the expression for the differential entropy of a random variable X. What is the probability density function f(x) for which H(X) is maximum, and what is the maximum value of H(X)?

8. Show that the channel capacity of an ideal AWGN channel with infinite bandwidth is given by C = 1.44 S/η (b/s), where S is the average signal power and η/2 is the power spectral density of the white Gaussian noise.

9. An analog signal having 4 kHz bandwidth is sampled at 1.25 times the Nyquist rate, and each sample is quantized into one of 256 equally likely levels. Assume that successive samples are statistically independent.
   a) What is the information rate of this source?
   b) Can the output of this source be transmitted without error over an AWGN channel with a bandwidth of 10 kHz and an S/N ratio of 20 dB?
   c) Find the S/N ratio required for error-free transmission for part (b).
   d) Find the bandwidth required for an AWGN channel for error-free transmission of the output of this source if the S/N ratio is 20 dB.
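Questions 5, 6 and 9 above all turn on the Shannon-Hartley formula C = B log2(1 + S/N). A small sketch; remember that S/N values quoted in dB must first be converted to linear ratios:

```python
import math

def capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity C = B * log2(1 + S/N), bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def db_to_linear(db):
    """Convert a power ratio in dB to a linear ratio."""
    return 10 ** (db / 10)

# Question 5: B = 3 kHz, S/N = 29 dB -> C = 3000 * log2(1 + 10**2.9),
# roughly 28.9 kb/s.
```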

10. For a (6,3) systematic linear block code, the three parity-check bits c4, c5 and c6 are formed from the following equations (where + denotes modulo-2 addition): c4 = d1 + d3; c5 = d1 + d2 + d3; c6 = d1 + d2.
    a) Write down the generator matrix G.
    b) Construct all possible code words.
    c) Suppose that the received word is 010111. Decode this received word by finding the location of the error and the transmitted data bits.

11. A parity-check code has the parity-check matrix
    H = { 101|100
          110|010
          011|001 }
    a) Determine the generator matrix G.
    b) Find the code word that begins with 101.
    c) Suppose that the received word is 110110. Decode this received word.
12. A friend of yours says that he can design a system for transmitting the output of a minicomputer to a line printer operating at a speed of 30 lines/minute over a voice-grade telephone line with a bandwidth of 3.5 kHz and S/N = 30 dB. Assume that the line printer needs 8 bits of data per character and prints out 80 characters per line. Would you believe him?
13. Consider a (7, 4) linear block code with parity-check matrix H given by
    H = { 1011:100
          1101:010
          0111:001 }
    a) Determine the generator matrix G.
    b) Construct the code words for the message bits (1100), (1111), (0101) and (1010).
    c) Introduce an error in the 5th bit of the code word corresponding to (1111), and find the error using the syndrome calculator.
14) The generator matrix of a particular (7,3) linear block code is given by
    G = { 1110:100
          0111:010
          1101:001 }

(i) Find the parity-check matrix H.
(ii) List all the code vectors.
(iii) What is the minimum distance between code vectors?
(iv) How many errors can be detected?
(v) How many errors can be corrected?

15) (i) The parity-check matrix of a (7,4) Hamming code is given as follows:
    H = { 1110:100
          0111:010
          1101:001 }
    Calculate the syndrome vector for single-bit errors.
    (ii) Define and illustrate the dual code and the repetition code.

Unit-4:

Objective questions:
1. Show that the sequence C = {0000, 0101, 1010, 1111} is a linear cyclic code.
2. Show that the sequence C = {000, 100, 011, 111} is not a cyclic code.
3. Consider the cyclic code C = {0000, 0101, 1010, 1111}. Find the generator polynomial g(x) for C and show that every code polynomial is a multiple of g(x).
4. Show that the nonzero code polynomial of minimum degree in a cyclic code C is unique.
5. Find a generator polynomial g(x) for a (7,4) cyclic code.

Subjective questions:
1. Consider a (7,4) cyclic code with g(x) = 1 + x + x^3.
   a) Let the data word be d = (1010). Find the corresponding code word.
   b) Let the code word be c = (1100101). Find the corresponding data word.
2. Construct the syndrome evaluator table for the (7,4) cyclic code with g(x) = 1 + x + x^3.
3. Consider a (7,4) cyclic code with g(x) = 1 + x + x^3. Let the sequence (1110011) be received. Find the data word sent, assuming
   a) the received word is a non-systematic code word;
   b) the received word is a systematic code word.
4. Calculate all 16 possible non-systematic code words if the generator polynomial of a (7,4) cyclic code is g(x) = 1 + x + x^3.
5. Calculate all 16 possible systematic code words if the generator polynomial of a (7,4) cyclic code is g(x) = 1 + x + x^3.
6. Consider a (7,4) cyclic code with generator polynomial g(x) = 1 + x + x^3. Let the data word be d = (1010). Find the corresponding systematic code word using an encoder circuit consisting of an (n-k)-bit shift register.
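Subjective questions 1 and 6 above amount to computing the parity bits x^(n-k) d(x) mod g(x) by GF(2) polynomial division. A sketch with coefficients stored lowest degree first; this list convention, and the resulting bit order of the codeword, are assumptions of the sketch rather than of the question:

```python
def gf2_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division (coefficient lists, lowest
    degree first): repeatedly cancel the highest set term of the dividend."""
    r = list(dividend)
    for i in range(len(r) - 1, len(divisor) - 2, -1):
        if r[i]:
            for j, g in enumerate(divisor):
                r[i - (len(divisor) - 1) + j] ^= g
    return r[:len(divisor) - 1]

def systematic_encode(data, g, n):
    """Systematic (n,k) cyclic codeword: parity = x^(n-k) d(x) mod g(x),
    emitted here as parity bits followed by the k data bits."""
    shifted = [0] * (n - len(data)) + list(data)  # multiply d(x) by x^(n-k)
    return gf2_mod(shifted, g) + list(data)

# (7,4) code, g(x) = 1 + x + x^3 -> g = [1,1,0,1]; taking d(x) = 1 + x^2
# gives data = [1,0,1,0] and codeword [0,0,1,1,0,1,0] in this convention.
```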

7. Consider a (7,4) cyclic code with g(x) = 1 + x + x^3. Find a generator matrix G for C and find the code word for d = (1010) using the generator matrix.
8. Consider an (n, k) cyclic code C with generator polynomial g(x). Describe a procedure for constructing the (n-k) x n parity-check matrix H for both systematic and non-systematic codes.
9. Find the parity-check polynomial h(x) and the parity-check matrix H for the (7,4) cyclic code C with g(x) = 1 + x + x^3.
10. A (15,5) linear cyclic code has generator polynomial g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10.
    a) Draw block diagrams of an encoder and a syndrome calculator for this code.

    b) Find the code polynomial for the message polynomial D(x) = 1 + x^2 + x^4 (in systematic form).
    c) Is V(x) = 1 + x^4 + x^6 + x^8 + x^14 a code polynomial?
11. The generator polynomial for a (15,7) cyclic code is g(x) = 1 + x^4 + x^6 + x^7 + x^8. Find the code vector (in systematic form) for the message polynomial D(x) = x^2 + x^3 + x^4.

12. The generator polynomial for a (15,7) cyclic code is g(x) = 1 + x^4 + x^6 + x^7 + x^8. Assume that the first and last bits of the code vector V(x) for D(x) = x^2 + x^3 + x^4 suffer transmission errors. Find the syndrome of V(x).

Unit-5:

Objective questions:
1. Define the term constraint length (K) in convolutional codes.
2. Define the term free distance in convolutional codes.
3. Define "a burst of length q" in burst-error-correcting codes.
4. What is the number of check bits of a q-burst-error-correcting code in terms of n and k?
5. Define the burst-correcting efficiency z of an (n, k) code.
6. Define the Reiger bound for burst-error-correcting codes.
7. If g(x) is the generator polynomial for the original code, then what is the generator polynomial for the interlaced code?
8. What is the significance of an interlaced code?
9. Define the rate efficiency of a convolutional code.
10. Define the impulse response of a convolutional encoder.

Subjective questions:

1. Consider the (31,15) Reed-Solomon code.
   a) How many bits are there in a symbol of the code?
   b) What is the block length in bits?
   c) What is the minimum distance of the code?
   d) How many symbols in error can the code correct?
2. Explain the operation of a convolutional encoder in both the time domain and the transform domain with an example.
3. A rate-1/2, constraint length 3, binary convolutional encoder is defined by the generator sequences g(1) = (111), g(2) = (101).
   i) Draw the encoder circuit.
   ii) Draw the state diagram and the trellis diagram.
4. Write brief notes on: a) shortened cyclic codes, b) R-S codes, c) Golay codes.
5. A (2,1,3) convolutional encoder is shown in the following figure.

Represent the state transition table for the given convolutional encoder.
6. Draw the state diagram and the augmented state diagram for computing the "complete path enumerator function" for the encoder shown in the following figure. How would you obtain the tree diagram from the state diagram?

7. i) Find the impulse response of the following encoder.
   ii) Using the impulse response, determine how to obtain the output code word for the input data d = (101).
   iii) Find the generator matrix G for obtaining the code word for d = (1010).
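For questions 3 and 7, the encoder output can be generated directly from the generator sequences, and the impulse response is just the encoding of a single 1 followed by zeros. A sketch for the rate-1/2, K = 3 encoder with g(1) = (111), g(2) = (101) from question 3 (question 7's own figure is not reproduced here):

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Rate-1/2 convolutional encoder, constraint length 3: each input bit
    yields two output bits, the modulo-2 inner products of (current bit,
    two previous bits) with the generator sequences g1 and g2."""
    state = [0, 0]                       # two shift-register stages
    out = []
    for b in bits:
        window = [b] + state
        out.append(sum(w * g for w, g in zip(window, g1)) % 2)
        out.append(sum(w * g for w, g in zip(window, g2)) % 2)
        state = [b, state[0]]            # shift the register
    return out

# Impulse response (input 1 0 0): output pairs 11, 10, 11 -- the
# generator sequences interleaved, as expected.
```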

8. Determine whether the code produced by the following convolutional encoder is catastrophic or non-catastrophic.

9. a) Describe the significance and formation of burst- and random-error-correcting codes.
   b) Consider a (15,9) cyclic code generated by g(x) = 1 + x^3 + x^4 + x^5 + x^6. This code has a burst-error-correcting ability q = 3. Find the burst-correcting efficiency of this code.
10. Consider a (7,3) cyclic code generated by g(x) = 1 + x^2 + x^3 + x^4.
    a) Find the burst-error-correcting ability of this code.
    b) Find the burst-error-correcting efficiency of this code.
    c) Construct an interlaced code derived from the (7,3) code that can correct bursts of length up to 10, using a message block size of 15 bits.
