
Grow More Faculty of Engineering

GTU Question Bank


Sub: DCR (2161603) Branch: 6TH IT
UNIT-1

Q-1. What is data compression? Differentiate between lossless compression techniques and lossy
compression techniques.
[MAY 2017-1 MARKS] [NOV 2017-3 MARKS] [MAY 2019-3 MARKS] [NOV 2018-4 MARKS]
Q.2. How are modelling and coding useful for the development of data compression
algorithms? [NOV 2017-4 MARKS] [DEC 2019-3 MARKS]
Q.3. Explain types of data compression. List out applications of data retrieval.
[OCT 2016-7 MARKS] [MAY 2018- 4 MARKS]
OR
State the models for lossless compression and explain any one in detail.

Q.4. Explain modeling and coding. Explain how these help to reduce the entropy of the following
data. [MAY 2016-7 MARKS] [MAY 2017-7 MARKS]
9,11,11,11,14,13,15,17,16,17,20,21

Q-5. Given an alphabet A = {a1, a2, a3, a4}, find the first order entropy in the following
case: P(a1) = 0.505, P(a2) = ¼, P(a3) = 1/8 and P(a4) =0.12.
[MAY 2018-3 MARKS]
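For questions like Q-5, the first-order entropy follows directly from H(S) = −Σ P(ai) log2 P(ai). A minimal Python sketch using the probabilities given above (which sum to 1):

```python
import math

# First-order entropy: H = -sum p_i * log2(p_i)
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# probabilities from Q-5: 0.505, 1/4, 1/8, 0.12
probs = [0.505, 0.25, 0.125, 0.12]
H = entropy(probs)
print(f"H = {H:.4f} bits/symbol")
```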

UNIT-2

Q.1. Explain different types of models in data compression.


[NOV 2017-7 MARKS] [MAY 2019-4 MARKS] [NOV 2018-4 MARKS]
1. Physical model
2. Probability model [DEC 2019-4 MARKS]
3. Markov model [NOV 2017-7 MARKS] [OCT 2016-7 MARKS]
4. Composite model
Q-2. What are uniquely decodable codes? Explain with an example.
[MAY 2017-4 MARKS] [MAY 2019-3 MARKS] [NOV 2018-3 MARKS] [DEC 2019-7
MARKS]

OR
Write a short note on uniquely decodable codes.

Q-3. What is a uniquely decodable code? Determine whether the following codes are
uniquely decodable or not. [MAY 2018-3 MARKS]

1. {0,01,11,111}
2. {0,01,110,111}
3. {0,10,110,111}
4. {1,10,110,111}
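Unique decodability of code sets like the four above can be checked mechanically with the Sardinas-Patterson test: repeatedly collect "dangling suffixes" and declare the code not uniquely decodable if any dangling suffix is itself a codeword. A Python sketch (function names are mine):

```python
def dangling_suffixes(code):
    """All dangling suffixes produced by the Sardinas-Patterson procedure."""
    code = set(code)
    suffixes = set()
    # initial suffixes: w such that u = v + w for distinct codewords u, v
    new = {u[len(v):] for u in code for v in code
           if u != v and u.startswith(v)}
    while new - suffixes:
        suffixes |= new
        new = set()
        for s in suffixes:
            for c in code:
                if c.startswith(s) and c != s:
                    new.add(c[len(s):])
                if s.startswith(c) and c != s:
                    new.add(s[len(c):])
    return suffixes

def uniquely_decodable(code):
    # uniquely decodable iff no dangling suffix is itself a codeword
    return not (dangling_suffixes(code) & set(code))
```

Of the four sets in Q-3, only {0, 10, 110, 111} passes the test (it is in fact a prefix code); for example 0111 in the first set decodes both as 0·111 and as 01·11.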
Q.4. Explain prefix codes and instantaneous codes with examples.
[MAY 2017-3 MARKS] [NOV 2017-3 MARKS] [MAY 2019-3 MARKS] [NOV 2018-3
MARKS] [DEC 2019-7 MARKS]

Q.5. How to measure the performance of multiple Data Compression algorithms?


Explain parameters to select one algorithm out of many. [NOV 2018-7 MARKS]

UNIT-3

Q.1. Write a procedure to generate adaptive Huffman code.


[MAY 2017-3MARKS] [MAY 2018-7 MARKS]
OR
Explain the update procedure of adaptive Huffman coding with a suitable example.

Q.2. Explain Huffman Coding with respect to minimum variance Huffman codes with
separate trees.
[MAY 2017-7 MARKS] [NOV 2017-7 MARKS] [MAY 2016-7 MARKS]
[MAY 2018- 4 MARKS] [NOV 2018-7 MARKS]
OR
Explain Huffman coding with suitable example.
OR
Determine the minimum variance Huffman code with the given probabilities.
P(a1) = 0.2, P(a2) = 0.4, P(a3) = 0.2, P(a4) = 0.1 and P(a5) = 0.1
OR
Design a minimum variance Huffman code for a source that puts out letters from
an alphabet A={a1, a2, a3, a4, a5, a6} with P(a1)=P(a2)=0.2, P(a3)=0.25,
P(a4)=0.05, P(a5)=0.15, P(a6)=0.15. Find the entropy of the source, the average
length of the code and the efficiency. Also comment on the difference between a
Huffman code and a minimum variance Huffman code.
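A quick way to check answers to Huffman design questions like the one above: every Huffman tree for a given distribution has the same average length (minimum variance only changes how ties are broken, and hence the length distribution), so a standard heap-based construction verifies the expected entropy, average length and efficiency. A Python sketch:

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Codeword lengths from a standard Huffman construction.

    Any valid Huffman tree (minimum-variance or not) yields the same
    average length; only individual codeword lengths can differ.
    """
    count = itertools.count()          # tie-breaker for equal probabilities
    heap = [(p, next(count), {sym: 0}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        # merging two subtrees pushes every contained symbol one level deeper
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(count), merged))
    return heap[0][2]

probs = {'a1': 0.2, 'a2': 0.2, 'a3': 0.25, 'a4': 0.05, 'a5': 0.15, 'a6': 0.15}
lengths = huffman_lengths(probs)
avg = sum(probs[s] * lengths[s] for s in probs)
H = -sum(p * math.log2(p) for p in probs.values())
print(avg, H, H / avg)
```

For the alphabet above this gives an average length of 2.55 bits/symbol against an entropy of about 2.466 bits/symbol, i.e. an efficiency of roughly 96.7%.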

Q.3. Write different applications of Huffman Coding. [MAY 2017-3 MARKS]

Q.4. Encode “acadebaa” using adaptive Huffman coding. Derive the codes and the final tree.
[MAY 2017-7 MARKS] [MAY 2018-3 MARKS] [NOV 2018-7 MARKS]
OR
Consider a source containing 26 distinct symbols [A-Z]. Encode given
sequence of symbols using Adaptive Huffman algorithm.
Symbol Sequence: MUMMY
Q.5. Generate GOLOMB code for m=5 and n=4 to 10.
[MAY 2017-4 MARKS] [NOV 2017-4 MARKS] [MAY 2016-7 MARKS] [DEC 2019-3
MARKS]
OR
Generate GOLOMB code for m=5 and n=0 to 10.
OR
Generate GOLOMB code for m=9 and n=8 to 13.
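The Golomb codes asked for in Q.5 concatenate a unary-coded quotient with a truncated-binary remainder. A sketch, assuming the common "q ones followed by a zero" unary convention (textbooks vary between 0s and 1s, so individual bits may be complemented relative to a given answer key):

```python
import math

def golomb(n, m):
    """Golomb code for n with parameter m: unary quotient + truncated-binary remainder."""
    q, r = divmod(n, m)
    unary = '1' * q + '0'              # assumed convention: q ones, then a zero
    b = math.ceil(math.log2(m))
    cutoff = (1 << b) - m              # first 2**b - m remainders use b-1 bits
    if r < cutoff:
        rem = format(r, f'0{b - 1}b') if b > 1 else ''
    else:
        rem = format(r + cutoff, f'0{b}b')
    return unary + rem

# the m=5, n=4..10 variant from the question
for n in range(4, 11):
    print(n, golomb(n, 5))
```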

Q.6. Generate TUNSTALL code for P(A)=0.4, P(B)=0.3, P(C)=0.3 and n=3 bits.
[MAY 2017-4 MARKS] [NOV 2017-4 MARKS] [MAY 2019-7 MARKS] [OCT 2016-7
MARKS] [MAY 2016-7 MARKS] [MAY 2018-7 MARKS] [DEC 2019-3 MARKS]
OR
Write a short note on TUNSTALL CODE.
OR
Explain TUNSTALL CODE with suitable example.
OR
Write procedure to generate TUNSTALL code. Generate TUNSTALL code with
probability of P(A)=0.6, P(B)=0.3, P(C)=0.1 and n=3 bits.
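The Tunstall procedure the questions above ask for maps variable-length source strings to fixed-length n-bit codewords: starting from the alphabet, repeatedly expand the most probable entry into its extensions while room remains among the 2^n codewords. A Python sketch of that procedure (helper name is mine):

```python
import heapq

def tunstall(probs, nbits):
    """Tunstall codebook: fixed-length nbits codes for variable-length strings."""
    K = len(probs)
    # max-heap on probability via negated values
    heap = [(-p, s) for s, p in probs.items()]
    heapq.heapify(heap)
    size = K
    # each expansion replaces one entry with K entries (net gain K-1)
    while size + K - 1 <= (1 << nbits):
        p, s = heapq.heappop(heap)          # most probable entry
        for sym, ps in probs.items():
            heapq.heappush(heap, (p * ps, s + sym))
        size += K - 1
    entries = sorted(s for _, s in heap)
    return {s: format(i, f'0{nbits}b') for i, s in enumerate(entries)}

book = tunstall({'A': 0.4, 'B': 0.3, 'C': 0.3}, 3)
print(book)
```

For the P(A)=0.4, P(B)=0.3, P(C)=0.3, n=3 case this yields 7 of the 8 possible 3-bit codewords in use.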
Q.7. Explain Rice code in brief. [NOV 2017-3 MARKS]

Q.8. Draw encoding and decoding procedure flowchart of adaptive Huffman coding.
[MAY 2019-7 MARKS] [NOV 2018-7 MARKS]

Q.9. Consider a source that emits letters from an alphabet A={a1,a2,a3,a4} with probabilities
P(a1)=0.3, P(a2)=0.2, P(a3)=0.35, P(a4)=0.15. [OCT 2016-7 MARKS]
[I] Find a Huffman code using minimum variance procedure.
[II] Find average length of the code.

Q.10. An alphabet S = {a1, a2, a3, a4, a5} has symbols with probabilities P(a1)=0.4,
P(a2)=0.3, P(a3)=0.2, P(a4)=0.09 and P(a5)=0.01. Find the Huffman code,
source entropy, average length and compression ratio.
[NOV 2018-4 MARKS] [DEC 2019-7 MARKS]
OR
Design a minimum variance Huffman code for a source that puts out letters from an
alphabet A={a1,a2,a3,a4,a5,a6} with P(a1)=P(a2)=0.2,
P(a3)=0.25, P(a4)=0.05, P(a5)=0.15, P(a6)=0.15. Find the entropy of the source, the
average length of the code and the efficiency.

Q.11. How does extended Huffman coding reduce the average code length? Prove this using
alphabet A={a1,a2,a3} with probabilities 0.95, 0.03 and 0.02 respectively.
[NOV 2018-7 MARKS]

UNIT-4

Q.1. Define Arithmetic Coding. Encode and Decode “AABBC” with arithmetic coding.
(P(A)=0.6, P(B)=0.3, P(C)=0.1)
[MAY 2017-7 MARKS] [NOV 2017-7 MARKS] [MAY 2016-7 MARKS] [NOV 2018-4
MARKS]
OR
Encode and Decode “BACBA” with arithmetic coding. (P(A)=0.5,P(B)=0.3,P(C)=0.2)
OR
Given source with probabilities of symbols as P(A)=0.45 P(B)=0.25, P(C)=0.15,
P(D)=0.15. Perform encoding of string "BCADB" using arithmetic coding and generate
tag. [NOV 2018-4 MARKS]
OR
Using given probabilities P(A)=0.2, P(B)=0.2, P(C)=0.2, P(D)=0.4, decode the
tag 0.14496 for at least five symbols.
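The tag generation these questions exercise can be checked with a floating-point sketch of the interval-narrowing step (a practical arithmetic coder works with scaled integers to avoid precision loss; the function name is mine):

```python
def arithmetic_tag(sequence, probs):
    """Final interval and mid-point tag for a sequence (floating-point sketch)."""
    # cumulative distribution: symbol -> [low, high)
    cdf, acc = {}, 0.0
    for sym, p in probs.items():
        cdf[sym] = (acc, acc + p)
        acc += p
    low, high = 0.0, 1.0
    for sym in sequence:
        # narrow the current interval to the symbol's sub-interval
        width = high - low
        lo, hi = cdf[sym]
        low, high = low + width * lo, low + width * hi
    return low, high, (low + high) / 2

low, high, tag = arithmetic_tag("AABBC", {'A': 0.6, 'B': 0.3, 'C': 0.1})
print(low, high, tag)
```

For "AABBC" with the first probability model above, the final interval is [0.30996, 0.3132) and the mid-point tag is 0.31158.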

Q.2. Compare Arithmetic coding with Huffman coding.


[MAY 2019-4 MARKS] [NOV 2018-3 MARKS]
Q.3. Explain Uniqueness and efficiency of the arithmetic code. [MAY 2019-3 MARKS]

Q.4. Write the method to generate a tag in arithmetic coding.


[MAY 2019-4 MARKS] [OCT 2016-7 MARKS]

Q.5. Write an encoding algorithm for arithmetic coding with example.


[OCT 2016-7 MARKS] [DEC 2019-7 MARKS]

Q.6. The probability model is given by P(a1) = 0.2, P(a2) = 0.3 and P(a3) = 0.5. Find the
real valued tag for the sequence a1a1 a3 a2 a3 a1. (Assume cumulative probability
function: F(0) = 0) [MAY 2018-7 MARKS]

Q.7. Write pseudocode for integer arithmetic encoding and decoding algorithm.
[MAY 2018-7 MARKS]

UNIT-5

Q.1. Encode the following sequence using digram coding of the static dictionary method
(generate for 3 bits): abracadabra [MAY 2017-7 MARKS] [MAY 2019-3 MARKS]
OR
Explain digram coding with an example.
Q.2. Given an initial dictionary Index 1=w, 2=a, 3=b, encode the following
message using the LZ78 algorithm: [MAY 2017-7 MARKS]
wabbaḇwabbaḇwabbaḇwabbaḇwooḇwooḇwoo.

Q.3. List out the different techniques for lossless compression and explain LZ77 with example.
[NOV 2017-7 MARKS]

Q.4. List out the different techniques for lossless compression and explain LZW with example.
[NOV 2017-7 MARKS]

Q.5. Given an initial dictionary index 1=w, 2=a, 3=b, 4=$, 5=o encode and decode the
following message using LZW algorithm: wabba$woo
[MAY 2019-7 MARKS] [NOV 2018-7 MARKS] [DEC 2019-4 MARKS]
OR
Encode and Decode following sequence using LZW Coding technique.
Sequence: ABABABAB
OR

Write limitation, advantages and disadvantages of LZW method.
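A compact LZW sketch that reproduces the wabba$woo exercise above with the given initial dictionary (1=w, 2=a, 3=b, 4=$, 5=o); the decoder includes the usual special case for a code not yet in its dictionary:

```python
def lzw_encode(text, init):
    dictionary = dict(init)            # string -> index
    nxt = max(dictionary.values()) + 1
    out, s = [], ''
    for c in text:
        if s + c in dictionary:
            s += c                     # keep extending the current match
        else:
            out.append(dictionary[s])
            dictionary[s + c] = nxt    # add the new phrase
            nxt += 1
            s = c
    if s:
        out.append(dictionary[s])
    return out

def lzw_decode(codes, init):
    inv = {i: s for s, i in init.items()}
    nxt = max(inv) + 1
    prev = inv[codes[0]]
    out = [prev]
    for code in codes[1:]:
        # special case: the code may refer to the entry being built right now
        entry = inv[code] if code in inv else prev + prev[0]
        out.append(entry)
        inv[nxt] = prev + entry[0]
        nxt += 1
        prev = entry
    return ''.join(out)

init = {'w': 1, 'a': 2, 'b': 3, '$': 4, 'o': 5}
codes = lzw_encode('wabba$woo', init)
print(codes)
```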

Q.6. Encode the following sequence cabracadabrarrarrad using the LZ77 method. Assume a
window size of 13 and a look-ahead buffer of size 6.
[MAY 2019-7 MARKS] [OCT 2019-7 MARKS] [MAY 2016-7 MARKS] [MAY 2018-7
MARKS] [NOV 2018-3 MARKS] [DEC 2019-7 MARKS]
OR
A sequence is encoded using the LZ77 algorithm. Given that C(a) = 1, C(b) = 2,
C(r) = 3, and C(t)= 4, decode the following sequence of triples:
<0, 0, 3>,< 0, 0, 1>,<0, 0, 4>,< 2, 8, 2>,< 3, 1, 2>,<0, 0, 3>,<6, 4, 4>,<9,5, 4>
Assume that the size of the window is 20 and the size of the look-ahead buffer is
10. Encode the decoded sequence and make sure you get the same sequence of
triples.
OR

Encode the following sequence using the LZ77 and LZ78 algorithm:
ḇarrayarḇbarḇbyḇbarrayarḇba. Assume you have a window size of 30 with a look-ahead buffer
of size 15. Furthermore assume that C(a)=1, C(b)=2, C(ḇ)=3, C(r)=4, and C(y)=5. *
OR
Explain the process of generating triples in all three possible cases of the LZ77 algorithm.
OR

Apply the LZ77 method to encode the following input string: cabracadabrarrarrad. Assume
window size = 13, search buffer size = 7, look-ahead buffer size = 6.
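The triple generation the LZ77 questions above exercise can be sketched as follows, with offsets measured back from the current position; the default buffer sizes match the search buffer of 7 and look-ahead of 6 from the last variant:

```python
def lz77_encode(data, search_size=7, lookahead_size=6):
    """LZ77 triples (offset, length, next symbol); sketch implementation."""
    i, triples = 0, []
    while i < len(data):
        start = max(0, i - search_size)
        best_off, best_len = 0, 0
        for j in range(start, i):
            length = 0
            # a match may run from the search buffer into the look-ahead buffer
            while (length < lookahead_size
                   and i + length < len(data)
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_off, best_len = i - j, length
        nxt = data[i + best_len] if i + best_len < len(data) else ''
        triples.append((best_off, best_len, nxt))
        i += best_len + 1
    return triples

def lz77_decode(triples):
    out = []
    for off, length, nxt in triples:
        for _ in range(length):
            out.append(out[-off])      # copy, handling overlapping matches
        if nxt:
            out.append(nxt)
    return ''.join(out)

triples = lz77_encode('cabracadabrarrarrad')
print(triples)
```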

Q.7. Explain LZ78 encoding procedure.


[OCT 2019-7 MARKS] [MAY 2018-7 MARKS] [NOV 2018-4 MARKS]

Q.8. Explain LZW method with example.


[OCT 2019-7 MARKS] [MAY 2016-7 MARKS] [NOV 2018-4 MARKS]
OR

Given an initial dictionary consisting of the letters a b r y ḇ, encode the following message using
the LZW algorithm: aḇbarḇarrayḇbyḇbarrayarḇbay.
Q.9. Compare & Contrast Static Dictionary Based Algorithm vs. Dynamic Dictionary Based
Algorithm. [DEC 2019-4 MARKS]

Q.10. What is difference (s) between LZ77 and LZ78 method? [DEC 2019-4 MARKS]

UNIT-6

Q.1. Explain OLD JPEG Standard and JPEG-LS.


[MAY 2017-3 MARKS] [MAY 2019-7 MARKS] [MAY 2016-7 MARKS] [DEC 2019-4
MARKS]

Q.2 Explain Prediction with Partial Match (PPM) in detail, and encode any string using PPM.
[MAY 2017-3 MARKS] [MAY 2019-4 MARKS] [OCT 2016-7MARKS] [NOV 2018-7
MARKS] [DEC 2019-7 MARKS]
Q.3. Encode the sequence thisḇisḇthe using the Burrows-Wheeler transform and move-to-front
coding. [MAY 2017-7 MARKS]

Q.4. Explain information retrieval in detail. [MAY 2017-4 MARKS]

Q.5. Explain CALIC.


[NOV 2017-7 MARKS] [MAY 2019-3 MARKS] [OCT 2016-7 MARKS] [DEC 2019-7
MARKS]
Q.6. Explain Facsimile Encoding and Exclusion principle in detail.
[MAY 2019-4 MARKS]

Q.7. Encode the sequence etaḇcetaḇandḇbetaḇceta using the Burrows-Wheeler transform and
move-to-front coding.*
[MAY 2016-7 MARKS]
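Both BWT questions above follow the same two-stage recipe; a small sketch using sorted rotations and a naive inverse (here '_' stands in for the ḇ blank symbol, and the '$' end-of-string marker is an implementation convenience):

```python
def bwt(s, eof='$'):
    """Burrows-Wheeler transform via sorted rotations (assumes eof not in s)."""
    s = s + eof
    table = sorted(s[i:] + s[:i] for i in range(len(s)))
    return ''.join(row[-1] for row in table)

def ibwt(last, eof='$'):
    """Naive inverse BWT: rebuild the rotation table column by column."""
    table = [''] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    row = next(r for r in table if r.endswith(eof))
    return row[:-1]

def mtf_encode(s, alphabet):
    order, out = list(alphabet), []
    for c in s:
        i = order.index(c)
        out.append(i)
        order.insert(0, order.pop(i))   # move the used symbol to the front
    return out

text = 'this_is_the'
L = bwt(text)
ranks = mtf_encode(L, sorted(set(L)))
print(L, ranks)
```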

Q.8. Explain and compare the incidence matrix and the inverted index with examples.
[MAY 2016-7 MARKS]

Q.9.Explain usage of discrete cosine transform (DCT) in JPEG. [MAY 2018-7 MARKS]
[NOV 2018-3 MARKS]

Q.10. What is significance of Quantization and Zigzag Coding in JPEG Compression?


[NOV 2018-4 MARKS]

Q.11. Draw and Explain Block diagram for Baseline JPEG Algorithm. [NOV 2018-4 MARKS]

UNIT-7

Q.1. Compare uniform quantization with non-uniform quantization.


[NOV 2017-3 MARKS] [MAY 2019-4 MARKS] [MAY 2018-3 MARKS] [NOV 2018-3
MARKS] [DEC 2019-3 MARKS]
OR
Explain nonuniform quantization.

Q.2. List out different types of quantizers and explain the quantization problem with an example.
[OCT 2016-7 MARKS]
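The quantization problem in Q.2 — mapping a continuous-amplitude input onto a finite set of reconstruction levels — can be illustrated with a midrise uniform quantizer sketch (the step size and level count here are my own example values):

```python
def midrise_uniform_quantizer(x, step, levels):
    """Midrise uniform quantizer: 'levels' intervals of width 'step', centred on zero."""
    half = levels // 2
    # quantization index, clipped to the available levels
    idx = max(-half, min(half - 1, int(x // step)))
    # reconstruction value: midpoint of the chosen interval
    return (idx + 0.5) * step

samples = [-1.3, -0.2, 0.1, 0.7, 2.9]
recon = [midrise_uniform_quantizer(x, step=0.5, levels=8) for x in samples]
print(recon)
```

Note how the out-of-range sample 2.9 is clipped to the top level (1.75): with a fixed number of levels, enlarging the step trades granular error inside the range against overload error at the edges, which is exactly the trade-off the quantization problem describes.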

Q.4. Explain adaptive quantization with its two approaches.


[OCT 2016-7 MARKS] [DEC 2019-4 MARKS]

Q-5. Explain pdf optimized quantization. [MAY 2018-3 MARKS] [DEC 2019-3 MARKS]

Q.6. Explain Sampling and Quantization of an Audio Signal [NOV 2018-3 MARKS]

UNIT-8

Q.1. Explain how Vector Quantization is better than Scalar Quantization with example.
[MAY 2017-7 MARKS] [MAY 2019-3 MARKS] [DEC 2019-3 MARKS]

Q.2. Explain scalar quantization in brief.


[NOV 2017-4 MARKS] [MAY 2019-7 MARKS] [MAY 2016-7 MARKS]

Q.3. Explain vector quantization in brief.


[NOV 2017-4 MARKS] [MAY 2019-7 MARKS] [MAY 2016-7 MARKS] [DEC 2019-7
MARKS]

Q.4. Explain pyramid vector quantization. [MAY 2019-3 MARKS] [MAY 2018-3 MARKS]
Q.5. Write a short note on tree-structured vector quantization. [OCT 2016-7 MARKS]

Q.6. Explain structured vector quantizers. [MAY 2018-3 MARKS]

Q.7. Explain Linde-Buzo-Gray algorithm in detail. [MAY 2018-4 MARKS]

UNIT-9

Q.1. Write a short note on skip pointer with example.


[MAY 2017-4 MARKS] [NOV 2017-3 MARKS] [MAY 2016-7 MARKS] [MAY 2018-4
MARKS]
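Skip pointers speed up postings-list intersection by letting the slower pointer jump several entries at once when the skip target does not overshoot. A sketch with the usual √(list length) skip spacing (the posting lists here are made-up doc IDs):

```python
import math

def intersect_with_skips(p1, p2):
    """Intersect two sorted postings lists using sqrt-spaced skip pointers."""
    skip1 = max(1, int(math.sqrt(len(p1))))
    skip2 = max(1, int(math.sqrt(len(p2))))
    answer, i, j = [], 0, 0
    while i < len(p1) and j < len(p2):
        if p1[i] == p2[j]:
            answer.append(p1[i])
            i += 1
            j += 1
        elif p1[i] < p2[j]:
            # take the skip pointer only if it does not overshoot the target
            if i % skip1 == 0 and i + skip1 < len(p1) and p1[i + skip1] <= p2[j]:
                i += skip1
            else:
                i += 1
        else:
            if j % skip2 == 0 and j + skip2 < len(p2) and p2[j + skip2] <= p1[i]:
                j += skip2
            else:
                j += 1
    return answer

a = [2, 4, 8, 16, 19, 23, 28, 43]
b = [1, 2, 3, 5, 8, 41, 43, 60]
print(intersect_with_skips(a, b))   # -> [2, 8, 43]
```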

Q.2. Write a short note on Phrase queries with example.


[MAY 2017-4 MARKS] [MAY 2016-7 MARKS] [MAY 2018- 4 MARKS]

Q.3. Explain Biword indexes and Positional indexes in brief.


[NOV 2017-3 MARKS] [DEC 2019-4 MARKS]

Q.4. Explain Lemmatization and Stemming in detail.


[NOV 2017-7 MARKS] [MAY 2019-7 MARKS] [MAY 2016-7 MARKS] [MAY 2018-7
MARKS] [DEC 2019-4 MARKS]

Q.5. Explain incidence matrix and inverted index with suitable example.
[MAY 2018-4 MARKS]
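The difference between the two structures in Q.5 is easy to see on a toy collection: the incidence matrix stores a 0/1 entry for every (term, document) pair, while the inverted index stores only the documents that actually contain each term. The documents below are made up for illustration:

```python
docs = {
    1: "new home sales top forecasts",
    2: "home sales rise in july",
    3: "increase in home sales in july",
}

# term-document incidence matrix: term -> 0/1 vector over doc ids
terms = sorted({w for d in docs.values() for w in d.split()})
incidence = {t: [1 if t in docs[d].split() else 0 for d in sorted(docs)]
             for t in terms}

# inverted index: term -> sorted postings list of doc ids
inverted = {t: sorted(d for d in docs if t in docs[d].split()) for t in terms}

print(incidence['sales'])
print(inverted['july'])
```

The matrix needs one cell per term per document regardless of content, while the index's storage grows only with the number of actual term occurrences, which is why inverted indexes scale to large collections.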

Q.6. Explain tokenization in detail. [MAY 2018-4 MARKS]

Q.7. Explain dropping common terms: stop words. [DEC 2019-3 MARKS]

UNIT-10

Q.1. Differentiate between Text-centric and Data-centric XML retrieval. [OCT 2016-7 MARKS]

Q.2. Write a short note on positional index and Data-centric XML retrieval [OCT 2016-7
MARKS]

Q.3. Explain the vector space model for XML retrieval. [MAY 2017-4 MARKS] [NOV 2017-4
MARKS] [NOV 2018-4 MARKS]

Q.4. Explain different challenges in XML retrieval. [NOV 2017-3 MARKS] [MAY 2019-3 MARKS]
[MAY 2016-7 MARKS] [MAY 2018-4 MARKS]
