
Course Code Course Title L T P J C

ECE4007 INFORMATION THEORY AND CODING 3 0 0 4 4


Pre-requisite ECE4001 : Digital Communication Systems

Course Objectives:
 Learn the various concepts of information sources and channel capacity
 Analyze source coding techniques such as Shannon-Fano encoding, Huffman coding and
arithmetic coding.
 Learn statistical techniques for signal detection
 Construct various channel coding schemes such as block codes, cyclic codes and
convolutional codes.

Expected Course Outcome:


The students will be able to
 Apply mathematical models that describe the behavior of an information source and channel
capacity, and the performance of source coding and channel coding techniques
 Solve mathematical problems in source coding and channel coding techniques and
implement them in MATLAB

Student Learning Outcomes (SLO): 1, 2, 18

Module:1 Introduction 4 hours SLO: 1, 2


Review of Probability Theory, Introduction to information theory

Module:2 Entropy 6 hours SLO: 1, 2


Uncertainty, self-information, average information, mutual information and their properties -
Entropy and information rate of Markov sources - Information measures of continuous random
variables.
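The information measures in this module are easy to check numerically. A minimal sketch in Python (used here for illustration; the course tooling is MATLAB, and the distributions below are made-up examples):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

# A fair binary source has maximum uncertainty (1 bit per symbol);
# a biased source carries less average information.
print(entropy([0.5, 0.5]))           # 1.0
print(entropy([0.9, 0.1]))           # ≈ 0.469
```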

Module:3 Channel Models and Capacity 5 hours SLO: 1, 2


Importance and types of various channel models - Channel capacity calculation – Binary
symmetric channel, binary erasure channel - Shannon’s channel capacity and channel coding
theorem - Shannon’s limit.
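The closed-form capacities of the two standard channels can be verified numerically; a minimal Python sketch (the crossover and erasure probabilities are arbitrary illustrative values):

```python
import numpy as np

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p <= 0 or p >= 1:
        return 0.0
    return -p*np.log2(p) - (1 - p)*np.log2(1 - p)

def bsc_capacity(p):
    """Binary symmetric channel, crossover probability p: C = 1 - H(p)."""
    return 1.0 - h2(p)

def bec_capacity(eps):
    """Binary erasure channel, erasure probability eps: C = 1 - eps."""
    return 1.0 - eps

print(bsc_capacity(0.11))   # ≈ 0.5 bit per channel use
print(bec_capacity(0.25))   # 0.75 bit per channel use
```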

Module:4 Source Coding I 6 hours SLO: 1, 18


Source coding theorem - Huffman coding - Non-binary Huffman codes - Adaptive Huffman
coding - Shannon-Fano-Elias coding - Non-binary Shannon-Fano codes
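The Huffman procedure fits compactly around a min-heap; an illustrative Python sketch (the four-symbol alphabet and its probabilities are made up for the example, not part of the syllabus):

```python
import heapq

def huffman_codes(symbol_probs):
    """Build a binary Huffman code from a dict {symbol: probability}."""
    # Each heap entry: (probability, tiebreak index, {symbol: code so far})
    heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(symbol_probs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)     # two least probable subtrees
        p2, i, c2 = heapq.heappop(heap)
        merged = {s: '0' + c for s, c in c1.items()}
        merged.update({s: '1' + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, i, merged))
    return heap[0][2]

probs = {'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1}
codes = huffman_codes(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(codes)       # e.g. {'a': '0', 'b': '10', 'd': '110', 'c': '111'}
print(avg_len)     # ≈ 1.9 bits/symbol, close to the entropy ≈ 1.846
```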

Module:5 Source Coding II 6 hours SLO: 1, 18


Arithmetic coding - Lempel-Ziv coding - Run-length encoding and rate distortion function -
Overview of transform coding.
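Run-length encoding, the simplest scheme in this module, takes only a few lines; a Python sketch with a made-up input string:

```python
from itertools import groupby

def rle_encode(data):
    """Run-length encode a string into (symbol, run length) pairs."""
    return [(sym, len(list(run))) for sym, run in groupby(data)]

def rle_decode(pairs):
    """Invert rle_encode."""
    return ''.join(sym * count for sym, count in pairs)

encoded = rle_encode("AAAABBBCCD")
print(encoded)                        # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
assert rle_decode(encoded) == "AAAABBBCCD"
```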
Module:6 Channel Coding I 8 hours SLO: 1, 18
Introduction to error control codes - Block codes, linear block codes, cyclic codes and their
properties, encoder and decoder design - Serial and parallel concatenated block codes - Convolutional
codes: properties, encoder, tree diagram, trellis diagram, state diagram, transfer function of
convolutional codes, Viterbi decoding, trellis coding, Reed-Solomon codes.
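A minimal end-to-end illustration of linear block encoding and syndrome decoding, using the standard systematic (7,4) Hamming code in Python (illustrative only; this is not the specific code assigned in the projects below):

```python
import numpy as np

# Systematic (7,4) Hamming code: G = [I | P], H = [P^T | I]
P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

def encode(msg):
    """Encode a length-4 message into a length-7 codeword."""
    return msg @ G % 2

def decode(rx):
    """Syndrome decoding: corrects up to one bit error."""
    syndrome = rx @ H.T % 2
    if syndrome.any():
        # the column of H equal to the syndrome marks the error position
        err = int(np.where((H.T == syndrome).all(axis=1))[0][0])
        rx = rx.copy()
        rx[err] ^= 1
    return rx[:4]                      # systematic: message is the first 4 bits

msg = np.array([1, 0, 1, 1])
cw = encode(msg)
cw_err = cw.copy()
cw_err[5] ^= 1                         # flip one bit in the channel
print(decode(cw_err))                  # recovers the original message
```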

Module:7 Channel Coding II 8 hours SLO: 1, 18


Serial and parallel concatenated convolutional codes, block and convolutional interleavers, turbo
encoder, iterative turbo decoder, trellis coded modulation and set partitioning - LDPC codes.

Module:8 Contemporary issues 2 hours

Total Lecture hours: 45 hours


Text Book(s)
1. Simon Haykin, “Communication Systems”, 2012, 4th Edition, Wiley India Pvt Ltd, India.
2. Ranjan Bose, “Information Theory, Coding and Cryptography”, 2015, 1st Edition, McGraw
Hill Education (India) Pvt. Ltd., India.
Reference Books
1. John G. Proakis, “Digital Communications”, 2014, 5th Edition, McGraw-Hill, McGraw Hill
Education (India) Pvt. Ltd., India.
2. Bernard Sklar and Pabitra Kumar Ray, “Digital Communications: Fundamentals and
Applications”, 2012, 1st Edition, Pearson Education, India.
3. Khalid Sayood, “Introduction to Data Compression”, Reprint: 2015, 4th Edition, Elsevier,
India.
Mode of Evaluation: Continuous Assessment Test –I (CAT-I), Continuous Assessment Test –II
(CAT-II), Digital Assignments/ Quiz / Completion of MOOC, Final Assessment Test (FAT).
Typical Projects SLO: 18
1. Efficient image compression using a modified SPIHT algorithm
2. Develop compression algorithms using the Discrete Wavelet Transform
3. Compress and decompress an image using modified Huffman coding
4. Apply run-length coding and the Huffman encoding algorithm to compress an image.
5. Adaptive Huffman coding of 2D DCT coefficients for image compression
6. Compression of an image using a chaotic map and arithmetic coding
7. Region-of-interest based lossless medical image compression
8. Write a code to build the (3, 1, 3) repetition encoder. Map the encoder output to BPSK symbols.
Transmit the symbols through AWGN channel. Investigate the error correction capability of the
(3, 1, 3) repetition code by comparing its BER performance to that without using error correction
code.
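Project 8 can be prototyped in a few lines. The sketch below uses Python/NumPy rather than MATLAB, and the 0 dB operating point, bit count, and RNG seed are illustrative assumptions. It compares coded and uncoded BER at the same per-symbol channel SNR, where the majority-vote gain is easy to see; if the comparison is instead normalized per information bit (Eb/N0), the rate-1/3 loss typically outweighs the hard-decision gain, which is itself worth investigating in the project.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits = 200_000
snr_db = 0.0                                   # per-symbol channel SNR (assumed)
sigma = np.sqrt(0.5 / 10**(snr_db / 10))       # noise std for unit-energy BPSK

bits = rng.integers(0, 2, n_bits)

# (3, 1, 3) repetition encoding, BPSK mapping (0 -> +1, 1 -> -1), AWGN channel
coded = np.repeat(bits, 3)
rx = (1 - 2*coded) + sigma*rng.normal(size=coded.size)
hard = (rx < 0).astype(int)                    # hard decisions
decoded = hard.reshape(-1, 3).sum(axis=1) >= 2 # majority-logic decoding
ber_coded = np.mean(decoded != bits)

# Uncoded BPSK over the same channel, for reference
rx_u = (1 - 2*bits) + sigma*rng.normal(size=n_bits)
ber_uncoded = np.mean((rx_u < 0).astype(int) != bits)

print(f"uncoded BER = {ber_uncoded:.4f}, (3,1,3) coded BER = {ber_coded:.4f}")
```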
9. Write a code to compare the BER performance and error correction capability of (3, 1, 3) and
(5, 1, 5) repetition codes. Assume BPSK modulation and AWGN channel. Also compare the
simulated results with the theoretical results.
10. Write a code to compare the performance of hard decision and soft decision Viterbi decoding
algorithms. Assume BPSK modulation and AWGN channel.
11. Write a code to build (8, 4, 3) block encoder and decoder. Compare the BER performance of
(8, 4, 3) block coder with (3,1,3) repetition codes. Assume BPSK modulation and AWGN
channel.
12. Consider the following Extended vehicular A channel power delay profile. Write a code to
model the given profile. Also measure the channel capacity. Compare the obtained capacity to
that without fading channel.
Delay (ns) Power (dB)
0 0
30 -1.5
150 -1.4
310 -3.6
370 -0.6
710 -9.1
1090 -7
1730 -12
2510 -16.9
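One possible starting point for project 12, sketched in Python/NumPy. The sampling rate, FFT size, operating SNR, Rayleigh tap statistics, and number of realizations below are illustrative assumptions, not part of the problem statement:

```python
import numpy as np

# Extended Vehicular A power delay profile (from the table above)
delays_ns = np.array([0, 30, 150, 310, 370, 710, 1090, 1730, 2510])
powers_db = np.array([0, -1.5, -1.4, -3.6, -0.6, -9.1, -7, -12, -16.9])

rng = np.random.default_rng(1)
gains = 10**(powers_db / 20)
gains /= np.linalg.norm(gains)        # normalize total tap power to 1

snr_db = 10.0                         # assumed operating SNR
snr = 10**(snr_db / 10)
n_fft, fs = 1024, 30.72e6             # assumed LTE-like sampling rate

caps = []
for _ in range(500):                  # average over Rayleigh tap realizations
    taps = gains * (rng.normal(size=9) + 1j*rng.normal(size=9)) / np.sqrt(2)
    h = np.zeros(n_fft, dtype=complex)
    idx = np.round(delays_ns * 1e-9 * fs).astype(int)
    np.add.at(h, idx, taps)           # place each tap at the nearest sample
    H = np.fft.fft(h)                 # frequency response of one realization
    caps.append(np.mean(np.log2(1 + snr * np.abs(H)**2)))

fading_cap = np.mean(caps)            # ergodic capacity estimate, bit/s/Hz
awgn_cap = np.log2(1 + snr)           # same SNR without fading
print(f"EVA fading ≈ {fading_cap:.2f} bit/s/Hz vs AWGN {awgn_cap:.2f} bit/s/Hz")
```

As expected from Jensen's inequality, the ergodic capacity under fading comes out below the AWGN capacity at the same average SNR.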

13. Performance analysis of various channels (BSC, BEC, Noiseless, Lossless) under AWGN.
14. FPGA implementation of linear block coding and syndrome decoding.
15. Performance of linear block codes under single error and burst error.
16. Performance analysis of convolutional codes under single error and burst error.
17. Implementation of Viterbi decoding in FPGA.
18. Efficiency comparison of different interleavers for a turbo encoder.
19. Implementation of a trellis-coded modulator in FPGA.
20. Developing compression algorithms for wireless multimedia sensor networks.
Mode of Evaluation: Review I, II and III.
