Prepared by:
Amit Degada
Teaching Assistant,
ECED, NIT Surat
Information Theory
It is the study of Communication Engineering plus Mathematics.
A Communication Engineer has to contend with:
Limited Power
Inevitable Background Noise
Limited Bandwidth
Information Measure
This is used to determine the information rate of discrete sources.
Information Measure
We can also state three laws from intuition.

Rule 1: The information content I(mk) approaches zero as the probability Pk approaches 1; a message about a certain event conveys no information.
e.g. The Sun rises in the East.
Information Measure
Rule 2: The information content I(mk) must be a non-negative quantity; it may be zero.
Mathematically, I(mk) >= 0, since 0 <= Pk <= 1.
e.g. The Sun rises in the West.
Information Measure
Rule 3: The information content of a message with higher probability is less than the information content of a message with lower probability.
Mathematically, I(mk) > I(mj) if Pk < Pj.
Information Measure
We can also state, for two messages taken together, that the information content of the combined message is the same as the sum of the information contents of the individual messages, provided their occurrences are mutually independent.
e.g. "There will be sunny weather today."
"There will be cloudy weather tomorrow."
Mathematically,
I(mk and mj) = I(mk mj) = I(mk) + I(mj)
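As a minimal numeric sketch of this additivity (using the logarithmic measure defined on the next slide), with assumed weather probabilities:

import math

def info_content(p, base=2):
    # I(m) = log_b(1/p); base 2 gives bits
    return math.log(1.0 / p, base)

p_sunny_today = 0.7      # assumed P(sunny weather today)
p_cloudy_tomorrow = 0.4  # assumed P(cloudy weather tomorrow), independent of today

i_joint = info_content(p_sunny_today * p_cloudy_tomorrow)
i_sum = info_content(p_sunny_today) + info_content(p_cloudy_tomorrow)
print(i_joint, i_sum)  # both ~1.8365 bits: I(mk mj) = I(mk) + I(mj)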
Information Measure
The information content of a message mk with probability Pk is
I(mk) = log_b(1/Pk) = -log_b(Pk)
Here b may be 2, e, or 10:
If b = 2, then the unit is bits
b = e, then the unit is nats
b = 10, then the unit is decits
To convert: log2(v) = ln(v)/ln(2) = log10(v)/log10(2)
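A quick sketch of these three units (the message probability is an assumed value):

import math

def info(p, base=2):
    # I(mk) = log_b(1/Pk)
    return math.log(1.0 / p, base)

p = 0.25  # assumed message probability
print(info(p, 2))       # 2.0 bits
print(info(p, math.e))  # ~1.3863 nats
print(info(p, 10))      # ~0.6021 decits
# Conversion check: log2(v) = ln(v)/ln(2)
print(math.log(1 / p) / math.log(2))  # 2.0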
Example
A source generates one of four symbols. The successive symbols are statistically independent and come out at an average rate of r symbols per second.
The symbol probabilities satisfy
Σi Pi = 1
and the entropy of the source is
H = Σi Pi log2(1/Pi)  bits/symbol
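A short sketch computing H for such a source; the four probabilities are assumed values for illustration:

import math

probs = [0.5, 0.25, 0.125, 0.125]  # assumed symbol probabilities, summing to 1
H = sum(p * math.log2(1.0 / p) for p in probs)
print(H)  # 1.75 bits/symbol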
Information Rate
Information rate = total information / time taken.
Here the time taken to send n symbols is
Tb = n/r
The total information in n symbols is nH, so the information rate is
R = nH / (n/r) = rH
R = rH  bits/sec
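Continuing the sketch above, with the same assumed probabilities and an assumed symbol rate r:

import math

probs = [0.5, 0.25, 0.125, 0.125]               # assumed symbol probabilities
r = 2000                                        # assumed rate, symbols/second
H = sum(p * math.log2(1.0 / p) for p in probs)  # 1.75 bits/symbol
R = r * H                                       # information rate
print(R)  # 3500.0 bits/sec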
Some Maths
H satisfies the following inequality:
0 <= H <= log2(M)
The maximum H occurs when all the messages have equal probability.
Hence H also measures the uncertainty about which symbol will occur. As H approaches its maximum value, we cannot determine which message will occur.
Consider a system that transmits only 2 messages, each with probability of occurrence 0.5. At that time H = 1 bit/symbol, and at every instant we cannot say which of the two messages will occur.
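A numeric sketch of these bounds, using assumed distributions:

import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

M = 4
print(entropy([1.0 / M] * M), math.log2(M))  # 2.0 and 2.0: equiprobable reaches the maximum
print(entropy([0.97, 0.01, 0.01, 0.01]))     # ~0.24 bits: highly skewed, little uncertainty
print(entropy([0.5, 0.5]))                   # 1.0 bit: two equally likely messages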
Variation of H vs. p
Let us consider a binary source, meaning M = 2.
Let the two symbols occur with probabilities p and 1-p respectively, where 0 < p < 1.
So the entropy is
H = p log2(1/p) + (1-p) log2(1/(1-p))
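A brief sketch tabulating this H(p) at a few sample values of p:

import math

def H_binary(p):
    return p * math.log2(1.0 / p) + (1 - p) * math.log2(1.0 / (1 - p))

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(p, round(H_binary(p), 4))
# Symmetric about p = 0.5, where H = 1 bit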
Variation of H vs. p
Now we want to obtain the shape of the curve. Setting the derivative to zero:
dH/dp = 0  gives  log2((1-p)/p) = 0,  i.e.  p = 1/2
Verify it is a maximum by double differentiation:
d2H/dp2 = -(1/ln 2)(1/p + 1/(1-p)) < 0
so H attains its maximum value of 1 bit at p = 1/2.
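A numeric check of this maximum, as a sketch:

import math

def H_binary(p):
    return p * math.log2(1.0 / p) + (1 - p) * math.log2(1.0 / (1 - p))

def dH_dp(p):
    # dH/dp = log2((1-p)/p): positive for p < 1/2, zero at p = 1/2, negative after
    return math.log2((1.0 - p) / p)

print(dH_dp(0.3), dH_dp(0.5), dH_dp(0.7))  # +, 0, -
# Coarse search confirming the maximum sits at p = 0.5:
best_H, best_p = max((H_binary(k / 1000.0), k / 1000.0) for k in range(1, 1000))
print(best_H, best_p)  # 1.0 0.5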
Example
R = rH
Also,
Hmax = log2(M)
Hence,
Rmax = r log2(M)

The average code length is
N̄ = Σi pi Ni
where Ni = code length in binary digits (binits).

If the binits are delivered at rate rb, the code efficiency is
R/rb = H/N̄ <= 1

Kraft's inequality:
K = Σi 2^(-Ni) <= 1
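A sketch exercising these formulas on an assumed source/code pair:

import math

probs   = [0.5, 0.25, 0.125, 0.125]  # assumed message probabilities
lengths = [1, 2, 3, 3]               # assumed code lengths Ni, in binits

N_bar = sum(p * n for p, n in zip(probs, lengths))  # average code length
H = sum(p * math.log2(1.0 / p) for p in probs)      # entropy
K = sum(2.0 ** -n for n in lengths)                 # Kraft sum
print(N_bar, H, H / N_bar, K)  # 1.75, 1.75, efficiency 1.0, K = 1.0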
Example
Find the efficiency and Kraft's inequality for the following codes:

mi    pi    Code I    Code II    Code III    Code IV
m1          00        0          0           0
m2          01        1          10          01
m3          10        10         110         011
m4          11        11         111         0111
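A minimal sketch checking Kraft's inequality for each of the four codes tabulated above:

codes = {
    "Code I":   ["00", "01", "10", "11"],
    "Code II":  ["0", "1", "10", "11"],
    "Code III": ["0", "10", "110", "111"],
    "Code IV":  ["0", "01", "011", "0111"],
}
for name, words in codes.items():
    K = sum(2.0 ** -len(w) for w in words)
    print(name, K, "<= 1" if K <= 1 else "> 1, not uniquely decodable")
# Code II gives K = 1.5 > 1, so it cannot be uniquely decodable.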
Example
Messages    Pi      No. of Bits (Ni)    Code
M1          1/2     1                   0
M2          1/8     3                   100
M3          1/8     3                   101
M4          1/16    4                   1100
M5          1/16    4                   1101
M6          1/16    4                   1110
M7          1/32    5                   11110
M8          1/32    5                   11111
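A sketch computing the entropy, average length, and efficiency of this code from the table above:

import math

probs   = [1/2, 1/8, 1/8, 1/16, 1/16, 1/16, 1/32, 1/32]
lengths = [1, 3, 3, 4, 4, 4, 5, 5]

H = sum(p * math.log2(1.0 / p) for p in probs)
N_bar = sum(p * n for p, n in zip(probs, lengths))
print(H, N_bar, H / N_bar)  # H = N̄ = 2.3125, so efficiency = 1.0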
Questions
Thank You