
Information Theory and Source Coding

Scope of Information Theory

1. Determine the irreducible limit below which a signal cannot be compressed.
2. Deduce the ultimate transmission rate for reliable communication over a noisy channel.
3. Define Channel Capacity - the intrinsic ability of a channel to convey information.

The basic setup in Information Theory has a source, a channel and a destination. The output from the source is conveyed through the channel and received at the destination. The source is a random variable S, which takes symbols from a finite alphabet, i.e., S = {s_0, s_1, s_2, ..., s_{K-1}}, with probabilities P(S = s_k) = p_k, where k = 0, 1, 2, ..., K-1 and

\sum_{k=0}^{K-1} p_k = 1
The following assumptions are made about the source:

1. The source generates symbols that are statistically independent.
2. The source is memoryless, i.e., the choice of the present symbol does not depend on the previous choices.
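To make this setup concrete, here is a minimal sketch of such a source in Python; the alphabet, the probability values and the function name emit are illustrative assumptions of mine, not part of the original notes.

```python
import random

# Hypothetical discrete memoryless source: a 4-symbol alphabet with
# illustrative probabilities that sum to 1 (the values are made up).
alphabet = ["s0", "s1", "s2", "s3"]
probs = [0.5, 0.25, 0.125, 0.125]

assert abs(sum(probs) - 1.0) < 1e-12, "probabilities must sum to 1"

def emit(n):
    """Draw n symbols; each draw is independent of the previous ones (memoryless)."""
    return random.choices(alphabet, weights=probs, k=n)

print(emit(10))  # e.g. ['s0', 's2', 's0', 's0', 's1', ...]
```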

Information: Information is defined as

I(s_k) = \log_b \frac{1}{p_k}

In digital communication the data is binary, so the base b is always taken as 2 and information is measured in bits.
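As a quick sketch (mine, not from the notes), the definition translates directly into one line of Python; the helper name self_information is assumed for illustration.

```python
from math import log2

def self_information(p):
    """I(s_k) = log2(1/p_k), in bits; valid for 0 < p <= 1."""
    return log2(1.0 / p)

print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits -- a rarer symbol carries more information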


Properties of Information

1. Information conveyed by a deterministic event is zero, i.e., I(s_k) = 0 for p_k = 1. (Example: the sun rising in the east is a deterministic event and conveys no information, as opposed to the occurrence of a tsunami.)

2. Information is always non-negative, i.e., I(s_k) ≥ 0 for p_k ≤ 1.

3. Information is never lost, i.e., I(s_k) is never negative.

4. More information is conveyed by a less probable event than by a more probable event, i.e., I(s_k) > I(s_i) for p_k < p_i. (Example: compare a program that runs without faults with one that gives a segmentation fault; the rarer outcome conveys more information.)
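These four properties can be checked numerically; a minimal sketch of my own, assuming base-2 logarithms as above:

```python
from math import log2

def info(p):
    """Self-information I = log2(1/p) in bits (helper name is illustrative)."""
    return log2(1.0 / p)

print(info(1.0) == 0.0)                            # property 1: a certain event carries no information
print(all(info(p) >= 0 for p in (0.1, 0.5, 1.0)))  # properties 2 and 3: never negative
print(info(0.01) > info(0.9))                      # property 4: the less probable event conveys more
```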


Entropy: The entropy H(S) of a source is defined as the average information generated by a discrete memoryless source. Entropy quantifies the uncertainty in the source.
H(S) = \sum_{k=0}^{K-1} p_k I(s_k)

Properties of Entropy:

1. If K is the number of symbols generated by a source, then its entropy is bounded by 0 ≤ H(S) ≤ log_2 K.

2. If the source has an inevitable symbol s_k for which p_k = 1, then H(S) = 0.

3. If the source generates all the symbols equiprobably, i.e., p_k = 1/K for all k, then H(S) = log_2 K.
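A short sketch (my own, standard-library Python) of the entropy formula together with a numeric check of the three properties above; the function name entropy and the example distributions are assumptions:

```python
from math import log2

def entropy(probs):
    """H(S) = sum_k p_k * log2(1/p_k); zero-probability terms contribute 0 by convention."""
    return sum(p * log2(1.0 / p) for p in probs if p > 0)

K = 4
print(entropy([1.0, 0.0, 0.0, 0.0]))                  # 0.0            (property 2: inevitable symbol)
print(entropy([0.25, 0.25, 0.25, 0.25]))              # 2.0 = log2(4)  (property 3: equiprobable)
print(0 <= entropy([0.5, 0.3, 0.1, 0.1]) <= log2(K))  # True           (property 1: bounded)
```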

Proof that H(S) ≥ 0: If the source has one symbol s_k with P(S = s_k) = p_k = 1 then, by the axioms of probability, all the other probabilities must be zero, i.e., p_i = 0 for all i ≠ k. Therefore, from the definition of entropy,

H(S) = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k} = p_k \log_2 \frac{1}{p_k} = 1 \cdot \log_2(1) = 0

More generally, each term p_k \log_2(1/p_k) is non-negative for 0 ≤ p_k ≤ 1, so the sum is never negative. Therefore H(S) ≥ 0.

Proof that H(S) ≤ log_2 K: Consider a discrete memoryless source which has two different probability distributions, p_0, p_1, ..., p_{K-1} and q_0, q_1, ..., q_{K-1}, on the alphabet S = {s_0, s_1, ..., s_{K-1}}. We begin with the quantity

\sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} = \frac{1}{\log_e 2} \sum_{k=0}^{K-1} p_k \log_e \frac{q_k}{p_k}    (1)

To analyze Eq. 1, we consider the following inequality:

\log_e x \leq x - 1,  for x > 0    (2)
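A quick numeric spot-check of Eq. 2 (my own sketch, not part of the notes):

```python
from math import log

# log_e(x) <= x - 1 for x > 0, with equality only at x = 1.
for x in (0.1, 0.5, 1.0, 2.0, 10.0):
    print(x, log(x) <= x - 1)   # True at every sample point
```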

Now, using Eq. 2 in Eq. 1,

\sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} \leq \frac{1}{\log_e 2} \sum_{k=0}^{K-1} p_k \left( \frac{q_k}{p_k} - 1 \right)
    = \frac{1}{\log_e 2} \left( \sum_{k=0}^{K-1} q_k - \sum_{k=0}^{K-1} p_k \right)
    = \frac{1}{\log_e 2} (1 - 1) = 0    (3)

That is, \sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} \leq 0.

Suppose q_k = 1/K for all k. Then

\sum_{k=0}^{K-1} q_k \log_2 \frac{1}{q_k} = \log_2 K

So, with q_k = 1/K, Eq. 3 becomes

\sum_{k=0}^{K-1} p_k \log_2 \frac{1/K}{p_k} \leq 0

\sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k} - \sum_{k=0}^{K-1} p_k \log_2 K \leq 0

\sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k} \leq \log_2 K \sum_{k=0}^{K-1} p_k = \log_2 K

Therefore H(S) ≤ log_2 K.
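The chain of inequalities above can be verified numerically for any distribution; a minimal sketch of my own, using a randomly generated distribution p and the uniform distribution q (the variable names are assumptions):

```python
import random
from math import log2

K = 8
raw = [random.random() for _ in range(K)]
p = [x / sum(raw) for x in raw]   # a random probability distribution p_0 .. p_{K-1}
q = [1.0 / K] * K                 # the uniform distribution used above

gibbs = sum(pk * log2(qk / pk) for pk, qk in zip(p, q) if pk > 0)
H = sum(pk * log2(1.0 / pk) for pk in p if pk > 0)

print(gibbs <= 1e-12)         # Eq. 3: sum_k p_k log2(q_k / p_k) <= 0
print(H <= log2(K) + 1e-12)   # therefore H(S) <= log2 K
```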

Entropy of a binary memoryless source

Consider a discrete memoryless binary source defined on the alphabet S = {0, 1}. Let the probabilities of symbols 0 and 1 be p_0 and 1 - p_0 respectively. The entropy of this source is given by

H(S) = -p_0 \log_2 p_0 - (1 - p_0) \log_2 (1 - p_0)

According to the properties of entropy, H(S) has a maximum value when p_0 = 1/2, as demonstrated in Figure 1.

[Figure 1: Entropy H(S) of a discrete memoryless binary source versus p_0; H(S) peaks at 1 bit when p_0 = 1/2.]
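A short sketch (mine) that tabulates the binary entropy function and confirms the maximum of 1 bit at p_0 = 1/2 shown in Figure 1; the helper name binary_entropy is an assumption:

```python
from math import log2

def binary_entropy(p0):
    """H(S) = -p0*log2(p0) - (1 - p0)*log2(1 - p0), with H = 0 at p0 = 0 or 1."""
    if p0 in (0.0, 1.0):
        return 0.0
    return -p0 * log2(p0) - (1 - p0) * log2(1 - p0)

for p0 in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(f"p0 = {p0:4.2f}   H = {binary_entropy(p0):.4f} bits")
# The printed table peaks at 1.0000 bit for p0 = 0.50, matching Figure 1.
```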
