Scope of Information Theory

1. Determine the irreducible limit below which a signal cannot be compressed.
2. Deduce the ultimate transmission rate for reliable communication over a noisy channel.
3. Define channel capacity: the intrinsic ability of a channel to convey information.

The basic setup in information theory has a source, a channel and a destination. The output from the source is conveyed through the channel and received at the destination. The source is a random variable S, which takes symbols from a finite alphabet, i.e., S = {s_0, s_1, s_2, ..., s_{K-1}}, with probabilities P(S = s_k) = p_k, where k = 0, 1, 2, ..., K-1 and

    \sum_{k=0}^{K-1} p_k = 1
The following assumptions are made about the source:

1. The source generates symbols that are statistically independent.
2. The source is memoryless, i.e., the choice of the present symbol does not depend on the previous choices.

Information: The information conveyed by a symbol s_k is defined as

    I(s_k) = \log_b \frac{1}{p_k}

where b is the base of the logarithm (base 2 gives information in bits).
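The definition above can be sketched as a small helper; this is a minimal illustration (the function name and sample probability are chosen here, not taken from the notes):

```python
import math

def self_information(p, base=2):
    """Information I(s_k) = log_b(1/p_k) conveyed by a symbol of probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return math.log(1 / p, base)

# A symbol emitted with probability 1/8 carries log2(8) = 3 bits.
print(self_information(1 / 8))  # -> 3.0
```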
Properties of Information
1. A deterministic event conveys no information, i.e., I(s_k) = 0 for p_k = 1. (Example: the sun rising in the east, as opposed to the occurrence of a tsunami.)
4. More information is conveyed by a less probable event than by a more probable one: I(s_k) > I(s_i) for p_k < p_i. (Example: a program that runs without faults, as opposed to a program that gives a segmentation fault.)
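Property 4 can be checked numerically for the segmentation-fault example; the two probabilities below are hypothetical values picked for illustration:

```python
import math

def self_information(p):
    """I(s_k) = log2(1/p_k), in bits."""
    return math.log2(1 / p)

# Hypothetical probabilities: a clean run is common, a segfault is rare.
p_clean_run = 0.95
p_segfault = 0.05

# The rarer event conveys strictly more information.
assert self_information(p_segfault) > self_information(p_clean_run)
print(self_information(p_segfault), self_information(p_clean_run))
```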
Entropy: The entropy H(s) of a source is defined as the average information generated by a discrete memoryless source. Entropy quantifies the uncertainty in the source:

    H(s) = \sum_{k=0}^{K-1} p_k I(s_k) = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k}
Properties of Entropy:

1. If K is the number of symbols generated by a source, then its entropy is bounded by 0 \le H(s) \le \log_2 K.
2. If the source has an inevitable symbol s_k for which p_k = 1, then H(s) = 0.
3. If the source generates all the symbols equiprobably, i.e., p_k = 1/K for all k, then H(s) = \log_2 K.
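The three properties above can be verified numerically; this sketch assumes the entropy definition from earlier (the helper is written here for illustration):

```python
import math

def entropy(probs):
    """H(s) = sum_k p_k * log2(1/p_k); zero-probability terms are skipped."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

K = 4
uniform = [1 / K] * K            # property 3: equiprobable symbols
certain = [1.0, 0.0, 0.0, 0.0]   # property 2: one inevitable symbol

assert entropy(certain) == 0.0                       # lower bound attained
assert abs(entropy(uniform) - math.log2(K)) < 1e-12  # upper bound attained
print(entropy(certain), entropy(uniform))
```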
Proof: H(s) \ge 0. Since 0 \le p_k \le 1, every term p_k \log_2(1/p_k) is non-negative, so H(s) \ge 0. The bound is attained with equality if the source has one symbol s_k with P(S = s_k) = p_k = 1: then, by the axioms of probability, all other probabilities must be zero, i.e., p_i = 0 for all i \ne k. Therefore, from the definition of entropy,

    H(s) = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k} = 1 \cdot \log_2 1 = 0

(using the convention 0 \cdot \log_2(1/0) = 0).
Proof: H(s) \le \log_2 K. Consider a discrete memoryless source with two different probability distributions {p_0, p_1, ..., p_{K-1}} and {q_0, q_1, ..., q_{K-1}} on the same alphabet. Then

    \sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} = \frac{1}{\log_e 2} \sum_{k=0}^{K-1} p_k \log_e \frac{q_k}{p_k}    (1)

Using the inequality

    \log_e x \le x - 1    (2)

in Eq. (1) gives

    \sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} \le \frac{1}{\log_e 2} \sum_{k=0}^{K-1} p_k \left( \frac{q_k}{p_k} - 1 \right) = \frac{1}{\log_e 2} \left( \sum_{k=0}^{K-1} q_k - \sum_{k=0}^{K-1} p_k \right) = \frac{1 - 1}{\log_e 2} = 0

and therefore

    \sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} \le 0    (3)

Now suppose q_k = 1/K for all k. Then

    \sum_{k=0}^{K-1} q_k \log_2 \frac{1}{q_k} = \log_2 K

and the left-hand side of Eq. (3) becomes

    \sum_{k=0}^{K-1} p_k \log_2 \frac{q_k}{p_k} = \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{K p_k} = -\log_2 K \sum_{k=0}^{K-1} p_k + \sum_{k=0}^{K-1} p_k \log_2 \frac{1}{p_k} = -\log_2 K + H(s) \le 0

Therefore

    H(s) \le \log_2 K
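The key inequality in the proof, \sum_k p_k \log_2(q_k/p_k) \le 0, can be spot-checked numerically; the distribution p below is an arbitrary example, and q is the uniform choice q_k = 1/K used in the proof:

```python
import math

def gibbs_lhs(p, q):
    """sum_k p_k * log2(q_k / p_k); non-positive for any two distributions."""
    return sum(pk * math.log2(qk / pk) for pk, qk in zip(p, q) if pk > 0)

p = [0.5, 0.3, 0.2]
q = [1 / 3, 1 / 3, 1 / 3]  # q_k = 1/K, as in the proof

lhs = gibbs_lhs(p, q)
assert lhs <= 0

# Rearranged with q_k = 1/K, this is exactly H(s) <= log2 K:
H = sum(pk * math.log2(1 / pk) for pk in p)
assert H <= math.log2(3)
print(lhs, H, math.log2(3))
```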
For a binary source with symbol probabilities p_0 and 1 - p_0,

    H(s) = -p_0 \log_2 p_0 - (1 - p_0) \log_2 (1 - p_0)

According to the properties of entropy, H(s) attains its maximum value of 1 bit at p_0 = 1/2, as demonstrated in Figure 1.
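The binary-entropy curve and its maximum can be sketched as follows (the grid scan is an illustration, not part of the notes):

```python
import math

def binary_entropy(p0):
    """H(s) = -p0*log2(p0) - (1-p0)*log2(1-p0), with H(0) = H(1) = 0."""
    if p0 in (0.0, 1.0):
        return 0.0
    return -p0 * math.log2(p0) - (1 - p0) * math.log2(1 - p0)

# Scan a grid of p0 values: the maximum H(s) = 1 bit sits at p0 = 1/2.
grid = [i / 100 for i in range(101)]
best = max(grid, key=binary_entropy)
assert best == 0.5 and binary_entropy(0.5) == 1.0
print(best, binary_entropy(best))  # -> 0.5 1.0
```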
'
H(s) 1
1/2
1 p
&