
INFORMATION THEORY AND CODING
5th Sem E&C

JAYANTHDWIJESH H P, M.Tech (DECS)
Assistant Professor, Dept. of E&C
B.G.S INSTITUTE OF TECHNOLOGY (B.G.S.I.T)
B.G Nagara, Nagamangala Tq, Mandya District - 571448
FORMULAS FOR REFERENCE

MODULE – 1 (INFORMATION THEORY)

 Amount of information or self-information.


I_K = log(1/P_K)  or  I_K = log₂(1/P_K)  or  I(m_K) = log(1/P_K)
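As a quick numeric check, the self-information formula can be evaluated directly (a minimal Python sketch; the probability value is illustrative):

```python
import math

def self_information(p_k):
    """I_K = log2(1/P_K): self-information of a symbol of probability P_K, in bits."""
    return math.log2(1 / p_k)

# A symbol with probability 1/4 carries log2(4) = 2 bits of information.
print(self_information(0.25))  # 2.0
```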

 Entropy of source or Average information content of the source.


H = Σ_{i=1}^{M} P_i log₂(1/P_i) bits/symbol  or  H = Σ_{K=1}^{M} P_K log₂(1/P_K) bits/symbol  or

H(S) = Σ_{i=1}^{q} P_i log₂(1/P_i) bits/symbol  or  H(S) = Σ_{i=1}^{q} P_i log(1/P_i) bits/symbol  or

H(S) = Σ_{K=1}^{N} P_K log₂(1/P_K) bits/symbol
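The entropy sum can be computed in a few lines (a hedged sketch; the probability set is illustrative):

```python
import math

def entropy(probs):
    """H(S) = sum_i P_i * log2(1/P_i), in bits/symbol; terms with P_i = 0 contribute 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Illustrative source with P = {1/2, 1/4, 1/8, 1/8}:
# H = 0.5*1 + 0.25*2 + 0.125*3 + 0.125*3 = 1.75 bits/symbol
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```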

 Information rate or average information rate.


R_S = r_s H(S) bits/sec  or  R = r_s H bits/sec  or  R = r H bits/sec
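The rate formula is a unit conversion: symbols per second times bits per symbol. A small sketch with hypothetical values:

```python
# R_S = r_s * H(S): (symbols/sec) x (bits/symbol) = bits/sec.
r_s = 1000   # hypothetical symbol rate, symbols/sec
H_S = 1.75   # hypothetical source entropy, bits/symbol
R_S = r_s * H_S
print(R_S)   # 1750.0 bits/sec
```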
 Bits.
I_K = log₂(1/P_K) bits

 Hartleys or decits.

I_K = log₁₀(1/P_K) Hartleys (decits)

 Nats or nepers.

I_K = logₑ(1/P_K) nats (nepers)
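The three units above measure the same quantity on different logarithm bases; the conversions can be checked numerically (illustrative probability, plain Python):

```python
import math

p = 0.25                          # illustrative symbol probability
I_bits = math.log2(1 / p)         # bits
I_hartleys = math.log10(1 / p)    # Hartleys (decits)
I_nats = math.log(1 / p)          # nats (nepers)

# Same quantity, three bases:
# I_bits = I_nats / ln(2) = I_hartleys / log10(2)
print(I_bits, I_hartleys, I_nats)
```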

 Extremal or Upper bound or Maximum entropy


H(S)_max = log₂ q bits/message-symbol  or  H(S)_max = log₂ N bits/message-symbol.
 Source efficiency
η_S = H(S)/H(S)_max  or  η_S = (H(S)/H(S)_max) × 100%

 Source redundancy
R_{η_S} = 1 − η_S = (1 − H(S)/H(S)_max) × 100%
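Maximum entropy, efficiency, and redundancy fit together in one short computation (a sketch with an illustrative 4-symbol source):

```python
import math

def entropy(probs):
    """H(S) = sum_i P_i * log2(1/P_i), bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # illustrative source with q = 4 symbols
H = entropy(probs)                  # 1.75 bits/symbol
H_max = math.log2(len(probs))       # H(S)_max = log2(q) = 2 bits/symbol
eta = H / H_max                     # source efficiency = 0.875
R = 1 - eta                         # source redundancy = 0.125
print(eta, R)
```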

 The average information content of the symbols emitted from the i-th state.

H_i = Σ_{j=1}^{n} P_ij log₂(1/P_ij) bits/symbol

 The average information content of the symbols emitted from the k-th state.

H_k = Σ_{l=1}^{M} P_lk log₂(1/P_lk) bits/symbol
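For a Markov source, each state's entropy is just the entropy of that state's row of transition probabilities. A minimal sketch with a hypothetical two-state source:

```python
import math

def state_entropy(row):
    """H_i = sum_j P_ij * log2(1/P_ij): entropy of one state's outgoing probabilities."""
    return sum(p * math.log2(1 / p) for p in row if p > 0)

# Hypothetical 2-state Markov source; each row of P sums to 1.
P = [[0.75, 0.25],
     [0.50, 0.50]]
H_states = [state_entropy(row) for row in P]
# The second state emits equiprobable symbols, so its entropy is 1 bit/symbol.
print(H_states)
```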

 The average information content per symbol in a message of length N.


G_N = (1/N) Σ_i P(m_i) log₂(1/P(m_i))  or  G_N = −(1/N) Σ_i P(m_i) log₂ P(m_i) = (1/N) H(S^N)

 The entropy of the second order symbols.


G_N = (1/N) H(S^N), where N = 2.

 The entropy of the third order symbols.


G_N = (1/N) H(S^N), where N = 3.
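G_N can be computed by enumerating the N-th extension of the source. The sketch below assumes a zero-memory (independent-symbol) source, in which case H(S^N) = N·H(S) and G_N equals H(S) for every N:

```python
import math
from itertools import product

def entropy(probs):
    """H = sum_i P_i * log2(1/P_i), bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def G(probs, N):
    """G_N = (1/N) * H(S^N), where S^N is the N-th extension of the source."""
    ext = [math.prod(combo) for combo in product(probs, repeat=N)]  # independence assumed
    return entropy(ext) / N

probs = [0.7, 0.3]   # illustrative binary source
print(G(probs, 2), G(probs, 3), entropy(probs))  # all three agree for a zero-memory source
```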

 Log properties
1. log_a b = 1 / log_b a

2. log_x b / log_x a = log_a b

3. logₑ 10 = ln(10)
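These identities can be verified numerically (plain Python; the sample values a, b, x are arbitrary):

```python
import math

a, b, x = 2.0, 8.0, 10.0   # arbitrary positive values, none equal to 1

# 1. log_a(b) = 1 / log_b(a)
assert abs(math.log(b, a) - 1 / math.log(a, b)) < 1e-9

# 2. log_x(b) / log_x(a) = log_a(b)   (change of base)
assert abs(math.log(b, x) / math.log(a, x) - math.log(b, a)) < 1e-9

# 3. log_e(10) = ln(10): math.log uses base e by default
assert abs(math.log(10) - math.log(10, math.e)) < 1e-9
print("all log identities hold")
```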
