
By

Saima Israil


Feb 20th, 2014
Introduction
General Model of Communication
Shannon's Information Theory
Source Coding Theorem
Channel Coding Theorem
Information Capacity Theorem
Conclusion


In defining information, Shannon identified the
critical relationships among the elements of a
communication system:

the power at the source of a signal

the bandwidth or frequency range of the
information channel through which the signal
travels

the noise of the channel, such as unpredictable static
on a radio, which will alter the signal by the time it
reaches the last element of the system

the receiver, which must decode the signal.
All communication involves three steps:

Coding a message at its source

Transmitting the message through a
communications channel

Decoding the message at its
destination.
Noise can be considered data without meaning;
that is, data that is not being used to transmit a
signal, but is simply produced as an unwanted
by-product of other activities. Noise is still
considered information, in the sense of
Information Theory.

Entropy is a quantitative measure of the disorder of a
system, inversely related to the amount of
energy available to do work in an isolated
system. The more energy has become
dispersed, the less work it can perform and the
greater the entropy.
The theorem can be stated as follows:
Given a discrete source of entropy H(S), the average code-word
length L for any distortionless source coding is bounded as
L >= H(S)

This theorem provides the mathematical tool for assessing data
compaction

The source coding theorem is also known as the "noiseless coding
theorem" in the sense that it establishes the condition for error-free
encoding to be possible
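
As a quick numerical check of this bound (a minimal Python sketch; the symbol probabilities below are invented for the example), we can compute H(S) directly and compare it with the average code-word length of a Huffman code built for the same source:

import heapq
import math

# Assumed source alphabet with example symbol probabilities
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Source entropy H(S) in bits per symbol
H = -sum(p * math.log2(p) for p in probs.values())

# Build a Huffman code (one concrete distortionless source code)
heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
lengths = {s: 0 for s in probs}
while len(heap) > 1:
    p1, _, syms1 = heapq.heappop(heap)
    p2, i2, syms2 = heapq.heappop(heap)
    for s in syms1 + syms2:
        lengths[s] += 1                     # each merge adds one bit to these code words
    heapq.heappush(heap, (p1 + p2, i2, syms1 + syms2))

# Average code-word length L of the Huffman code
L = sum(probs[s] * lengths[s] for s in probs)

print(f"H(S) = {H:.3f} bits/symbol")
print(f"L    = {L:.3f} bits/symbol, and L >= H(S) as the theorem requires")

For this dyadic distribution the Huffman code meets the bound exactly (L = H(S) = 1.75 bits/symbol); for other distributions L stays strictly above H(S), but within one bit of it.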
For reliable communication, we need channel encoding and
decoding.

Is there a coding scheme that makes the error probability as small as
desired while keeping the code rate from being too small?

Let a source with alphabet X have entropy H(X) and produce
symbols once every Ts seconds, and let the channel have capacity C and be
used once every Tc seconds. Then:

i) if H(X)/Ts <= C/Tc, there exists a coding scheme for which the source
output can be transmitted over the channel with an arbitrarily small
probability of error.

ii) if H(X)/Ts > C/Tc, it is not possible to transmit with arbitrarily
small error.
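
A minimal sketch of checking this condition numerically (the source statistics, symbol period, channel capacity, and use period below are assumed purely for illustration):

import math

# Assumed source: symbol probabilities and symbol period Ts (seconds per symbol)
probs = [0.5, 0.25, 0.125, 0.125]
Ts = 1e-3

# Assumed channel: capacity C (bits per channel use) and use period Tc (seconds per use)
C = 1.0
Tc = 0.4e-3

H = -sum(p * math.log2(p) for p in probs)   # source entropy, bits per symbol
source_rate = H / Ts                        # H(X)/Ts, bits per second to be conveyed
channel_rate = C / Tc                       # C/Tc, bits per second the channel supports

if source_rate <= channel_rate:
    print("H(X)/Ts <= C/Tc: a coding scheme with arbitrarily small error exists")
else:
    print("H(X)/Ts >  C/Tc: reliable transmission is not possible")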
This Theorem defines an upper limit to the capacity of a link,
in bits per second (bps), as a function of the available
bandwidth and the signal-to-noise ratio of the link.

The Theorem can be stated as:
C = B * log2(1+ S/N)

where C is the achievable channel capacity, B is the
bandwidth of the line, S is the average signal power and N is
the average noise power.
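
As a direct computation of this formula (a small Python sketch; the helper name channel_capacity and the 3 kHz / 30 dB values are assumptions for the illustration):

import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: assumed 3 kHz bandwidth at a linear SNR of 1000 (i.e. 30 dB)
print(channel_capacity(3000, 1000))   # about 29,900 bit/s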
The term signal-to-noise ratio, often abbreviated
SNR or S/N, is the ratio of the power of a
signal (the meaningful information) to the power of
the background noise.

The signal-to-noise ratio (S/N) is usually expressed in
decibels (dB) given by the formula:
10 * log10(S/N)
so for example a signal-to-noise ratio of 1000 is
commonly expressed as
10 * log10(1000) = 30 dB.
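
The conversion in both directions, as a small helper sketch:

import math

def snr_to_db(snr_linear):
    # 10 * log10(S/N), for a power ratio
    return 10 * math.log10(snr_linear)

def db_to_snr(snr_db):
    # inverse conversion back to a linear power ratio
    return 10 ** (snr_db / 10)

print(snr_to_db(1000))   # 30.0 dB, as in the example above
print(db_to_snr(20))     # 100.0, used in the telephone example later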

Define the ideal system as one operating at Rb = C,
where Eb is the transmitted (Tx) energy per bit, so the average signal
power is P = Eb * Rb = Eb * C.

Then C/B = log2(1 + (Eb * C) / (No * B)), which gives

Eb/No = (2^(C/B) - 1) / (C/B)

For an infinite-bandwidth channel, B -> infinity and C/B -> 0, so

Eb/No -> ln 2 = 0.693, i.e. -1.6 dB (the Shannon limit)

and the capacity approaches

C_inf = lim (B -> infinity) C = (P/No) * log2(e)
[Figure: bandwidth-efficiency diagram, C/B versus Eb/No (dB), showing the regions Rb < C and Rb > C, the boundary Rb = C, and the Shannon limit at Eb/No = -1.6 dB.]
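
A short numerical check of the Eb/No expression and the -1.6 dB limit (a sketch; the helper name and the spectral-efficiency sample points are arbitrary):

import math

def ebno_required_db(spectral_efficiency):
    # Minimum Eb/No in dB for an ideal system at Rb = C,
    # from Eb/No = (2^(C/B) - 1) / (C/B)
    r = spectral_efficiency                 # r = C/B in bit/s/Hz
    ebno_linear = (2 ** r - 1) / r
    return 10 * math.log10(ebno_linear)

for r in (4.0, 1.0, 0.1, 0.001):
    print(f"C/B = {r:>5}: Eb/No >= {ebno_required_db(r):6.2f} dB")

# As C/B -> 0 (infinite bandwidth), Eb/No approaches ln 2 = 0.693, about -1.6 dB
print(10 * math.log10(math.log(2)))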
If the SNR is 20 dB and the available bandwidth
is 4 kHz, which is appropriate for telephone
communications, then C = 4 * log2(1 + 100) =
4 * log2(101) = 26.63 kbit/s (with B in kHz). Note that the
linear value of 100 corresponds to an SNR of 20 dB.



If it is required to transmit at 50 kbit/s and a
bandwidth of 1 MHz is used, then the minimum
SNR required is given by 50 = 1000 * log2(1 + S/N),
so S/N = 2^(C/B) - 1 = 2^(50/1000) - 1 = 0.035, corresponding to an
SNR of -14.5 dB. This shows that it is possible
to transmit using signals which are actually
much weaker than the background noise level.
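
These two calculations can be reproduced numerically (a sketch reusing the capacity formula above):

import math

# Example 1: telephone channel, B = 4 kHz, SNR = 20 dB (linear S/N = 100)
B1 = 4000
snr1 = 10 ** (20 / 10)
C1 = B1 * math.log2(1 + snr1)
print(f"C = {C1 / 1000:.2f} kbit/s")        # about 26.63 kbit/s

# Example 2: required rate 50 kbit/s in B = 1 MHz; solve for the minimum S/N
C2 = 50_000
B2 = 1_000_000
snr2 = 2 ** (C2 / B2) - 1
print(f"S/N = {snr2:.3f} ({10 * math.log10(snr2):.1f} dB)")   # about 0.035, i.e. -14.5 dB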
o Shannon's Information Theory provides the basis for the field of
Information Theory

o Source coding is bounded as L >= H(S)

o If H(X)/Ts <= C/Tc, there exists a coding scheme that achieves an
arbitrarily small probability of error

o Channel capacity C = B * log2(1 + S/N)
