Course Books
Text:
References:
J. G. Proakis, Digital Communications, 4th ed., McGraw-Hill, 2000.
Course Outline
Review of Probability
Signal and Spectra (Chapter 1)
Formatting and Baseband Modulation (Chapter 2)
Baseband Demodulation/Detection (Chapter 3)
Channel Coding (Chapters 6, 7, and 8)
Bandpass Modulation and Demodulation/Detection (Chapter 4)
Spread Spectrum Techniques (Chapter 12)
Synchronization (Chapter 10)
Source Coding (Chapter 13)
Fading Channels (Chapter 15)
Today's Goal
Communication
Source → Transmitter → Channel → Receiver → Recipient
Brief Description
Types of information
Voice, data, video, music, e-mail, etc.
Information Representation
Why digital?
Formatting/Source Coding
Transforms source info into digital symbols (digitization)
Selects compatible waveforms (matching function)
Introduces redundancy, which facilitates accurate decoding despite errors
Modulation/Demodulation
Modulation is the process of modifying the info signal to facilitate transmission
Demodulation reverses the process of modulation; it involves the detection and retrieval of the info signal
Types
Coherent: Requires reference phase information for detection
Noncoherent: Does not require reference phase information
Coding/Decoding
Translating info bits to transmitter data symbols
Techniques used to enhance info signals so that they are less vulnerable to channel impairments (e.g. noise, fading, jamming, interference)
Two Categories
Waveform Coding
Structured Sequences
Performance Metrics
$$P_B = P(\hat{b} \neq b)$$
the probability that a detected bit $\hat{b}$ differs from the transmitted bit $b$.
Main Points
Disadvantages
Requires reliable synchronization
Requires A/D conversions at high rate
Requires larger bandwidth
Nongraceful degradation
Performance Criteria
Probability of error or Bit Error Rate
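As a quick illustration of this criterion, the bit error rate can be estimated by simple Monte Carlo counting. The sketch below uses a hypothetical binary symmetric channel that flips each bit with probability p; the channel model and values are illustrative, not from the text.

```python
import numpy as np

# Sketch: estimate the bit error rate (BER) by Monte Carlo counting.
# The "channel" here is an assumed binary symmetric channel that flips
# each transmitted bit independently with probability p.
rng = np.random.default_rng(0)
p = 0.05                                   # assumed bit-flip probability
bits = rng.integers(0, 2, size=100_000)    # transmitted bits
flips = rng.random(bits.size) < p          # channel error events
detected = bits ^ flips                    # received (detected) bits
ber = np.mean(detected != bits)            # estimate of the error probability
print(ber)                                 # close to p
```

With enough bits, the counted error fraction converges to the channel's true error probability.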
Information Source
Character
Member of an alphanumeric/symbol set (A to Z, 0 to 9)
Characters can be mapped into a sequence of binary digits using one of the standardized codes
Digital Message
M-ary
Digital Waveform
Bit Rate
Baud Rate
Refers to the rate at which the signaling elements are transmitted, i.e. the number of signaling elements per second.
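For M-ary signaling, each signaling element carries log2(M) bits, so bit rate and baud rate are related by Rb = Rbaud × log2(M). A minimal sketch (the numeric values are illustrative):

```python
import math

# Each M-ary symbol (signaling element) carries log2(M) bits,
# so bit rate = baud rate * log2(M).  Values below are illustrative.
baud_rate = 2400                            # signaling elements per second
M = 16                                      # 16-ary signaling: 4 bits/symbol
bit_rate = baud_rate * int(math.log2(M))
print(bit_rate)                             # 9600 bits per second
```

For binary signaling (M = 2) the two rates coincide.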
Example: $x(t) = 5\cos(10t)$

A signal $x(t)$ is periodic if
$$x(t) = x(t + T_0) \quad \text{for } -\infty < t < \infty \qquad (1.2)$$
where $t$ denotes time and $T_0$ is the period of $x(t)$.
x(t) is classified as an energy signal if, and only if, it has nonzero but finite energy ($0 < E_x < \infty$) for all time, where:
$$E_x = \lim_{T \to \infty} \int_{-T/2}^{T/2} x^2(t)\,dt \qquad (1.7)$$
A signal is defined as a power signal if, and only if, it has finite but nonzero power ($0 < P_x < \infty$) for all time, where
$$P_x = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\,dt \qquad (1.8)$$
The unit impulse (Dirac delta) function $\delta(t)$ satisfies:
$$\int_{-\infty}^{\infty} \delta(t)\,dt = 1 \qquad (1.9)$$
$$\delta(t) = 0 \quad \text{for } t \neq 0 \qquad (1.10)$$
$$\delta(t) \text{ is unbounded at } t = 0 \qquad (1.11)$$
Sifting property:
$$\int_{-\infty}^{\infty} x(t)\,\delta(t - t_0)\,dt = x(t_0) \qquad (1.12)$$
According to Parseval's theorem, the energy of x(t):
$$E_x = \int_{-\infty}^{\infty} x^2(t)\,dt = \int_{-\infty}^{\infty} |X(f)|^2\,df \qquad (1.13)$$
The energy spectral density (ESD) of the signal is defined as
$$\psi_x(f) = |X(f)|^2 \qquad (1.14)$$
Therefore:
$$E_x = \int_{-\infty}^{\infty} \psi_x(f)\,df \qquad (1.15)$$
Since $\psi_x(f)$ is an even function of $f$ for real-valued $x(t)$:
$$E_x = 2 \int_0^{\infty} \psi_x(f)\,df \qquad (1.16)$$
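Parseval's relation has a discrete counterpart that is easy to verify with an FFT; the 1/N factor below reflects the unnormalized DFT convention used by numpy.fft (the sequence and its length are arbitrary).

```python
import numpy as np

# Discrete sanity check of Parseval's relation (1.13): energy computed in
# the time domain equals energy computed from the DFT spectrum.
rng = np.random.default_rng(1)
x = rng.standard_normal(1024)             # arbitrary finite-energy sequence
X = np.fft.fft(x)
E_time = np.sum(x**2)                     # sum of |x[n]|^2
E_freq = np.sum(np.abs(X)**2) / len(x)    # (1/N) * sum of |X[k]|^2
print(E_time, E_freq)                     # equal up to round-off
```

The agreement holds to machine precision, mirroring the continuous-time identity.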
The power of a periodic signal $x(t)$ with period $T_0$ can be represented as:
$$P_x = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x^2(t)\,dt = \sum_{n=-\infty}^{\infty} |C_n|^2 \qquad (1.17)$$
where the $C_n$ are the complex Fourier series coefficients of $x(t)$. The power spectral density (PSD) of the periodic signal is then a series of weighted delta functions,
$$G_x(f) = \sum_{n=-\infty}^{\infty} |C_n|^2\,\delta(f - n f_0) \qquad (1.18)$$
and
$$P_x = \int_{-\infty}^{\infty} G_x(f)\,df = 2 \int_0^{\infty} G_x(f)\,df \qquad (1.19)$$
1.4 Autocorrelation
The autocorrelation of an energy signal $x(t)$ is
$$R_x(\tau) = \int_{-\infty}^{\infty} x(t)\,x(t+\tau)\,dt \quad \text{for } -\infty < \tau < \infty \qquad (1.21)$$
Properties:
$R_x(\tau) = R_x(-\tau)$ (symmetrical in $\tau$ about zero)
$R_x(\tau) \leftrightarrow \psi_x(f)$ (autocorrelation and ESD form a Fourier transform pair)
$R_x(0) = \int_{-\infty}^{\infty} x^2(t)\,dt$ (value at the origin equals the energy of the signal)
The autocorrelation of a power signal $x(t)$ is
$$R_x(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,x(t+\tau)\,dt \qquad (1.22)$$
When the power signal $x(t)$ is periodic with period $T_0$, the autocorrelation function can be expressed as
$$R_x(\tau) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x(t)\,x(t+\tau)\,dt \qquad (1.23)$$
Properties:
$R_x(\tau) = R_x(-\tau)$ (symmetrical in $\tau$ about zero)
$R_x(\tau) \leftrightarrow G_x(f)$ (autocorrelation and PSD form a Fourier transform pair)
$R_x(0) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x^2(t)\,dt$ (value at the origin equals the average power of the signal)
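These properties can be checked numerically for a sampled periodic signal; the sketch below uses a circular (periodic) autocorrelation of one period of a cosine (amplitude and sampling grid are illustrative).

```python
import numpy as np

# Numerical check of Eq. (1.23) and its properties for a periodic signal:
# the autocorrelation is even, peaks at tau = 0, and Rx(0) is the average power.
N = 1000                                   # samples in one period
n = np.arange(N)
x = 3.0 * np.cos(2 * np.pi * n / N)        # one period of a 3*cos(...) tone
# circular (periodic) autocorrelation: Rx[k] = (1/N) * sum_n x[n] x[n+k]
Rx = np.array([np.mean(x * np.roll(x, -k)) for k in range(N)])
print(Rx[0])                               # 3**2 / 2 = 4.5, the average power
print(bool(np.isclose(Rx[1], Rx[-1])))     # True: Rx(tau) = Rx(-tau)
```

For this tone the autocorrelation is itself a cosine of the same period, with its maximum at the origin.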
All useful message signals appear random; that is, the receiver does not know, a priori, which of the possible waveforms has been sent.
The cumulative distribution function of a random variable $X$ is
$$F_X(x) = P(X \le x)$$
and its probability density function is
$$p_X(x) = \frac{dF_X(x)}{dx}$$
Mean: $m_X = E\{X\} = \int_{-\infty}^{\infty} x\,p_X(x)\,dx$
Mean-square value: $E\{X^2\} = \int_{-\infty}^{\infty} x^2\,p_X(x)\,dx$
Variance:
$$\operatorname{var}(X) = E\{(X - m_X)^2\} = \int_{-\infty}^{\infty} (x - m_X)^2\,p_X(x)\,dx$$
$$\operatorname{var}(X) = E\{X^2\} - E\{X\}^2$$
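The identity var(X) = E{X²} − (E{X})² is easy to confirm on samples; the distribution below (exponential with mean 2, true variance 4) is an arbitrary illustrative choice.

```python
import numpy as np

# Sample check of var(X) = E{X^2} - (E{X})^2 on an arbitrary distribution
# (exponential with mean 2, whose true variance is 4 -- illustrative values).
rng = np.random.default_rng(2)
X = rng.exponential(scale=2.0, size=200_000)
m1 = X.mean()                       # sample estimate of E{X}
m2 = np.mean(X**2)                  # sample estimate of E{X^2}
print(m2 - m1**2)                   # matches np.var(X), close to 4
```

Any distribution with finite second moment would work equally well here.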
2. Random Processes
The mean of a random process $X(t)$ at time $t_k$ is
$$E\{X(t_k)\} = \int_{-\infty}^{\infty} x\,p_{X_k}(x)\,dx = m_X(t_k) \qquad (1.30)$$
and its autocorrelation function is
$$R_X(t_1, t_2) = E\{X(t_1)\,X(t_2)\} \qquad (1.31)$$
The Gaussian probability density function of a zero-mean noise sample $n$ with variance $\sigma^2$ is
$$p(n) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left[-\frac{1}{2}\left(\frac{n}{\sigma}\right)^2\right] \qquad (1.40)$$
White noise has a flat power spectral density,
$$G_n(f) = \frac{N_0}{2} \quad \text{watts/hertz}$$
and its autocorrelation function is the inverse Fourier transform of the PSD:
$$R_n(\tau) = \mathcal{F}^{-1}\{G_n(f)\} = \frac{N_0}{2}\,\delta(\tau)$$
The average power of white noise is therefore infinite:
$$P_n = \int_{-\infty}^{\infty} \frac{N_0}{2}\,df = \infty$$
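In a sampled system, bandlimited white Gaussian noise appears as independent zero-mean Gaussian samples whose variance plays the role of N0/2. A minimal sketch, with an assumed illustrative value of N0:

```python
import numpy as np

# Sketch: zero-mean white Gaussian noise samples with variance sigma^2 = N0/2,
# consistent with the Gaussian pdf (1.40).  N0 is an assumed value.
rng = np.random.default_rng(3)
N0 = 2.0
sigma = np.sqrt(N0 / 2)
noise = sigma * rng.standard_normal(500_000)
print(noise.mean())                 # ~ 0 (zero mean)
print(noise.var())                  # ~ N0/2 = 1.0
```

Because successive samples are independent, their sample autocorrelation is negligible at nonzero lags, the discrete analogue of the delta-function autocorrelation above.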
The output $y(t)$ of a linear time-invariant system is the convolution of the input $x(t)$ with the impulse response $h(t)$:
$$y(t) = x(t) * h(t) = \int_{-\infty}^{\infty} x(\tau)\,h(t-\tau)\,d\tau = \int_{0}^{\infty} x(\tau)\,h(t-\tau)\,d\tau$$
(the last form applies when the input is zero for $t < 0$). In the frequency domain, $Y(f) = X(f)\,H(f)$, where the frequency transfer function is
$$H(f) = |H(f)|\,e^{j\theta(f)} \qquad (1.50)$$
$$\theta(f) = \tan^{-1}\frac{\operatorname{Im}\{H(f)\}}{\operatorname{Re}\{H(f)\}} \qquad (1.51)$$
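The time-domain and frequency-domain descriptions can be checked against each other numerically: linear convolution of two sequences equals the inverse DFT of the product of their zero-padded DFTs (the sequences and lengths below are arbitrary).

```python
import numpy as np

# Numerical check of y(t) = x(t)*h(t)  <->  Y(f) = X(f) H(f):
# direct convolution vs. multiplication of zero-padded DFTs.
rng = np.random.default_rng(4)
x = rng.standard_normal(64)                  # arbitrary input sequence
h = rng.standard_normal(16)                  # arbitrary impulse response
y_direct = np.convolve(x, h)                 # time-domain convolution
L = len(x) + len(h) - 1                      # length of the linear convolution
y_fft = np.fft.irfft(np.fft.rfft(x, L) * np.fft.rfft(h, L), L)
print(bool(np.allclose(y_direct, y_fft)))    # True
```

Zero-padding to the full output length is what makes the circular DFT product reproduce the linear convolution exactly.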
If a random process forms the input to a time-invariant linear system, the output will also be a random process, with power spectral density
$$G_Y(f) = G_X(f)\,|H(f)|^2 \qquad (1.53)$$
The output signal from an ideal transmission line may have some time delay and a different amplitude than the input, but it must have no distortion; it must have the same shape as the input.
For ideal distortionless transmission:
$$y(t) = K\,x(t - t_0) \qquad (1.54)$$
$$Y(f) = K\,X(f)\,e^{-j2\pi f t_0} \qquad (1.55)$$
$$H(f) = K\,e^{-j2\pi f t_0} \qquad (1.56)$$
The group delay of the system is
$$\tau(f) = -\frac{1}{2\pi}\frac{d\theta(f)}{df}$$
An ideal low-pass filter has transfer function
$$H(f) = |H(f)|\,e^{j\theta(f)} \qquad (1.58)$$
where
$$|H(f)| = \begin{cases} 1 & \text{for } |f| < f_u \\ 0 & \text{for } |f| \ge f_u \end{cases} \qquad (1.59)$$
$$e^{j\theta(f)} = e^{-j2\pi f t_0} \qquad (1.60)$$
Ideal Filters
The impulse response of the ideal low-pass filter is
$$h(t) = \mathcal{F}^{-1}\{H(f)\} = \int_{-\infty}^{\infty} H(f)\,e^{j2\pi f t}\,df
= \int_{-f_u}^{f_u} e^{-j2\pi f t_0}\,e^{j2\pi f t}\,df
= \int_{-f_u}^{f_u} e^{j2\pi f (t - t_0)}\,df$$
$$= 2 f_u\,\frac{\sin 2\pi f_u (t - t_0)}{2\pi f_u (t - t_0)} = 2 f_u\,\operatorname{sinc}\,2 f_u (t - t_0)$$
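The closed-form sinc result can be confirmed by integrating H(f) numerically over the passband; the cutoff and delay values below are assumed, illustrative choices.

```python
import numpy as np

# Numerical check of the impulse response above: integrate
# H(f) = exp(-j 2 pi f t0) over |f| < fu by the trapezoidal rule and
# compare with 2 fu sinc(2 fu (t - t0)).  fu and t0 are assumed values.
fu, t0 = 100.0, 0.01
f = np.linspace(-fu, fu, 200_001)
df = f[1] - f[0]
t = np.linspace(-0.02, 0.04, 7)                      # a few test instants
g = np.exp(1j * 2 * np.pi * np.outer(t - t0, f))     # integrand per instant
h_num = (g.sum(axis=1) - 0.5 * (g[:, 0] + g[:, -1])).real * df
h_closed = 2 * fu * np.sinc(2 * fu * (t - t0))       # np.sinc(x) = sin(pi x)/(pi x)
print(np.max(np.abs(h_num - h_closed)))              # ~ 0
```

The peak value 2fu occurs at t = t0, and the response is nonzero for t < 0 when t0 is small, which is why the ideal filter is noncausal and unrealizable.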
Realizable Filters
A simple example of a realizable low-pass filter is the RC filter:
$$H(f) = \frac{1}{1 + j2\pi f RC} = \frac{e^{j\theta(f)}}{\sqrt{1 + (2\pi f RC)^2}}$$
Figure 1.13: Phase characteristic of the RC filter.
A widely used family of realizable filters is the Butterworth filter, with magnitude response
$$|H_n(f)| = \frac{1}{\sqrt{1 + (f/f_u)^{2n}}}, \qquad n \ge 1 \qquad (1.65)$$
Butterworth filters are popular because they are the best approximation to the ideal, in the sense of maximal flatness in the filter passband.
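Evaluating (1.65) for a few orders shows the approach to the ideal brick-wall response; note that every order passes through the same half-power point |Hn(fu)| = 1/√2 at the cutoff (the cutoff value and frequency grid below are illustrative).

```python
import numpy as np

# Magnitude response (1.65) for several orders n: as n grows the response
# approaches the ideal brick wall, and every order gives the half-power
# value 1/sqrt(2) at the cutoff frequency fu.
fu = 1.0
f = np.linspace(0.0, 3.0, 301)
for n in (1, 2, 8):
    H = 1.0 / np.sqrt(1.0 + (f / fu) ** (2 * n))
    print(n, round(float(H[100]), 4))      # 0.7071 at f = fu for every n
```

Increasing n steepens the transition band while keeping the passband maximally flat, which is the trade-off the slide describes.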
Theorems of communication and information theory are based on the assumption of strictly bandlimited channels. However, the mathematical description of a real signal does not permit the signal to be both strictly duration limited and strictly bandlimited.
$$G_x(f) = T\left(\frac{\sin \pi (f - f_c) T}{\pi (f - f_c) T}\right)^2 \qquad (1.73)$$