COM5120 Communications Theory
Chapter 2: Deterministic and Random Signal Analysis

Jen-Ming Wu
Inst. of Communications Engineering, Dept. of Electrical Engineering
National Tsing Hua University
Email: jmwu@ee.nthu.edu.tw
Fall, 2017

NTHU COM5120 - Communications Theory
Outline
• Deterministic Signal Analysis
  - Fourier and Hilbert analysis
  - Bandpass and lowpass signal representation

Fourier Series
For a periodic signal x(t) with period T_0 satisfying \int_{-\infty}^{\infty} |x_0(t)|^2 dt < \infty,

    x(t) = \sum_{n=-\infty}^{\infty} x_0(t - nT_0) = \sum_{n=-\infty}^{\infty} c_n \exp(j 2\pi n f_0 t)

where

    c_n = \frac{1}{T_0} \int_{-\infty}^{\infty} x_0(t) \exp(-j 2\pi n f_0 t)\, dt, \quad n = 0, \pm 1, \pm 2, \ldots

• Please check the book for the F.T. pairs and properties.
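The coefficient formula above can be checked numerically. A minimal Python sketch, assuming a 50% duty-cycle square wave with period T_0 = 1 as the example signal (the signal choice and the helper `fourier_coeff` are our own, not from the slides):

```python
import numpy as np

# Numerically evaluate c_n = (1/T0) * int x0(t) exp(-j 2 pi n f0 t) dt for a
# 50% duty-cycle square wave with period T0 = 1 (an assumed example signal).
T0 = 1.0
f0 = 1.0 / T0
t = np.linspace(0.0, T0, 10_000, endpoint=False)
x0 = np.where(t < T0 / 2, 1.0, 0.0)     # one period of the square wave

def fourier_coeff(n):
    """Riemann-sum approximation of the c_n integral."""
    dt = t[1] - t[0]
    return np.sum(x0 * np.exp(-1j * 2 * np.pi * n * f0 * t)) * dt / T0

c0 = fourier_coeff(0)    # DC term: the mean value of x0
c1 = fourier_coeff(1)    # fundamental: |c1| = 1/pi for this square wave
print(c0.real, abs(c1))
```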
Fourier Transform of Periodic Signals
• Given a periodic signal x(t) with period T_0 satisfying \int_{-\infty}^{\infty} |x_0(t)|^2 dt < \infty,

    x(t) = \sum_{n=-\infty}^{\infty} x_0(t - nT_0) = x_0(t) * \sum_{n=-\infty}^{\infty} \delta(t - nT_0)

    \sum_{n=-\infty}^{\infty} \delta(t - nT_0) \;\leftrightarrow\; f_0 \sum_{n=-\infty}^{\infty} \delta(f - n f_0)

then

    X(f) = X_0(f) \cdot F\Big\{ \sum_{n=-\infty}^{\infty} \delta(t - nT_0) \Big\}
         = X_0(f) \cdot f_0 \sum_{n=-\infty}^{\infty} \delta(f - n f_0)
         = f_0 \sum_{n=-\infty}^{\infty} X_0(n f_0)\, \delta(f - n f_0)

where X_0(f) = \int_{-\infty}^{\infty} x_0(t) \exp(-j 2\pi f t)\, dt
Fourier Transform of Periodic Signals
• For the periodic signal x(t) with period T_0 satisfying \int_{-\infty}^{\infty} |x_0(t)|^2 dt < \infty,

    x(t) = \sum_{n=-\infty}^{\infty} x_0(t - nT_0) = x_0(t) * \sum_{n=-\infty}^{\infty} \delta(t - nT_0)

    X(f) = X_0(f) \cdot f_0 \sum_{n=-\infty}^{\infty} \delta(f - n f_0)
         = f_0 \sum_{n=-\infty}^{\infty} X_0(n f_0)\, \delta(f - n f_0)

x(t) is a periodic signal, while X(f) is a discrete (line) spectrum.
[Figure: x(t) periodic in t with period T_0; X(f) impulses at f = n f_0 (\ldots, -f_0, f_0, 2f_0, 3f_0, \ldots)]
Hilbert Transform
• Hilbert transform

    \hat{x}(t) = \frac{1}{\pi} \int_{-\infty}^{\infty} \frac{x(\tau)}{t - \tau}\, d\tau = x(t) * \frac{1}{\pi t}

• Inverse Hilbert transform

    x(t) = -\frac{1}{\pi} \int_{-\infty}^{\infty} \frac{\hat{x}(\tau)}{t - \tau}\, d\tau = -\hat{x}(t) * \frac{1}{\pi t}

• F.T. versus H.T. in communication systems?
  The F.T. is used for evaluating the frequency content and provides the mathematical basis for analyzing frequency response. The H.T. shifts the phase of a signal, so that the phase shift between signals can be utilized to separate signals in communication systems.
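Since the Hilbert transform multiplies X(f) by -j sgn(f), it can be sketched with an FFT. A minimal Python check (the tone frequency and window length are arbitrary choices): applied to a cosine, the transform should return the corresponding sine.

```python
import numpy as np

# FFT-based Hilbert transform: X_hat(f) = -j * sgn(f) * X(f).
# Applied to cos(2*pi*f*t) this yields sin(2*pi*f*t), a -90 degree shift.
N = 1024
t = np.arange(N) / N
x = np.cos(2 * np.pi * 8 * t)          # 8 full cycles over the window

X = np.fft.fft(x)
f = np.fft.fftfreq(N)                  # sign(0) = 0 handles the DC bin
x_hat = np.fft.ifft(-1j * np.sign(f) * X).real

err = np.max(np.abs(x_hat - np.sin(2 * np.pi * 8 * t)))
print(err)
```

(For comparison, scipy.signal.hilbert returns the analytic signal x(t) + j\hat{x}(t), not \hat{x}(t) itself.)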
Hilbert Transform
• The Fourier Transform of H.T. pairs

    x(t) \leftrightarrow X(f)
    \hat{x}(t) \leftrightarrow \hat{X}(f) = -j\, \mathrm{sgn}(f)\, X(f)

[Figure: spectra of X(f) and \hat{X}(f) on -W < f < W, taking values \pm X(0) at f = 0]
Pre-envelopes
• Motivation
  Q1: How can we represent the positive/negative frequency parts of a signal in the time domain?
  Q2: How can we modify the frequency content of a real-valued signal x(t)? For example, how do we eliminate the negative frequency components of a signal in the time domain?

• Pre-envelope for positive frequency:

    x_+(t) = \frac{1}{2} x(t) + \frac{j}{2} \hat{x}(t)

    X_+(f) = \frac{1}{2} X(f) + \frac{1}{2} \mathrm{sgn}(f) X(f)
           = \begin{cases} X(f), & f > 0 \\ \frac{1}{2} X(0), & f = 0 \\ 0, & f < 0 \end{cases}

[Figure: X_+(f) equals X(f) on 0 < f < W and vanishes for f < 0]
Pre-envelopes
• Pre-envelope for negative frequency:

    x_-(t) = \frac{1}{2} x(t) - \frac{j}{2} \hat{x}(t)

    X_-(f) = \frac{1}{2} X(f) - \frac{1}{2} \mathrm{sgn}(f) X(f)
           = \begin{cases} 0, & f > 0 \\ \frac{1}{2} X(0), & f = 0 \\ X(f), & f < 0 \end{cases}

[Figure: X_-(f) equals X(f) on -W < f < 0 and vanishes for f > 0]

• The pre-envelope signal represents the positive or negative frequencies of a signal and is usually complex in the time domain.
Pre-envelopes
• Properties of the pre-envelope signal

  (1) x_+(t) \leftrightarrow X_+(f): positive frequency part of X(f)
      x_-(t) \leftrightarrow X_-(f): negative frequency part of X(f)

  (2) x(t) = x_+(t) + x_-(t)
      X(f) = X_+(f) + X_-(f)
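For a real x(t) the two pre-envelopes are conjugates, x_-(t) = x_+^*(t), so property (2) gives x(t) = 2 Re{x_+(t)}. A minimal FFT sketch in Python (the test tone is an arbitrary choice of ours): build x_+ by keeping only the positive-frequency bins, then recover x.

```python
import numpy as np

# Build the pre-envelope x_plus by zeroing the negative-frequency FFT bins
# and halving the f = 0 (and Nyquist) bins, then check x = 2*Re{x_plus}.
N = 1024
t = np.arange(N) / N
x = np.cos(2 * np.pi * 16 * t)         # arbitrary real test tone

X = np.fft.fft(x)
Xp = np.zeros_like(X)
Xp[0] = 0.5 * X[0]                     # f = 0 keeps half the weight
Xp[1:N // 2] = X[1:N // 2]             # positive frequencies kept
Xp[N // 2] = 0.5 * X[N // 2]           # Nyquist bin split evenly
x_plus = np.fft.ifft(Xp)

err = np.max(np.abs(2 * x_plus.real - x))
print(err)
```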
Complex Envelope and Band-pass Signal
• Given the real-valued symmetric band-pass signal

    X(f) = X_+(f) + X_-(f) = X_+(f) + X_+^*(-f)
Outline
• Deterministic Signal Analysis
  - Fourier and Hilbert analysis
  - Bandpass and lowpass signal representation

Expectation:

    E[g(X)] = \int g(x) f_X(x)\, dx \quad \text{or} \quad \sum_x g(x) P_X(x)

  When g(X) = X, then E[g(X)] = E[X] = m_X (mean).
  When g(X) = (X - m_X)^2, then E[(X - m_X)^2] = \sigma_X^2 (variance).
Properties
3. Moment generating function

    g(X) = e^{sX}, \quad \forall s
    E[g(X)] = E[e^{sX}] \triangleq \psi_X(s) \quad \text{(moment generating function)}

    \frac{d}{ds} \psi_X(s) \Big|_{s=0} = E[X] = m_X \quad \text{(first moment)}

    \frac{d^2}{ds^2} \psi_X(s) \Big|_{s=0} = E[X^2] = \sigma_X^2 + m_X^2 \quad \text{(second moment)}
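The two moment identities above can be checked by Monte Carlo: estimate \psi_X(s) = E[e^{sX}] from samples and differentiate numerically at s = 0. A Python sketch (the Gaussian example distribution and the step size h are assumptions of ours, not from the slides):

```python
import numpy as np

# Empirical MGF psi(s) = E[exp(s*X)], then central finite differences:
# psi'(0) ~ E[X] = m_X and psi''(0) ~ E[X^2] = sigma_X^2 + m_X^2.
rng = np.random.default_rng(0)
m, sigma = 1.5, 0.5                     # assumed example distribution
x = rng.normal(m, sigma, 1_000_000)

def psi(s):
    return np.mean(np.exp(s * x))       # empirical MGF

h = 1e-3
first = (psi(h) - psi(-h)) / (2 * h)                # ~ m_X = 1.5
second = (psi(h) - 2 * psi(0) + psi(-h)) / h**2     # ~ sigma^2 + m^2 = 2.5
print(first, second)
```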
Bounds on tail probability
(1) Markov Inequality
  Given a non-negative r.v. X and a > 0, a \in \mathbb{R}^+:

    P[X \geq a] \leq \frac{E[X]}{a}

  Proof:

    E[X] = \int_0^{\infty} x f_X(x)\, dx \geq \int_a^{\infty} x f_X(x)\, dx
         \geq a \int_a^{\infty} f_X(x)\, dx = a\, P[X \geq a]
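A quick Monte Carlo sketch of the Markov inequality in Python. The exponential distribution with E[X] = 1 is an assumed example of ours (any non-negative r.v. would do), as are the threshold levels:

```python
import numpy as np

# Check P[X >= a] <= E[X]/a for a non-negative r.v. (here Exp(1)).
rng = np.random.default_rng(1)
x = rng.exponential(1.0, 1_000_000)

levels = (1.0, 2.0, 5.0)
tails = {a: np.mean(x >= a) for a in levels}       # empirical P[X >= a]
bounds = {a: np.mean(x) / a for a in levels}       # Markov bound E[X]/a
for a in levels:
    print(a, tails[a], bounds[a])
```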
Bounds on tail probability
(2) Chernov Bound

    P[X \geq a] \leq e^{-sa}\, \psi_X(s), \quad \forall a > E[X], \; s > 0

  Proof: For s > 0,

    P[X \geq a] = P[e^{sX} \geq e^{sa}]

  By the Markov inequality,

    P[e^{sX} \geq e^{sa}] \leq \frac{E[e^{sX}]}{e^{sa}} = e^{-sa}\, \psi_X(s), \quad \forall a > E[X], \; \forall s > 0

  Note: The Chernov bound is a function of s. There exists an optimal value of s that minimizes the Chernov bound, which depends on the distribution function of X.
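The note above can be illustrated numerically. For an exponential r.v. with E[X] = 1 (an assumed example of ours, with \psi_X(s) = 1/(1-s) for s < 1), sweep s and locate the minimizing value, which analytically is s = 1 - 1/a:

```python
import numpy as np

# Chernov bound e^{-s*a} * psi_X(s) for Exp(1), psi_X(s) = 1/(1-s), s < 1.
# The bound depends on s; the best choice here is s = 1 - 1/a.
a = 5.0
s = np.linspace(0.05, 0.95, 19)         # grid of candidate s values
bounds = np.exp(-s * a) / (1.0 - s)
s_opt = s[np.argmin(bounds)]            # expect s_opt = 1 - 1/a = 0.8
true_tail = np.exp(-a)                  # exact P[X >= a] for Exp(1)
print(s_opt, bounds.min(), true_tail)
```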
Some useful random variables
(1) Bernoulli random variable X

    P[X = 1] = p, \quad P[X = 0] = 1 - p
    m_X = E[X] = p
    \sigma_X^2 = E[(X - m_X)^2] = p(1 - p)

(2) Binomial random variable X = sum of n indep. Bernoulli r.v.s = \sum_{i=1}^{n} X_i

    P[X = k] = \binom{n}{k} p^k (1 - p)^{n-k}, \quad k = 0, 1, \ldots, n
    m_X = np; \quad \sigma_X^2 = np(1 - p)
Some useful random variables
(3) Gaussian random variable X
  X is a Gaussian r.v. if

    f_X(x) = \frac{1}{\sqrt{2\pi \sigma_X^2}} \exp\left( -\frac{(x - m_X)^2}{2 \sigma_X^2} \right),

  denoted by X ~ N(m_X, \sigma_X^2)
Let X ~ N(0, 1) and define the Q-function

    Q(x) \triangleq 1 - F_X(x) = P[X \geq x] = \int_x^{\infty} \frac{1}{\sqrt{2\pi}} e^{-t^2/2}\, dt

  = the tail probability of the standard Gaussian distribution.

Note:

    \mathrm{erfc}(x) \triangleq \frac{2}{\sqrt{\pi}} \int_x^{\infty} e^{-t^2}\, dt, \qquad Q(x) = \frac{1}{2}\, \mathrm{erfc}\left( \frac{x}{\sqrt{2}} \right)

Let X ~ N(0, \sigma_X^2); then

    P[X \geq x] = \int_x^{\infty} \frac{1}{\sqrt{2\pi}\, \sigma_X} e^{-t^2/(2\sigma_X^2)}\, dt = Q\left( \frac{x}{\sigma_X} \right)
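The erfc identity above is easy to verify against a direct numerical integration of the Gaussian tail. A Python sketch (the integration cutoff and step count are arbitrary choices of ours):

```python
import math

# Q(x) = 0.5 * erfc(x / sqrt(2)), checked against a midpoint Riemann sum
# of the standard Gaussian density over [x, 12].
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def Q_numeric(x, upper=12.0, n=200_000):
    dt = (upper - x) / n
    total = 0.0
    for i in range(n):
        t = x + (i + 0.5) * dt
        total += math.exp(-t * t / 2.0) * dt
    return total / math.sqrt(2.0 * math.pi)

print(Q(1.0), Q_numeric(1.0))
```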
Chernov bound of Gaussian random variable

    Q(x) \leq \exp\left( -\frac{x^2}{2} \right)

Proof: By the Chernov bound,

    Q(x) \leq e^{-sx}\, \psi_X(s) = e^{-sx}\, E[e^{sX}], \quad \forall s

The tightest bound (i.e. the minimum) of the Chernov bound occurs when

    \frac{\partial}{\partial s} \left\{ e^{-sx} E[e^{sX}] \right\} = 0
    \;\Rightarrow\; e^{-sx} E[X e^{sX}] - x\, e^{-sx} E[e^{sX}] = 0
    \;\Rightarrow\; E[X e^{sX}] = x\, E[e^{sX}]
where

  (i)  E[e^{sX}] = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{st}\, e^{-t^2/2}\, dt
               = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{(t - s)^2}{2} + \frac{s^2}{2} \right) dt = e^{s^2/2}

  (ii) E[X e^{sX}] = \frac{d}{ds} E[e^{sX}] = s\, e^{s^2/2}

    E[X e^{sX}] - x\, E[e^{sX}] = 0 \;\Rightarrow\; s\, e^{s^2/2} - x\, e^{s^2/2} = 0 \;\Rightarrow\; s = x

The Chernov bound at s = x is then

    Q(x) \leq e^{-sx}\, E[e^{sX}] = e^{-x^2}\, e^{x^2/2} = e^{-x^2/2}, \quad Q.E.D.
Tighter than the Chernov bound?
• It can be shown that

    Q(x) \leq \frac{1}{2} \exp\left( -\frac{x^2}{2} \right)

Proof: Given X ~ N(0, 1), then

    (Q(x))^2 = \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-u^2/2}\, du \cdot \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-v^2/2}\, dv
             = \frac{1}{2\pi} \int_x^{\infty} \int_x^{\infty} e^{-(u^2 + v^2)/2}\, du\, dv

Let r^2 = u^2 + v^2, \theta = \tan^{-1}(u/v). The square \{u \geq x, v \geq x\} lies inside the quarter-annulus \{r \geq \sqrt{2}\, x, \; 0 \leq \theta \leq \pi/2\}, so

    (Q(x))^2 \leq \frac{1}{2\pi} \int_{\theta=0}^{\pi/2} \int_{r=\sqrt{2}x}^{\infty} e^{-r^2/2}\, r\, dr\, d\theta
             = \frac{1}{2\pi} \cdot \frac{\pi}{2} \cdot e^{-x^2} = \frac{e^{-x^2}}{4}

    \Rightarrow\; Q(x) \leq \frac{1}{2} e^{-x^2/2}
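Both Gaussian tail bounds can be compared against the exact Q(x) computed via erfc. A small Python sketch (the x values are arbitrary test points of ours):

```python
import math

# Compare exact Q(x) with the Chernov bound exp(-x^2/2) and the tighter
# bound 0.5*exp(-x^2/2).
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2.0))

rows = {x: (Q(x), 0.5 * math.exp(-x * x / 2.0), math.exp(-x * x / 2.0))
        for x in (1.0, 2.0, 3.0)}
for x, (q, tight, chernov) in rows.items():
    print(x, q, tight, chernov)
```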
Some useful random variables
(4) Complex random variable X = X_1 + j X_2
  Let X_1 and X_2 be jointly Gaussian r.v.s with joint prob. density f(X_1, X_2).

    m_X = m_{X_1} + j\, m_{X_2}; \quad \sigma_X^2 = E[(X - m_X)(X - m_X)^*]

  If X_1 and X_2 are i.i.d. (independent and identically distributed) Gaussian r.v.s with \sigma_{X_1}^2 = \sigma_{X_2}^2 (e.g. N_0/2), then

    \sigma_X^2 = \sigma_{X_1}^2 + \sigma_{X_2}^2

  In this case, X is called circularly symmetric complex Gaussian, or X ~ CN(m_X, \sigma_X^2).
Some useful random variables
(5) Rayleigh distributed random variable
  If X_1 and X_2 are i.i.d. Gaussian r.v.s with X_1, X_2 ~ N(0, \sigma^2), then X = \sqrt{X_1^2 + X_2^2} is a Rayleigh r.v. with PDF

    f_X(x) = \frac{x}{\sigma^2}\, e^{-x^2/(2\sigma^2)}, \quad x > 0

    m_X = \sigma \sqrt{\frac{\pi}{2}}; \quad \sigma_X^2 = \left( 2 - \frac{\pi}{2} \right) \sigma^2

  Example: Let Y = X_1 + j X_2, which is a complex Gaussian random variable. If X = |Y|, then X = \sqrt{X_1^2 + X_2^2} is a Rayleigh distributed r.v.
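The Rayleigh moments above can be checked by Monte Carlo on the magnitude of a complex Gaussian. A Python sketch (sigma = 2 and the sample count are arbitrary choices of ours):

```python
import numpy as np

# |Y| for Y = X1 + j*X2 with X1, X2 i.i.d. N(0, sigma^2) is Rayleigh with
# mean sigma*sqrt(pi/2) and second moment E[X^2] = 2*sigma^2.
rng = np.random.default_rng(2)
sigma = 2.0
x1 = rng.normal(0.0, sigma, 1_000_000)
x2 = rng.normal(0.0, sigma, 1_000_000)
r = np.abs(x1 + 1j * x2)               # = sqrt(x1^2 + x2^2)

mean_hat = r.mean()                    # ~ sigma*sqrt(pi/2)
power_hat = np.mean(r**2)              # ~ 2*sigma^2 = 8
print(mean_hat, power_hat)
```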
Some useful random variables
(6) Rician distributed random variable
  If X_1 and X_2 are independent Gaussian r.v.s with means m_{X_1}, m_{X_2}, common variance \sigma^2, and \mu_X^2 \triangleq m_{X_1}^2 + m_{X_2}^2, then X = \sqrt{X_1^2 + X_2^2} is a Rician r.v. with PDF

    f_X(x) = \frac{x}{\sigma^2} \exp\left( -\frac{x^2 + \mu_X^2}{2\sigma^2} \right) I_0\left( \frac{\mu_X x}{\sigma^2} \right), \quad x > 0

    m_X = \sigma \sqrt{\frac{\pi}{2}}\, e^{-K/2} \left[ (1 + K)\, I_0\left( \frac{K}{2} \right) + K\, I_1\left( \frac{K}{2} \right) \right];
    \quad E[X^2] = 2\sigma^2 + \mu_X^2, \quad \text{where } K = \frac{\mu_X^2}{2\sigma^2}

  Example: Let Y = X_1 + j X_2, which is a complex Gaussian random variable. If X = |Y|, then X = \sqrt{X_1^2 + X_2^2} is a Rician distributed r.v.
Some useful random variables
(7) Chi-square (\chi^2) distributed random variable
  If X_i, i = 1, 2, \ldots, n are i.i.d. Gaussian r.v.s with X_i ~ N(0, \sigma^2), then X = X_1^2 + X_2^2 + \cdots + X_n^2 is chi-square distributed with n degrees of freedom:

    f_X(x) = \frac{1}{\sigma^n\, 2^{n/2}\, \Gamma(n/2)}\, x^{n/2 - 1}\, e^{-x/(2\sigma^2)}, \quad x > 0

  where \Gamma(x) = \int_0^{\infty} t^{x-1} e^{-t}\, dt is the gamma function.

  Special case: n = 2, X = X_1^2 + X_2^2 is the power of the complex Gaussian r.v. Y = X_1 + j X_2.
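The n = 2 special case can be checked directly: with \sigma = 1 the power X_1^2 + X_2^2 is exponential with mean 2\sigma^2 = 2, so P[X > a] = e^{-a/2}. A Monte Carlo sketch in Python (sample count and test threshold are arbitrary choices of ours):

```python
import numpy as np

# Chi-square with 2 degrees of freedom = exponential with mean 2*sigma^2:
# the power of the complex Gaussian Y = X1 + j*X2.
rng = np.random.default_rng(3)
sigma = 1.0
x1 = rng.normal(0.0, sigma, 1_000_000)
x2 = rng.normal(0.0, sigma, 1_000_000)
p = x1**2 + x2**2

mean_hat = p.mean()                    # ~ 2*sigma^2 = 2
tail_hat = np.mean(p > 2.0)            # ~ exp(-2/2) = exp(-1)
print(mean_hat, tail_hat)
```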
Random Vector
X = [X_1, X_2, \ldots, X_N]^T is a finite collection of r.v.s. For a Gaussian random vector,

    f_X(x) = \frac{1}{(2\pi)^{N/2}\, |C_X|^{1/2}} \exp\left( -\frac{1}{2} (x - m_X)^T C_X^{-1} (x - m_X) \right)

where m_X = E[X], C_X is the covariance matrix, and |C_X| = determinant of C_X.
Random Process (r.p.)
A r.p. {X(t)} is an infinite collection of r.v.s with a mapping onto a functional space.
For fixed t_0, X(t_0) is a random variable.
For a fixed sample point s_i, X_i(t) is a sample function.
Statistical Properties
Remarks
We say two random processes X(t) and Y(t) are
(1) Uncorrelated if

    E[X(t_1) Y(t_2)] = E[X(t_1)]\, E[Y(t_2)] = m_X(t_1)\, m_Y(t_2), \quad \forall t_1, t_2

(2) Orthogonal if

    E[X(t_1) Y(t_2)] = 0, \quad \forall t_1, t_2

(3) Independent if

    F_{XY}(x_1 \ldots x_n, y_1 \ldots y_n) = F_X(x_1 \ldots x_n)\, F_Y(y_1 \ldots y_n)
    \text{or } f_{XY}(x_1 \ldots x_n, y_1 \ldots y_n) = f_X(x_1 \ldots x_n)\, f_Y(y_1 \ldots y_n), \quad \forall n
Stationary random process
(1) Strict Sense Stationary (SSS)
  For \forall n, the r.v.s \{X(t_1), \ldots, X(t_n)\} and \{X(t_1 + \tau), \ldots, X(t_n + \tau)\} are identically distributed, i.e.

    f_{X(t_1) \ldots X(t_n)}(x_1 \ldots x_n) = f_{X(t_1+\tau) \ldots X(t_n+\tau)}(x_1 \ldots x_n), \quad \forall n

  Q: What is the physical meaning of stationary?
  A: A stationary process means that the observed statistics of X(t) are shift-invariant.

  SSS concerns all higher-order statistics and is generally difficult to prove.
Stationary random process
(2) Wide Sense Stationary (WSS)
  A r.p. is said to be WSS if
  (i) Mean: m_X(t) = const.
  (ii) Autocorrelation: a function of the time difference t_1 - t_2 only:

    R_X(t_1, t_2) = E[X(t_1) X^*(t_2)] = E[X(t_1 + \tau) X^*(t_2 + \tau)] = R_X(t_1 - t_2)

  WSS concerns only the 1st- and 2nd-order statistics.

  Q: Why should we care about WSS?
  A: The signal amplitude and power are what we can typically observe.
Stationary random process
(3) Cyclo-Stationary (Cyclo-S)
  X(t) is cyclo-stationary with period T if

    f_{X(t_1) \ldots X(t_n)}(x_1 \ldots x_n) = f_{X(t_1+T) \ldots X(t_n+T)}(x_1 \ldots x_n), \quad \forall n, \forall (t_1, \ldots, t_n)

  The statistical properties are periodic.

(4) Wide-Sense Cyclo-Stationary (WSCS)
  A r.p. X(t) is WSCS if the mean and autocorrelation functions are periodic:
  (i)  E[X(t)] = E[X(t + kT)], \quad \forall k \in \mathbb{Z}
  (ii) R_X(t_1, t_2) = R_X(t_1 + kT, t_2 + kT), \quad \forall k \in \mathbb{Z}, \forall t_1, t_2
Ex. Let s(t) be a finite-energy signal pulse and the discrete r.p. b_k be WSS with mean m_b and autocorrelation depending on n only, R_b(n) = E[b_{k+n} b_k^*]. Then the transmitted waveform

    X(t) = \sum_{k=-\infty}^{\infty} b_k\, s(t - kT)

is WS cyclo-stationary.

Proof:
(i) The 1st-order statistics:

    m_X(t) = E[X(t)] = \sum_k E[b_k]\, s(t - kT) = m_b \sum_k s(t - kT)
           = m_b \sum_k s(t + T - kT) = m_X(t + T)
(ii) The 2nd-order statistics:

    R_X(t_1, t_2) = E[X(t_1) X^*(t_2)]
                  = E\Big[ \sum_k \sum_l b_k b_l^*\, s(t_1 - kT)\, s^*(t_2 - lT) \Big]
                  = \sum_k \sum_l R_b(k - l)\, s(t_1 - kT)\, s^*(t_2 - lT)
                  = \sum_k \sum_l R_b\big( (k-1) - (l-1) \big)\, s(t_1 + T - kT)\, s^*(t_2 + T - lT)
                  = R_X(t_1 + T, t_2 + T)

Recall for a random vector

    m_X = E[X], \qquad C_X = E\big[ (X - m_X)(X - m_X)^T \big]

For a Gaussian r.p., f_X(x_1, \ldots, x_n) is characterized by m_X(t) and R_X(t_1, t_2), \forall t_1, t_2.
Gaussian Random Process
Def. X(t) is a complex Gaussian r.p. if \forall n, \forall t_1, \ldots, t_n, the samples X(t_1), \ldots, X(t_n) are jointly Gaussian complex random variables with joint PDF

    f_X(x_1, \ldots, x_n) = \frac{1}{\pi^n |\Lambda|} \exp\big( -(x - m_X)^H \Lambda^{-1} (x - m_X) \big)

f_X(x_1, \ldots, x_n) is characterized by m_X(t) and R_{XX^*}(t_1, t_2) = E[X(t_1) X^*(t_2)].

Note: A WSS real Gaussian r.p. X(t), i.e. with m_X(t) = const. and R_X(t_1, t_2) = R_X(t_1 - t_2), is also SSS.
Ergodicity
Def. X(t) is ergodic in the mean if

    E[X(t)] = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t)\, dt = m_X

    (statistical average = time average)
Passing a WSS r.p. X(t) through an LTI system h(t)

    Y(t) = \int h(t - \tau)\, X(\tau)\, d\tau

    R_Y(t_1, t_2) = E[Y(t_1) Y^*(t_2)]
                  = E\Big[ \int h(\tau_1) X(t_1 - \tau_1)\, d\tau_1 \int h^*(\tau_2) X^*(t_2 - \tau_2)\, d\tau_2 \Big]
                  = \int d\tau_1\, h(\tau_1) \int d\tau_2\, h^*(\tau_2)\, E[X(t_1 - \tau_1) X^*(t_2 - \tau_2)]
                  = \int d\tau_1\, h(\tau_1) \int d\tau_2\, h^*(\tau_2)\, R_X(t_1 - t_2 - \tau_1 + \tau_2)
Let \tau = t_1 - t_2:

    R_Y(t_1, t_2) = \int d\tau_1\, h(\tau_1) \int d\tau_2\, h^*(\tau_2)\, R_X(\tau - \tau_1 + \tau_2)
                  = R_X(\tau) * h(\tau) * h^*(-\tau) = R_Y(\tau)

The LTI output Y(t) is still WSS!

Note: If we define S_X(f) = F\{R_X(\tau)\} and S_Y(f) = F\{R_Y(\tau)\}, then

    S_Y(f) = S_X(f)\, |H(f)|^2

S_X(f) is called the power spectral density of X(t).
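The relation S_Y(f) = S_X(f)|H(f)|^2 can be verified by Monte Carlo: pass unit-variance white noise (flat S_X = 1) through a short FIR filter and compare the averaged periodogram of the output with |H(f)|^2. A Python sketch (the filter taps, block length, and block count are arbitrary choices of ours):

```python
import numpy as np

# Averaged periodograms of filtered white noise should approach
# S_Y(f) = S_X(f) * |H(f)|^2 = |H(f)|^2 for unit-variance white input.
rng = np.random.default_rng(4)
N, blocks = 256, 2000
h = np.array([1.0, 0.5, 0.25])          # example LTI impulse response
H = np.fft.fft(h, N)

sy = np.zeros(N)
for _ in range(blocks):
    x = rng.normal(0.0, 1.0, N + len(h) - 1)
    y = np.convolve(x, h, mode="valid") # length-N steady-state output
    sy += np.abs(np.fft.fft(y))**2 / N  # periodogram of this block
sy /= blocks                            # estimate of S_Y(f)

err = np.mean(np.abs(sy - np.abs(H)**2)) / np.mean(np.abs(H)**2)
print(err)                              # small relative error
```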
Spectral Analysis
Define: The power spectral density of X(t) is

    S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\, d\tau

Properties of S_X(f):
(1) S_X(0) = \int_{-\infty}^{\infty} R_X(\tau)\, d\tau : d.c. power of X(t)
(2) Average power = E[|X(t)|^2] = R_X(0) = \int_{-\infty}^{\infty} S_X(f)\, df : area of the PSD,

where R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j 2\pi f \tau}\, df.
Serial Expansion of Random Process
(1) Sampling Theorem (for deterministic signals)
  Let the deterministic real signal x(t) be bandlimited with bandwidth W. Then x(t) can be represented by the discrete samples x[n] = x(nT) at the Nyquist rate 1/T = 2W, i.e.

    x(t) = \sum_{n=-\infty}^{\infty} x(nT)\, \mathrm{sinc}\left( \frac{t - nT}{T} \right)
         = \sum_n x\left( \frac{n}{2W} \right) \mathrm{sinc}\left( 2W \left( t - \frac{n}{2W} \right) \right)
         = \sum_n x[n]\, \phi_n(t)
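The sinc reconstruction above can be tried numerically: sample a tone below the Nyquist frequency and rebuild it between the sample points with a truncated version of the series. A Python sketch (W, the tone frequency, and the truncation length are arbitrary choices of ours; truncating the series leaves a small residual error):

```python
import numpy as np

# Truncated Shannon reconstruction x(t) = sum_n x(nT) sinc((t - nT)/T)
# for a 3 Hz tone sampled at 2W = 8 Hz (T = 1/8 s).
W = 4.0
T = 1.0 / (2.0 * W)
n = np.arange(-200, 201)                # finite slice of the series
x_n = np.cos(2 * np.pi * 3.0 * n * T)   # samples of the tone

t = np.linspace(-1.0, 1.0, 401)
x_rec = np.array([np.sum(x_n * np.sinc((tt - n * T) / T)) for tt in t])

err = np.max(np.abs(x_rec - np.cos(2 * np.pi * 3.0 * t)))
print(err)                              # small truncation error
```

(np.sinc is the normalized sinc, sin(pi x)/(pi x), matching the convention used in the series above.)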
Q: Do we have an equivalent sampling theorem for the random process? Can we represent the r.p. with a set of random variables?

For a bandlimited r.p. X(t) with S_X(f) = 0, |f| > W:

    X(t) = \sum_n X(nT)\, \mathrm{sinc}\left( 2W \left( t - \frac{n}{2W} \right) \right) = \sum_n X_n\, \phi_n(t)

where X(nT) is a sequence of r.v.s and \phi_n(t) are deterministic basis functions. The basis functions are orthonormal:

    \int_a^b |\varphi_n(t)|^2\, dt = 1, \qquad
    \int_a^b \varphi_n(t)\, \varphi_m^*(t)\, dt = \begin{cases} 1, & n = m \\ 0, & n \neq m \end{cases}
The basis \varphi_n(t) are eigenfunctions of the covariance function:

    \int_a^b C_X(t_1, t_2)\, \varphi_n(t_2)\, dt_2 = \lambda_n\, \varphi_n(t_1), \quad a < t < b

where

    C_X(t_1, t_2) = E\big\{ [X(t_1) - m_X(t_1)]\, [X(t_2) - m_X(t_2)]^* \big\}
                  = R_X(t_1, t_2) - m_X(t_1)\, m_X^*(t_2)