
Discrete-time Random Signals

Until now, we have assumed that the signals are deterministic, i.e., each value of a sequence is uniquely determined. In many situations, the processes that generate signals are so complex as to make precise description of a signal extremely difficult or undesirable. A random or stochastic signal is considered to be characterized by a set of probability density functions.

Continuous-time Random Signals

A continuous-time random signal (or random process) is a signal x(t) whose value at each time point is a random variable. Random signals appear often in real life. Examples include:

1. The noise heard from a radio receiver that is not tuned to an operating channel.
2. The noise heard from a helicopter rotor.
3. Electrical signals recorded from a human brain through electrodes in contact with the skull (electroencephalograms, or EEGs).
4. Mechanical vibrations sensed in a vehicle moving on rough terrain.
5. Angular motion of a boat at sea caused by waves and wind.
6. Television signals.
7. Radar signals.

Stochastic Processes

Random (or stochastic) process (or signal)

A random process is an indexed family of random variables characterized by a set of probability distribution functions. Here we consider a sequence x[n], $-\infty < n < \infty$, where each individual sample x[n] is assumed to be an outcome of an underlying random variable $X_n$. The difference between a single random variable and a random process is that for a random variable the outcome of a random sampling experiment is mapped into a number, whereas for a random process the outcome is mapped into a sequence.

Stochastic Processes (continued)

Probability density function of x[n]: $p(x_n, n)$. The pdf varies with the time index n.

Joint distribution of x[n] and x[m]: $p(x_n, n, x_m, m)$. This is the joint pdf over the time indices n and m.

Example: $x_1[n] = A_n \cos(\omega n + \phi_n)$, where $A_n$ and $\phi_n$ are random variables for all $-\infty < n < \infty$; then $x_1[n]$ is a random process.
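As a quick illustration of the "outcome maps to a sequence" idea, here is a minimal numpy sketch that draws a few sample sequences of such a process. For simplicity, the random amplitude and phase are drawn once per realization (in the example above they may vary with n), and all distribution choices are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = np.arange(64)
omega = 0.2 * np.pi          # fixed frequency (illustrative choice)

# Each outcome of the underlying experiment is mapped to a whole
# sequence: draw A and phi once, then they define x[n] for every n.
for trial in range(3):
    A = rng.uniform(0.5, 1.5)          # random amplitude
    phi = rng.uniform(0.0, 2 * np.pi)  # random phase
    x = A * np.cos(omega * n + phi)    # one sample sequence of the process
    print(f"realization {trial}: x[0..4] =", np.round(x[:5], 3))
```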

Independence and Stationarity

x[n] and x[m] are independent iff

$$p(x_n, n, x_m, m) = p(x_n, n)\, p(x_m, m)$$

x is a stationary process iff

$$p(x_{n+k}, n+k, x_{m+k}, m+k) = p(x_n, n, x_m, m)$$

for all k. That is, the joint distribution of x[n] and x[m] depends only on the time difference m − n.

Stationarity (continued)

In particular, the above definition also applies to the case m = n. Hence, a stationary random process must also satisfy

$$p(x_{n+k}, n+k) = p(x_n, n)$$

That is, the pdf of a stationary process does not vary with the time index n; the statistics of x[n] are shift-invariant.

Stochastic Processes vs. Deterministic Signal

In many applications of discrete-time signal processing, random processes serve as models for signals in the sense that a particular signal can be considered a sample sequence of a random process. Although such signals are unpredictable, so that a deterministic approach to signal representation is inappropriate, certain average properties of the ensemble can be determined, given the probability law of the process.

Statistics: Expectation

Mean (or average):

$$m_{x_n} = \mathcal{E}\{x_n\} = \int_{-\infty}^{\infty} x_n\, p(x_n, n)\, dx_n$$

In general the mean depends on n. Here $\mathcal{E}\{\cdot\}$ denotes the expectation operator, defined for a function g by

$$\mathcal{E}\{g(x_n)\} = \int_{-\infty}^{\infty} g(x_n)\, p(x_n, n)\, dx_n$$

For independent random variables,

$$\mathcal{E}\{x_n y_m\} = \mathcal{E}\{x_n\}\, \mathcal{E}\{y_m\}$$

Statistics: Mean Square Value and Variance

Mean squared value:

$$\mathcal{E}\{|x_n|^2\} = \int_{-\infty}^{\infty} |x_n|^2\, p(x_n, n)\, dx_n$$

Variance:

$$\operatorname{var}\{x_n\} = \mathcal{E}\{|x_n - m_{x_n}|^2\}$$

Autocorrelation and Autocovariance

Autocorrelation: the correlation between time indices n and m of a process,

$$\varphi_{xx}[n, m] = \mathcal{E}\{x_n x_m^*\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_n x_m^*\, p(x_n, n, x_m, m)\, dx_n\, dx_m$$

Autocovariance: the covariance between time indices n and m of a process,

$$\gamma_{xx}[n, m] = \mathcal{E}\{(x_n - m_{x_n})(x_m - m_{x_m})^*\} = \varphi_{xx}[n, m] - m_{x_n} m_{x_m}^*$$
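These ensemble averages can be estimated numerically by averaging over many independent realizations. A minimal numpy sketch (the random-phase process and all parameters are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
num_trials, N = 50_000, 32
omega = 0.3 * np.pi

# Ensemble of realizations: random-phase sinusoid plus a little noise.
phi = rng.uniform(0.0, 2 * np.pi, size=(num_trials, 1))
t = np.arange(N)
X = np.cos(omega * t + phi) + 0.1 * rng.standard_normal((num_trials, N))

n, m = 3, 7
# Ensemble averages run over realizations (axis 0), not over time.
# Conjugates are omitted because the signal here is real.
phi_xx = np.mean(X[:, n] * X[:, m])                    # autocorrelation
gamma_xx = np.mean((X[:, n] - X[:, n].mean()) *
                   (X[:, m] - X[:, m].mean()))         # autocovariance
print(phi_xx, gamma_xx)   # nearly equal here, since the means are ~0
```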

Stationary Process

According to the definition of a stationary process, the autocorrelation of a stationary process depends only on the time difference m − n. Hence, for a stationary process we have

$$m_x = m_{x_n} = \mathcal{E}\{x_n\} \qquad \sigma_x^2 = \mathcal{E}\{|x_n - m_x|^2\}$$

both independent of n. If we denote the time difference by k, we have

$$\varphi_{xx}[n+k, n] = \varphi_{xx}[k] = \mathcal{E}\{x_{n+k}\, x_n^*\}$$

which depends only on the time difference k.

Wide-sense Stationary

In many instances we encounter random processes that are not stationary in the strict sense that the pdf remains the same for all time; instead, only the statistics (usually up to second order) are invariant with time. Relaxing the definition accordingly, if the following equations hold we call the process wide-sense stationary (w.s.s.).

$$m_x = m_{x_n} = \mathcal{E}\{x_n\} \qquad \sigma_x^2 = \mathcal{E}\{|x_n - m_x|^2\}$$

$$\varphi_{xx}[n+k, n] = \varphi_{xx}[k] = \mathcal{E}\{x_{n+k}\, x_n^*\}$$

Example of a stationary signal

The following random signal is wide-sense stationary. As we see, a w.s.s. signal looks more or less the same over different time intervals: although its detailed form varies, its overall (or macroscopic) shape does not.

Example of a nonstationary signal

An example of a random signal that is not stationary is a seismic wave during an earthquake. As we see, the amplitude of the wave shortly before the beginning of the earthquake is small. At the start of the earthquake the amplitude grows suddenly, sustains its level for a certain time, then decays.

The magnitude of the Fourier transform of the previous w.s.s. signal segment conveys little information: taking the Fourier transform of a random signal directly is not very useful. Later, we will see the power density spectrum, which better captures the frequency-domain behavior of random signals.

Time Averages

For any single sample sequence x[n], define its time average to be

$$\langle x[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n]$$

Similarly, the time-average autocorrelation is

$$\langle x[n+m]\, x^*[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n+m]\, x^*[n]$$

Ergodic Process

The time average is defined for a single sample sequence drawn from the random process. A stationary random process for which time averages equal ensemble averages is called an ergodic process:

$$\langle x[n] \rangle = m_x \qquad \langle x[n+m]\, x^*[n] \rangle = \varphi_{xx}[m]$$

Ergodic Process (continued)

It is common to assume that a given sequence is a sample sequence of an ergodic random process, so that averages can be computed from a single sequence. In practice, we cannot compute the infinite limits; instead, we use the finite-sum estimates

$$\hat{m}_x = \frac{1}{L} \sum_{n=0}^{L-1} x[n]$$

$$\hat{\sigma}_x^2 = \frac{1}{L} \sum_{n=0}^{L-1} |x[n] - \hat{m}_x|^2$$

$$\langle x[n+m]\, x^*[n] \rangle = \frac{1}{L} \sum_{n=0}^{L-1} x[n+m]\, x^*[n]$$
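A short numpy sketch of these finite-length estimators applied to a single realization; the white-noise signal and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
L = 100_000
x = rng.standard_normal(L)          # one sample sequence, assumed ergodic

m_hat = np.mean(x)                  # estimate of the mean m_x
var_hat = np.mean((x - m_hat) ** 2) # estimate of the variance sigma_x^2

def acorr_est(x, m):
    """Finite-sum time-average autocorrelation (1/L) * sum_n x[n+m] x[n]."""
    return np.dot(x[m:], x[:len(x) - m]) / len(x)

print(m_hat, var_hat)
print([round(acorr_est(x, m), 4) for m in range(4)])  # ~[1, 0, 0, 0]
```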

Properties of w.s.s. correlation and covariance sequences

Definitions:

$$\varphi_{xx}[m] = \mathcal{E}\{x_{n+m}\, x_n^*\} \qquad \gamma_{xx}[m] = \mathcal{E}\{(x_{n+m} - m_x)(x_n - m_x)^*\}$$

$$\varphi_{xy}[m] = \mathcal{E}\{x_{n+m}\, y_n^*\} \qquad \gamma_{xy}[m] = \mathcal{E}\{(x_{n+m} - m_x)(y_n - m_y)^*\}$$

Property 1:

$$\gamma_{xx}[m] = \varphi_{xx}[m] - |m_x|^2$$

$$\gamma_{xy}[m] = \varphi_{xy}[m] - m_x m_y^*$$

Properties of correlation and covariance sequences (continued)

Property 2:

$$\varphi_{xx}[0] = \mathcal{E}\{|x_n|^2\} = \text{mean squared value}$$

$$\gamma_{xx}[0] = \sigma_x^2 = \text{variance}$$

Property 3 (conjugate symmetry):

$$\varphi_{xx}[-m] = \varphi_{xx}^*[m] \qquad \gamma_{xx}[-m] = \gamma_{xx}^*[m]$$

$$\varphi_{xy}[-m] = \varphi_{yx}^*[m] \qquad \gamma_{xy}[-m] = \gamma_{yx}^*[m]$$

Properties of correlation and covariance sequences (continued)


Property 4 (similar properties were shown in the deterministic case):

$$|\varphi_{xy}[m]|^2 \le \varphi_{xx}[0]\, \varphi_{yy}[0]$$

$$|\gamma_{xy}[m]|^2 \le \gamma_{xx}[0]\, \gamma_{yy}[0]$$

The above implies

$$|\varphi_{xx}[m]| \le \varphi_{xx}[0] \qquad |\gamma_{xx}[m]| \le \gamma_{xx}[0]$$

Properties of correlation and covariance sequences (continued)

Property 5 (shift invariance): If $y_n = x_{n-n_0}$, then

$$\varphi_{yy}[m] = \varphi_{xx}[m] \qquad \gamma_{yy}[m] = \gamma_{xx}[m]$$

Fourier Transform Representation of Random Signals

Since autocorrelation and autocovariance sequences are (aperiodic) one-dimensional sequences, their Fourier transforms exist and are bounded for well-behaved processes. Let the Fourier transforms of the autocorrelation and autocovariance sequences be

$$\varphi_{xx}[m] \leftrightarrow \Phi_{xx}(e^{j\omega}) \qquad \gamma_{xx}[m] \leftrightarrow \Gamma_{xx}(e^{j\omega})$$

$$\varphi_{xy}[m] \leftrightarrow \Phi_{xy}(e^{j\omega}) \qquad \gamma_{xy}[m] \leftrightarrow \Gamma_{xy}(e^{j\omega})$$

Fourier Transform Representation of Random Signals (continued)

Consider the inverse Fourier transforms:

$$\varphi_{xx}[m] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega$$

$$\gamma_{xx}[m] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Gamma_{xx}(e^{j\omega})\, e^{j\omega m}\, d\omega$$

Fourier Transform Representation of Random Signals (continued)

Consequently,

$$\mathcal{E}\{|x[n]|^2\} = \varphi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega$$

$$\sigma_x^2 = \gamma_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Gamma_{xx}(e^{j\omega})\, d\omega$$

Denote $P_{xx}(\omega) = \Phi_{xx}(e^{j\omega})$; this is called the power spectral density, power density spectrum (or power spectrum) of the random process x.
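To make the definition concrete, a sketch (all signal choices are assumptions for illustration): estimate the autocorrelation of one realization, sample its Fourier transform with the FFT, and check that averaging the samples of P_xx(ω) recovers φ_xx[0]:

```python
import numpy as np

rng = np.random.default_rng(3)
L, M = 1 << 16, 64
# Illustrative WSS signal: white noise through a short moving average.
x = np.convolve(rng.standard_normal(L), np.ones(4) / 4, mode="same")

# Biased autocorrelation estimate for lags 0..M, extended evenly to -M..M.
phi = np.array([np.dot(x[m:], x[:L - m]) / L for m in range(M + 1)])
phi_full = np.concatenate([phi[:0:-1], phi])

# Sample P_xx(w) = DTFT{phi_xx[m]} on a uniform frequency grid via the FFT.
Pxx = np.real(np.fft.fft(np.fft.ifftshift(phi_full)))

# (1/2pi) * integral of P_xx over [-pi, pi] equals the mean of the samples,
# which should recover phi_xx[0] = E{|x[n]|^2}.
print(Pxx.mean(), "vs", phi[0])
```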

Power Density Spectrum

Hence, we have

$$\mathcal{E}\{|x[n]|^2\} = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_{xx}(\omega)\, d\omega$$

The area under the power density over [−π, π], scaled by 1/2π, is the total average power of the signal. P_xx(ω) is always real-valued, since φ_xx[m] is conjugate symmetric. For real-valued random processes, P_xx(ω) = Φ_xx(e^{jω}) is both real and even.

Power Spectral Density

In addition, from

$$\mathcal{E}\{|x[n]|^2\} = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_{xx}(\omega)\, d\omega$$

P_xx(ω) can be treated as the density, at frequency ω, of the total average power. Integrating the density from −π to π (and scaling by 1/2π) then gives the total average power of a w.s.s. random signal. This is why P_xx(ω) is called the power spectral density.

The power spectral density of the previous w.s.s. signal segment, i.e., the Fourier transform of its autocorrelation function, looks smoother and better behaved than the direct Fourier transform.

Mean and Linear System

Consider a linear system with impulse response h[n]. If x[n] is a stationary random signal with mean m_x, then the output y[n] is also a stationary random signal with mean m_y given by

$$m_y[n] = \mathcal{E}\{y[n]\} = \sum_{k=-\infty}^{\infty} h[k]\, \mathcal{E}\{x[n-k]\} = \sum_{k=-\infty}^{\infty} h[k]\, m_x[n-k]$$

Since the input is stationary, m_x[n−k] = m_x, and consequently

$$m_y = m_x \sum_{k=-\infty}^{\infty} h[k] = H(e^{j0})\, m_x$$
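A quick numeric check of m_y = H(e^{j0}) m_x; the filter and the input statistics are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
h = np.array([0.5, 0.3, 0.2])            # impulse response of an LTI system
m_x = 2.0
x = m_x + rng.standard_normal(200_000)   # stationary input with mean m_x

y = np.convolve(x, h, mode="valid")      # output samples (steady state)
print(y.mean(), "vs", h.sum() * m_x)     # H(e^{j0}) = sum_k h[k] = 1.0
```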

Stationary and Linear System

If x[n] is a real and stationary random signal, the autocorrelation function of the output process is

$$\varphi_{yy}[n, n+m] = \mathcal{E}\{y[n]\, y[n+m]\} = \mathcal{E}\left\{\sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, x[n-k]\, x[n+m-r]\right\}$$

$$= \sum_{k=-\infty}^{\infty} h[k] \sum_{r=-\infty}^{\infty} h[r]\, \mathcal{E}\{x[n-k]\, x[n+m-r]\}$$

Since x[n] is stationary, $\mathcal{E}\{x[n-k]\, x[n+m-r]\}$ depends only on the time difference m + k − r.

Stationary and Linear System (continued)

Therefore,

$$\varphi_{yy}[n, n+m] = \sum_{k=-\infty}^{\infty} h[k] \sum_{r=-\infty}^{\infty} h[r]\, \varphi_{xx}[m+k-r] = \varphi_{yy}[m]$$

so the output autocorrelation also depends only on the time difference m. In general, for an LTI system with a wide-sense stationary input, the output is also wide-sense stationary.

Power Density Spectrum and Linear System

By substituting l = r − k,

$$\varphi_{yy}[m] = \sum_{l=-\infty}^{\infty} \varphi_{xx}[m-l] \sum_{k=-\infty}^{\infty} h[k]\, h[l+k] = \sum_{l=-\infty}^{\infty} \varphi_{xx}[m-l]\, c_{hh}[l]$$

where

$$c_{hh}[l] = \sum_{k=-\infty}^{\infty} h[k]\, h[l+k]$$

A sequence of the form of chh[l] is called a deterministic autocorrelation sequence.
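Since c_hh[l] is the correlation of h[n] with itself, for a real FIR response it can be computed as a convolution with the time-reversed sequence. A minimal sketch with an illustrative h:

```python
import numpy as np

h = np.array([1.0, 0.5, 0.25])

# c_hh[l] = sum_k h[k] h[l+k]; for real h this is h[l] convolved with h[-l].
c_hh = np.convolve(h, h[::-1])   # values at lags l = -2, -1, 0, 1, 2
print(c_hh)                      # symmetric, with peak sum(h**2) at lag 0
```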

Power Density Spectrum and Linear System (continued)

Taking the Fourier transform of the convolution above,

$$\Phi_{yy}(e^{j\omega}) = C_{hh}(e^{j\omega})\, \Phi_{xx}(e^{j\omega})$$

where $C_{hh}(e^{j\omega})$ is the Fourier transform of $c_{hh}[l]$. Recall that the correlation of a[n] and b[n] is the convolution of a[n] with b[−n]; hence for real h,

$$c_{hh}[l] = h[l] * h[-l]$$

Thus

$$C_{hh}(e^{j\omega}) = H(e^{j\omega})\, H(e^{-j\omega}) = |H(e^{j\omega})|^2$$

Power Density Spectrum and Linear System (continued)

We have the following relation between the input and output power spectra:

$$\Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})$$

$$\mathcal{E}\{|x[n]|^2\} = \varphi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega = \text{total average power of the input}$$

$$\mathcal{E}\{|y[n]|^2\} = \varphi_{yy}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega = \text{total average power of the output}$$

Power Density Property


We have seen that P_xx(ω) = Φ_xx(e^{jω}) can be viewed as a density. Key property: the area over a band of frequencies, ω_a < |ω| < ω_b, is proportional to the power in the signal in that band. To show this, we can use the linear-system property above. Let H(e^{jω}) be the frequency response of an ideal band-pass filter for the band ω_a < |ω| < ω_b. Note that |H(e^{jω})|² and Φ_xx(e^{jω}) are both even functions. Hence,

$$\varphi_{yy}[0] = \text{average power in } \omega_a < |\omega| < \omega_b = \frac{1}{2\pi} \int_{-\omega_b}^{-\omega_a} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega + \frac{1}{2\pi} \int_{\omega_a}^{\omega_b} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega$$
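One way to check this property numerically is to integrate an estimated PSD over the band. The sketch below uses scipy's Welch estimator on white noise; the signal, band edges, and scaling convention are assumptions for illustration:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(5)
x = rng.standard_normal(1 << 17)              # white noise, variance 1

# One-sided Welch PSD over [0, pi]; with fs = 2*pi the frequency axis is
# in radians/sample and integrating the density gives average power.
w, Pxx = welch(x, fs=2 * np.pi, nperseg=1024)

wa, wb = 0.25 * np.pi, 0.5 * np.pi
band = (w >= wa) & (w <= wb)
print("band power :", np.trapz(Pxx[band], w[band]))  # ~ (wb - wa)/pi = 0.25
print("total power:", np.trapz(Pxx, w))              # ~ var(x) = 1
```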

White Noise (or White Gaussian Noise)

A white-noise signal is a signal for which

$$\varphi_{xx}[m] = \sigma_x^2\, \delta[m]$$

Hence, its samples at different instants of time are uncorrelated.

The power spectrum of a white-noise signal is a constant:

$$\Phi_{xx}(e^{j\omega}) = \sigma_x^2$$

The concept of white noise is very useful in quantization-error analysis.

White Noise (continued)

The average power of white noise is therefore

$$\varphi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega = \frac{1}{2\pi} \int_{-\pi}^{\pi} \sigma_x^2\, d\omega = \sigma_x^2$$

White noise is also useful in the representation of random signals whose power spectra are not constant with frequency: a random signal y[n] with power spectrum Φ_yy(e^{jω}) can be modeled as the output of a linear time-invariant system with a white-noise input, with

$$\Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \sigma_x^2$$
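A sketch of this shaping idea, with an illustrative one-pole filter: pass white noise through H(z) and compare the measured output PSD against |H(e^{jω})|² σ_x²:

```python
import numpy as np
from scipy.signal import lfilter, welch, freqz

rng = np.random.default_rng(6)
sigma_x2 = 1.0
x = rng.standard_normal(1 << 17)         # white-noise input, variance 1

b, a = [1.0], [1.0, -0.9]                # H(z) = 1 / (1 - 0.9 z^{-1})
y = lfilter(b, a, x)

w, Pyy = welch(y, fs=2 * np.pi, nperseg=2048)  # one-sided estimate on [0, pi]
_, H = freqz(b, a, worN=w)                     # H(e^{jw}) on the same grid

# The one-sided Welch density carries an extra factor 1/pi relative to
# the two-sided spectrum |H|^2 * sigma_x^2 used in the text.
ratio = Pyy * np.pi / (np.abs(H) ** 2 * sigma_x2)
print(np.median(ratio))                        # ~ 1
```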

Cross-correlation

The cross-correlation between the input and output of an LTI system (for a real stationary input) is

$$\varphi_{xy}[m] = \mathcal{E}\{x[n]\, y[n+m]\} = \mathcal{E}\left\{x[n] \sum_{k=-\infty}^{\infty} h[k]\, x[n+m-k]\right\} = \sum_{k=-\infty}^{\infty} h[k]\, \varphi_{xx}[m-k]$$

That is, the cross-correlation between the input and the output is the convolution of the impulse response with the input autocorrelation sequence.

Cross-correlation (continued)

By further taking the Fourier transform of both sides of the above equation, we have

$$\Phi_{xy}(e^{j\omega}) = H(e^{j\omega})\, \Phi_{xx}(e^{j\omega})$$

This result has a useful application when the input is white noise with variance σ_x². In that case,

$$\varphi_{xy}[m] = \sigma_x^2\, h[m] \qquad \Phi_{xy}(e^{j\omega}) = \sigma_x^2\, H(e^{j\omega})$$

These equations serve as the basis for estimating the impulse response or frequency response of an LTI system when it is possible to observe the output of the system in response to a white-noise input.
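A sketch of that estimation procedure: probe an (assumed unknown, here illustrative) FIR system with white noise and read h[m] off the input-output cross-correlation:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(7)
L, sigma_x2 = 500_000, 1.0
x = rng.standard_normal(L)                 # white-noise probe, variance 1

h_true = np.array([0.4, 0.25, -0.1, 0.05]) # system to be identified
y = lfilter(h_true, [1.0], x)

# phi_xy[m] = E{x[n] y[n+m]} ~ (1/L) sum_n x[n] y[n+m] = sigma_x^2 h[m]
M = len(h_true)
h_est = np.array([np.dot(x[:L - m], y[m:]) / L for m in range(M)]) / sigma_x2
print(np.round(h_est, 3))                  # ~ h_true
```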
