8.1 Probability and Random Variables

Random signals in one form or another are encountered in every practical communication system. Noise may be defined as any unwanted signal interfering with or distorting the signal being communicated. Noise is another example of a random signal.

Probability theory is rooted in situations that involve performing an experiment with an outcome that is subject to chance. If the experiment is repeated, the outcome may differ due to the influence of an underlying random phenomenon. Such an experiment is referred to as a random experiment.
Random Variables
If the outcome of the experiment is s, we denote the random variable as X(s), or just X. Note that X is a function, even if it is, for historical reasons, called a random variable. We denote the value of the random variable at a particular outcome $s_k$ by x; that is, $X(s_k) = x$. There may be more than one random variable associated with the same random experiment. The concept of a random variable is shown in Fig. 8.3.

Figure 8.3 Relationship between sample space, random variables, and probability.
Example 8.1 Bernoulli Random Variable

For the coin-tossing experiment, if it is a fair coin, the probability mass function of the associated random variable may be written as

$$P[X = x] = \begin{cases} \tfrac{1}{2}, & x = 0 \\ \tfrac{1}{2}, & x = 1 \\ 0, & \text{otherwise} \end{cases}$$

More generally, a Bernoulli random variable with parameter p has the probability mass function

$$P[X = x] = \begin{cases} 1 - p, & x = 0 \\ p, & x = 1 \\ 0, & \text{otherwise} \end{cases}$$

The probability mass function is illustrated in Fig. 8.4. Compare with the previous coin-tossing experiment: can you find the difference?
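As a quick illustration (my addition, not from the original slides), the sketch below draws Bernoulli samples and estimates the probability mass function empirically; the parameter p and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
p = 0.5                      # fair coin; set p != 0.5 for a biased Bernoulli
N = 100_000

# Each sample is 1 with probability p and 0 with probability 1 - p.
x = (rng.random(N) < p).astype(int)

# The empirical pmf should approach P[X=0] = 1 - p and P[X=1] = p.
print("P[X=0] ~", np.mean(x == 0))
print("P[X=1] ~", np.mean(x == 1))
```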
Figure 8.5 The uniform distribution. (a) The probability density function.
(b) The distribution function.
Several Random Variables

Consider two random variables X and Y. We define the joint distribution function $F_{X,Y}(x, y)$ as the probability that $X \le x$ and $Y \le y$. We call the function $f_{X,Y}(x, y)$ the joint probability density function of the random variables X and Y. The marginal distribution of X is obtained by integrating out y:

$$F_X(x) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f_{X,Y}(s, y)\, dy\, ds$$
Example 8.4 Binomial Random Variable

The sum of N independent Bernoulli random variables,

$$Y = \sum_{n=1}^{N} X_n$$

is a binomial random variable with probability mass function

$$P[Y = y] = \binom{N}{y} p^{y} (1-p)^{N-y}$$

where

$$\binom{N}{y} = \frac{N!}{y!\,(N-y)!}$$

Example 8.5 Binomial Distribution Function

For integer $0 \le y \le N$,

$$F_Y(y) = P[Y \le y] = \sum_{k=0}^{y} P[Y = k] = \sum_{k=0}^{y} \binom{N}{k} p^{k} (1-p)^{N-k}$$

In particular, by the binomial theorem,

$$F_Y(N) = \sum_{k=0}^{N} \binom{N}{k} p^{k} (1-p)^{N-k} = [\,p + (1-p)\,]^{N} = 1$$
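A short numerical check of Examples 8.4 and 8.5 (my addition): a sum of N Bernoulli trials is compared against the binomial pmf, and the pmf is verified to sum to 1. The values of N and p are illustrative.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(seed=1)
N, p, trials = 10, 0.3, 200_000

# Y = sum of N independent Bernoulli(p) variables, repeated many times.
y = (rng.random((trials, N)) < p).sum(axis=1)

for k in range(N + 1):
    empirical = np.mean(y == k)
    exact = comb(N, k) * p**k * (1 - p) ** (N - k)
    print(f"P[Y={k}]: empirical {empirical:.4f}, binomial {exact:.4f}")

# F_Y(N) must be exactly 1: the pmf sums to [p + (1-p)]^N = 1.
assert abs(sum(comb(N, k) * p**k * (1 - p) ** (N - k) for k in range(N + 1)) - 1) < 1e-12
```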
Example 8.6 Binary Symmetric Channel

With input probabilities $P[X=0] = p_0$ and $P[X=1] = p_1$ and crossover probability p,

$$P[Y = 0] = P[Y=0 \mid X=0]\,P[X=0] + P[Y=0 \mid X=1]\,P[X=1] = (1-p)\,p_0 + p\,p_1$$

$$P[Y = 1] = P[Y=1 \mid X=0]\,P[X=0] + P[Y=1 \mid X=1]\,P[X=1] = p\,p_0 + (1-p)\,p_1$$

By Bayes' rule,

$$P[X=0 \mid Y=0] = \frac{P[Y=0 \mid X=0]\,P[X=0]}{P[Y=0]} = \frac{(1-p)\,p_0}{(1-p)\,p_0 + p\,p_1}$$

$$P[X=1 \mid Y=1] = \frac{P[Y=1 \mid X=1]\,P[X=1]}{P[Y=1]} = \frac{(1-p)\,p_1}{p\,p_0 + (1-p)\,p_1}$$

Figure 8.8 Transition probability diagram of binary symmetric channel.

8.2 Expectation

Mean

Statistical averages, or expectations, are denoted by E[g(X)] for the expected value of a function g(·) of the random variable X. For a discrete random variable X, the mean $\mu_X$ is the weighted sum of the possible outcomes:

$$\mu_X = E[X] = \sum_{x} x\, P[X = x]$$

For a continuous random variable,

$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\, dx$$

The sample mean of N observations provides an estimate of the mean:

$$\hat{\mu}_X = \frac{1}{N} \sum_{n=1}^{N} x_n$$
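The posterior probabilities in Example 8.6 are easy to check numerically; below is a small sketch (my addition) with arbitrary illustrative values for p, p0, and p1.

```python
p, p0, p1 = 0.1, 0.6, 0.4   # crossover probability and input priors (illustrative)

# Total probability of each channel output.
py0 = (1 - p) * p0 + p * p1
py1 = p * p0 + (1 - p) * p1

# Bayes' rule: posterior probability that the input matched the output.
px0_given_y0 = (1 - p) * p0 / py0
px1_given_y1 = (1 - p) * p1 / py1

print(f"P[X=0|Y=0] = {px0_given_y0:.4f}")
print(f"P[X=1|Y=1] = {px1_given_y1:.4f}")
```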
Variance

The variance of a random variable is a measure of the spread of the probability distribution about the mean:

$$\sigma_X^2 = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\, dx$$
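A quick sketch (my addition) estimating the mean and variance from samples, using the uniform distribution of Fig. 8.5 as the test case; for a uniform density on [0, 1] the exact values are 1/2 and 1/12.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
x = rng.uniform(0.0, 1.0, 100_000)   # uniform on [0, 1]: mu = 1/2, var = 1/12

mu_hat = x.mean()                    # sample mean estimates mu_X
var_hat = np.mean((x - mu_hat) ** 2) # sample variance estimates sigma_X^2

print(f"mean ~ {mu_hat:.4f} (exact 0.5)")
print(f"var  ~ {var_hat:.4f} (exact {1/12:.4f})")
```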
Covariance

The covariance of two random variables X and Y is given by the expected value of the product of the two centered random variables:

$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - \mu_X \mu_Y$$

If the two random variables are continuous with joint density $f_{X,Y}(x, y)$, then

$$E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_{X,Y}(x, y)\, dx\, dy$$

If the two random variables happen to be independent, then

$$E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_X(x)\, f_Y(y)\, dx\, dy = \int_{-\infty}^{\infty} x f_X(x)\, dx \int_{-\infty}^{\infty} y f_Y(y)\, dy = E[X]\,E[Y]$$

8.3 Transformation of Random Variables

Suppose that a random variable X with distribution $F_X(x)$ is transformed to Y = aX + b. Consider the probability that X belongs to the set A, where A is a subset of the real line. If X ∈ A, then it follows that Y ∈ B, where B is defined by B = aA + b, and

$$P[X \in A] = P[Y \in B]$$

In particular, for a > 0,

$$F_Y(y) = P[Y \in (-\infty, y]] = P\!\left[X \in \left(-\infty, \tfrac{y-b}{a}\right]\right] = F_X\!\left(\frac{y-b}{a}\right)$$
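A numerical sketch (my addition) of the independence property: for independent X and Y the sample covariance should be near zero, i.e. E[XY] ≈ E[X]E[Y], while a dependent pair shows a clearly nonzero covariance. All distributions and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
N = 200_000

# Independent draws: covariance should vanish.
x = rng.uniform(-1.0, 1.0, N)
y = rng.uniform(-1.0, 1.0, N)
print("independent: Cov ~", np.mean(x * y) - x.mean() * y.mean())

# Dependent draws (y built from x): covariance is clearly nonzero.
y_dep = 2.0 * x + rng.normal(0.0, 0.1, N)
print("dependent:   Cov ~", np.mean(x * y_dep) - x.mean() * y_dep.mean())
```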
8.4 Gaussian Random Variables

A Gaussian random variable is a continuous random variable with a density function given by

$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_X} \exp\!\left(-\frac{(x - \mu_X)^2}{2\sigma_X^2}\right)$$

where the Gaussian random variable X has mean $\mu_X$ and variance $\sigma_X^2$. The normalized Gaussian random variable has zero mean and unit variance:

$$f_X(x) = \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{x^2}{2}\right), \qquad -\infty < x < \infty$$

The distribution function of this normalized Gaussian random variable is given by the integral of this function:

$$F_X(x) = \int_{-\infty}^{x} f_X(s)\, ds = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} \exp\!\left(-\frac{s^2}{2}\right) ds$$

A related function, often used in the communications context, is the Q-function:

$$Q(x) = \frac{1}{\sqrt{2\pi}} \int_{x}^{\infty} \exp\!\left(-\frac{s^2}{2}\right) ds = 1 - F_X(x)$$

The last line of Eq. (8.51) indicates that the Q-function is the complement of the normalized Gaussian distribution function. The Q-function is plotted in Fig. 8.11.
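A small sketch (my addition) evaluating the Q-function via the complementary error function, using the standard identity Q(x) = ½ erfc(x/√2), and cross-checking it against the normal CDF.

```python
import numpy as np
from scipy.special import erfc
from scipy.stats import norm

def qfunc(x):
    """Q(x) = P[N(0,1) > x], via the identity Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / np.sqrt(2.0))

for x in (0.0, 1.0, 2.0, 3.0):
    # Cross-check against 1 - F_X(x) computed from the normal CDF.
    print(f"Q({x}) = {qfunc(x):.6f}  (1 - Phi = {1 - norm.cdf(x):.6f})")
```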
Example 8.9 Probability of Bit Error with PAM

The received sample is the signal amplitude plus Gaussian noise, Y = A + N, so

$$P[Y < 0] = \int_{-\infty}^{0} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(y - A)^2}{2\sigma^2}\right) dy$$

With the substitution $s = -(y - A)/\sigma$,

$$P[Y < 0] = \frac{1}{\sqrt{2\pi}} \int_{A/\sigma}^{\infty} \exp\!\left(-\frac{s^2}{2}\right) ds = Q\!\left(\frac{A}{\sigma}\right)$$

Figure 8.12 Density function of noisy PAM signal Y.

8.5 The Central Limit Theorem

An important result in probability theory that is closely related to the Gaussian distribution is the central limit theorem. If Z denotes the normalized (zero-mean, unit-variance) sum of independent, identically distributed random variables, then

$$F_Z(z) \to \int_{-\infty}^{z} \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{s^2}{2}\right) ds$$

This is a mathematical statement of the central limit theorem. In words, the normalized distribution of the sum of independent, identically distributed random variables approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions.
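A simulation sketch (my addition) confirming the Q(A/σ) result of Example 8.9 by Monte Carlo; A, σ, and the sample count are illustrative.

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(seed=4)
A, sigma, N = 1.0, 0.8, 1_000_000

# Received samples: amplitude A buried in zero-mean Gaussian noise.
y = A + rng.normal(0.0, sigma, N)

p_sim = np.mean(y < 0)                            # empirical P[Y < 0]
p_theory = 0.5 * erfc(A / (sigma * np.sqrt(2)))   # Q(A / sigma)

print(f"simulated P[Y<0] = {p_sim:.5f}, Q(A/sigma) = {p_theory:.5f}")
```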
Figure 8.14 Illustration of the relationship between sample space and the
ensemble of sample functions.
Suppose the same random process is observed at time t₁ + τ, and the corresponding distribution function is $F_{X(t_1+\tau)}(x)$. Then if

$$F_{X(t_1+\tau)}(x) = F_{X(t_1)}(x)$$

for all t₁ and all τ, we say the process is stationary to the first order. A first-order stationary random process has a distribution function that is independent of time, and hence a mean that is constant in time:

$$\mu_X = \int_{-\infty}^{\infty} s f_{X(t_1)}(s)\, ds = \int_{-\infty}^{\infty} s f_{X(t_1+\tau)}(s)\, ds$$

Stationarity to the second order also implies that the mean of the random process is constant. If this mean is zero, then the autocorrelation and covariance functions of the random process are equivalent.

8.7 Correlation of Random Processes

The covariance of the two random variables X(t₁) and X(t₂) is given by

$$\mathrm{Cov}(X(t_1), X(t_2)) = E[X(t_1)\, X(t_2)] - \mu_X(t_1)\, \mu_X(t_2)$$

We define the first term on the right-hand side of Eq. (8.64) as the autocorrelation of the random process and use the generic notation

$$R_X(t, s) = E[X(t)\, X^{*}(s)]$$

If a random process has the following two properties, then we say it is wide-sense stationary or weakly stationary:

1. The mean of the random process is a constant independent of time: E[X(t)] = μ_X for all t.

2. The autocorrelation of the random process depends only upon the time difference: $E[X(t)\, X^{*}(t-\tau)] = R_X(\tau)$ for all t and τ.

For a wide-sense stationary process, the autocorrelation therefore satisfies $R_X(t, s) = R_X(t - s)$.
An estimate of the mean-square value of the process at time $t_k$, formed by averaging across an ensemble of N sample functions, is

$$\hat{E}[X^2(t_k)] = \frac{1}{N} \sum_{j=1}^{N} x_j^2(t_k)$$
The time average of a continuous sample function drawn from a real-valued process is given by

$$\varepsilon[x] = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt$$

and the time-autocorrelation of the sample function is given by

$$R_X(\tau) = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, x(t - \tau)\, dt$$

In most physical applications, wide-sense stationary processes are assumed to be ergodic, in which case time averages and expectations can be used interchangeably. If we assume that the real-valued random process is ergodic, then we can express the autocorrelation function as

$$R_X(\tau) = E[X(t)\, X(t - \tau)] = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, x(t - \tau)\, dt$$

An estimate of the autocorrelation of a real-valued process for lag τ = τ₀ is

$$\hat{R}_X(\tau_0) = \frac{1}{N} \sum_{n=1}^{N} x(t_n)\, x(t_n - \tau_0)$$
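A sketch (my addition) of the discrete autocorrelation estimate above, applied to samples of an assumed ergodic process; the first-order autoregressive process used here is purely illustrative, chosen because its true normalized autocorrelation is known to decay as a^|lag|.

```python
import numpy as np

rng = np.random.default_rng(seed=5)
N = 100_000

# Illustrative ergodic process: x[n] = a*x[n-1] + w[n] with white Gaussian w.
a = 0.9
w = rng.normal(0.0, 1.0, N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

def autocorr_estimate(x, lag):
    """R_hat(lag) = (1/N') * sum of x[n] * x[n - lag] over valid n."""
    return np.mean(x[lag:] * x[:-lag]) if lag > 0 else np.mean(x * x)

r0 = autocorr_estimate(x, 0)
for lag in (0, 1, 2, 5):
    print(f"lag {lag}: R_hat/R_hat(0) = {autocorr_estimate(x, lag)/r0:.4f} "
          f"(theory {a**lag:.4f})")
```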
8.8 Spectra of Random Signals

The Fourier transform converts a family of random variables X(t), indexed by the parameter t, to a new family of random variables ξ_T(f), indexed by the parameter f. If we take samples of a function x(t) at t = nT_s, then the discrete Fourier transform is defined as

$$\xi_k = \sum_{n=0}^{N-1} x_n W^{kn}$$

where $W = e^{-j 2\pi / N}$.
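A brief sketch (my addition) implementing the DFT sum exactly as written and cross-checking it against numpy's FFT, which uses the same sign convention; the test signal is arbitrary.

```python
import numpy as np

N = 8
x = np.cos(2 * np.pi * np.arange(N) / N)   # arbitrary test signal

# Direct evaluation of xi_k = sum_n x_n * W^(k*n), with W = exp(-j*2*pi/N).
W = np.exp(-2j * np.pi / N)
xi = np.array([sum(x[n] * W ** (k * n) for n in range(N)) for k in range(N)])

# numpy's FFT uses the same convention, so the two should agree.
assert np.allclose(xi, np.fft.fft(x))
print(np.round(xi, 6))
```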
8.10 White Noise

The power spectral density of white noise is independent of frequency:

$$S_W(f) = \frac{N_0}{2}$$

Example 8.13 Ideal Low-Pass Filtered White Noise

$$S_N(f) = \begin{cases} \dfrac{N_0}{2}, & |f| < B \\[4pt] 0, & |f| > B \end{cases}$$

$$R_N(\tau) = \int_{-B}^{B} \frac{N_0}{2} \exp(j 2\pi f \tau)\, df = N_0 B\, \mathrm{sinc}(2B\tau)$$
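A quick numerical check (my addition) that integrating the rectangular spectrum reproduces N₀B sinc(2Bτ); the values of N₀ and B are illustrative.

```python
import numpy as np

N0, B = 2.0, 5.0                       # illustrative values
f = np.linspace(-B, B, 20_001)
df = f[1] - f[0]

for tau in (0.0, 0.05, 0.1, 0.3):
    # R_N(tau) = integral over |f| < B of (N0/2) * exp(j*2*pi*f*tau) df
    r_num = (np.sum((N0 / 2) * np.exp(2j * np.pi * f * tau)) * df).real
    r_exact = N0 * B * np.sinc(2 * B * tau)   # np.sinc is the normalized sinc
    print(f"tau={tau}: numeric {r_num:.4f}, N0*B*sinc(2*B*tau) = {r_exact:.4f}")
```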
Noise-Equivalent Bandwidth
Suppose a white noise source with spectrum SW(f) =
N0/2 is connected to the input of an arbitrary filter of
transfer function H(f).
Figure 8.23 Characteristics of ideal band-pass filtered white noise. (a) Power spectral density. (b) Autocorrelation function. (c) Power spectral density of in-phase and quadrature components.
The noise-equivalent bandwidth is defined as

$$B_N = \frac{\int_{0}^{\infty} |H(f)|^2\, df}{|H(f_c)|^2}$$

so that the output noise power may be written as

$$P_N = N_0\, |H(f_c)|^2\, B_N$$
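A final sketch (my addition) computing B_N numerically for a first-order RC low-pass filter, for which the closed-form answer is B_N = (π/2)f₀; the cutoff f₀ is illustrative, and for a low-pass filter the reference frequency is f_c = 0.

```python
import numpy as np

f0 = 1_000.0                            # RC filter 3-dB cutoff (Hz), illustrative
f = np.linspace(0.0, 500 * f0, 1_000_001)
df = f[1] - f[0]

# First-order RC low-pass: |H(f)|^2 = 1 / (1 + (f/f0)^2); here H(f_c) = H(0) = 1.
H2 = 1.0 / (1.0 + (f / f0) ** 2)

# B_N = (int_0^inf |H(f)|^2 df) / |H(0)|^2; the finite upper limit
# truncates the 1/f^2 tail, costing only about f0/500 of the integral.
BN = np.sum(H2) * df / H2[0]
print(f"numeric B_N = {BN:.1f} Hz, closed form (pi/2)*f0 = {np.pi / 2 * f0:.1f} Hz")
```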