
What is white noise?

In communication systems, noise is an error or undesired random disturbance of a useful information signal, introduced before or after the detector and decoder. Noise is a summation of unwanted or disturbing energy from natural and sometimes man-made sources.

Noise is, however, typically distinguished from interference (e.g. cross-talk, deliberate jamming, or other unwanted electromagnetic interference from specific transmitters), for example in the signal-to-noise ratio (SNR), signal-to-interference ratio (SIR), and signal-to-noise-plus-interference ratio (SNIR) measures. Noise is also typically distinguished from distortion, which is an unwanted alteration of the signal waveform, for example in the signal-to-noise-and-distortion ratio (SINAD). In a carrier-modulated passband analog communication system, a certain carrier-to-noise ratio (CNR) at the radio receiver input results in a certain signal-to-noise ratio in the detected message signal. In a digital communication system, a certain Eb/N0 (normalized signal-to-noise ratio) results in a certain bit error rate (BER).


Figure 1: Basic communication system with noise

Figure 1 shows a communication system in which the channel introduces attenuation, a (precisely known) time delay, and additive noise. Most disturbances, interference, attenuation, and so on are usually classified as noise. The most important type of noise occurring in communication systems is "white noise", n(t), which will be discussed later.

Noise sources

Internal noise sources:
- Thermal noise
- Shot noise
- Partition noise
- Flicker noise
- Others: resistors, vacuum tubes, BJTs and other solid-state devices

External noise sources:
- Atmospheric noise
- Man-made noise
- Extraterrestrial sources
- Multiple transmission paths
- Random attenuation in the transmission medium
- White noise

Others:
- Background noise
- Brownian noise
- Burst noise
- Cosmic noise
- Gaussian noise
- Grey noise
- Jitter
- Johnson-Nyquist noise
- Pink noise
- Quantization error

Thermal noise

Johnson-Nyquist noise (sometimes called thermal, Johnson, or Nyquist noise) is unavoidable: it is generated by the random thermal motion of charge carriers (usually electrons) inside an electrical conductor, and it occurs regardless of any applied voltage.

Thermal noise is approximately white, meaning that its power spectral density is nearly equal throughout the frequency spectrum. The amplitude of the signal has very nearly a Gaussian probability density function. A communication system affected by thermal noise is often modeled as an additive white Gaussian noise (AWGN) channel.
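The AWGN channel model described above can be sketched numerically. The following is a minimal illustration (the function name `awgn` and the choice of a sinusoidal test signal are this sketch's own, not from the text): signal power is estimated from the samples, and zero-mean Gaussian noise of the variance implied by the target SNR is added.

```python
import math
import random

def awgn(signal, snr_db, rng=random.Random(0)):
    """Add white Gaussian noise to a signal at a given SNR in dB.

    Illustrative helper: the signal power is estimated from the samples,
    then noise with the matching variance is added to each sample.
    """
    power = sum(s * s for s in signal) / len(signal)
    noise_power = power / (10 ** (snr_db / 10))
    sigma = math.sqrt(noise_power)
    return [s + rng.gauss(0.0, sigma) for s in signal]

# A unit-amplitude 5 Hz sinusoid, sampled at 100 Hz, through a 10 dB AWGN channel.
t = [i / 100 for i in range(1000)]
x = [math.sin(2 * math.pi * 5 * ti) for ti in t]
y = awgn(x, snr_db=10.0)
```

With a signal power of 0.5 and an SNR of 10 dB, the added noise variance comes out near 0.05, which is easy to verify from the difference y − x.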

The root mean square (RMS) voltage due to thermal noise, generated in a resistance R (ohms) over bandwidth Δf (hertz), is given by

v_n = sqrt(4 k_B T R Δf)

where

k_B is Boltzmann's constant (joules per kelvin), and
T is the resistor's absolute temperature (kelvin).
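The formula is easy to evaluate directly. A minimal sketch (the example values, a 1 kΩ resistor at 300 K over a 10 kHz bandwidth, are chosen here for illustration and are not from the text):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def thermal_noise_rms(resistance_ohms, temperature_k, bandwidth_hz):
    """RMS thermal-noise voltage: v_n = sqrt(4 * k_B * T * R * delta_f)."""
    return math.sqrt(4 * K_B * temperature_k * resistance_ohms * bandwidth_hz)

# Example: a 1 kOhm resistor at room temperature (300 K) over 10 kHz.
v_n = thermal_noise_rms(1e3, 300.0, 10e3)  # about 0.4 microvolts
```

The result, roughly 0.4 µV, shows why a sensitive voltmeter is needed to observe thermal noise.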


Figure 2: Thermal Noise detection

Figure 2 shows a thermal-noise detection procedure: the voltage across the terminals of a resistance is measured with a sensitive AC voltmeter or an oscilloscope.

In the diagram above, power is being transferred from the source, with voltage V and fixed source resistance R_S, to a load with resistance R_L, resulting in a current I. By Ohm's law, I is simply the source voltage divided by the total circuit resistance:

I = V / (R_S + R_L)

The power P_L dissipated in the load is the square of the current multiplied by the resistance:

P_L = I^2 R_L = V^2 / ((R_S + R_L)^2 / R_L)

The value of R_L for which this expression is a maximum could be calculated by differentiating it, but it is easier to calculate the value of R_L for which the denominator (R_S + R_L)^2 / R_L is a minimum. The result is the same in either case.

Differentiating the denominator with respect to R_L:

d/dR_L [(R_S + R_L)^2 / R_L] = (2(R_S + R_L) R_L − (R_S + R_L)^2) / R_L^2

For a maximum or minimum, the first derivative is zero, so

2(R_S + R_L) R_L = (R_S + R_L)^2, or R_L = ±R_S

In practical resistive circuits, R_S and R_L are both positive, so the positive sign in the above is the correct solution. To find out whether this solution is a minimum or a maximum, the denominator expression is differentiated again:

d^2/dR_L^2 [(R_S + R_L)^2 / R_L] = 2 R_S^2 / R_L^3

This is always positive for positive values of R_S and R_L, showing that the denominator is a minimum, and the power is therefore a maximum, when

R_L = R_S

A note of caution is in order here. This last statement, as written, implies to many people that for a given load, the source resistance must be set equal to the load resistance for maximum power transfer. However, this result applies only if the source resistance cannot be adjusted, as with antennas. For any given load resistance, a source resistance of zero is the way to transfer maximum power to the load. As an example, a 100 volt source with an internal resistance of 10 ohms connected to a 10 ohm load delivers 250 watts to that load. Make the source resistance zero ohms and the load power jumps to 1000 watts.
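Both points, the maximum at R_L = R_S for a fixed source resistance and the 250 W / 1000 W figures from the example, can be checked with a short sweep. A minimal sketch (the helper name `load_power` is this sketch's own):

```python
def load_power(v_source, r_source, r_load):
    """Power dissipated in the load: P = (V / (R_S + R_L))^2 * R_L."""
    current = v_source / (r_source + r_load)
    return current ** 2 * r_load

# Sweep the load resistance for the 100 V / 10 ohm example from the text.
powers = {rl: load_power(100.0, 10.0, rl) for rl in range(1, 101)}
best_rl = max(powers, key=powers.get)  # peaks at R_L = R_S = 10 ohms

p_matched = powers[10]                     # 250 W, as in the text
p_zero_rs = load_power(100.0, 0.0, 10.0)   # 1000 W with zero source resistance
```

The sweep confirms the derivation: with R_S fixed at 10 Ω the load power peaks at R_L = 10 Ω, while driving the source resistance to zero quadruples the delivered power.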

Application

Transformer Impedance Matching

One very useful application of impedance matching to provide maximum power transfer is in the output stage of amplifier circuits, where the speaker's impedance is matched to the amplifier's output impedance to obtain maximum sound power output. This is achieved by using a matching transformer to couple the load to the amplifier's output, as shown below.

Transformer Coupling


Maximum power transfer can be obtained even if the output impedance is not the same as the load impedance. This is done by choosing a suitable "turns ratio" for the transformer, so that the ratio of the load impedance, Z_LOAD, to the output impedance, Z_OUT, corresponds to the ratio of the transformer's primary turns to secondary turns, since a resistance on one side of the transformer appears as a different value on the other. If the load impedance, Z_LOAD, and the source impedance, Z_OUT, are both purely resistive, then the condition for maximum power transfer is given as:

N_P / N_S = sqrt(Z_OUT / Z_LOAD)

where N_P is the number of primary turns and N_S the number of secondary turns on the transformer. By varying the transformer's turns ratio, the output impedance can be "matched" to the source impedance to achieve maximum power transfer. For example:

Example No. 2

If a loudspeaker is to be connected to an amplifier with an output impedance of 1000 Ω, calculate the turns ratio of the matching transformer required to provide maximum power transfer of the audio signal. Assume the amplifier source impedance is Z_1, the load impedance is Z_2, and the turns ratio is N.


Generally, small transformers used in low-power audio amplifiers are usually regarded as ideal, so any losses can be ignored.
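The turns-ratio calculation is a one-liner. A minimal sketch, noting that the example above does not state the loudspeaker impedance, so the 8 Ω used here is an assumed, purely illustrative value:

```python
import math

def turns_ratio(z_out, z_load):
    """Primary-to-secondary turns ratio: N = sqrt(Z_OUT / Z_LOAD)."""
    return math.sqrt(z_out / z_load)

# Hypothetical figures: 1000 ohm amplifier output into an ASSUMED 8 ohm
# speaker (the speaker impedance is not given in the example above).
n = turns_ratio(1000.0, 8.0)  # roughly 11.2 : 1
```

Under that assumption the matching transformer would need roughly an 11:1 primary-to-secondary turns ratio.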

Shot noise

Shot noise in electronic devices consists of unavoidable random statistical fluctuations of the electric current in an electrical conductor. Random fluctuations are inherent whenever a current flows, because the current is a flow of discrete charges (electrons).

Definition: Stationary process

Formally, let {X_t} be a stochastic process and let F_X(x_{t_1+τ}, ..., x_{t_k+τ}) represent the cumulative distribution function of the joint distribution of {X_t} at times t_1+τ, ..., t_k+τ. Then {X_t} is said to be stationary if, for all k, for all τ, and for all t_1, ..., t_k,

F_X(x_{t_1+τ}, ..., x_{t_k+τ}) = F_X(x_{t_1}, ..., x_{t_k}).

Since τ does not affect F_X, F_X is not a function of time.

Examples

As an example, white noise is stationary. The sound of a cymbal clashing, if hit only once, is not stationary because the acoustic power of the clash (and hence its variance) diminishes with time. However, it would be possible to invent a stochastic process describing when the cymbal is hit, such that the overall response would form a stationary process.

An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of N possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving-average processes, which are both subsets of the autoregressive moving average model. Models with a non-trivial autoregressive component may be either stationary or non-stationary, depending on the parameter values; important non-stationary special cases arise where unit roots exist in the model.

Let Y be any scalar random variable, and define a time series {X_t} by

X_t = Y for all t.

Then {X_t} is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A law of large numbers does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by Y, rather than the expected value of Y.

As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let Y have a uniform distribution on (0, 2π] and define the time series {X_t} by

X_t = cos(t + Y).

Then {X_t} is strictly stationary.
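The construction X_t = cos(t + Y), with Y drawn once per realisation from Uniform(0, 2π], can be simulated directly. A minimal sketch: each realisation is a smooth, noise-free cosine, yet the ensemble is stationary because the random phase is shared across all times.

```python
import math
import random

def realisation(y_phase, n=8):
    """One realisation of X_t = cos(t + Y) for t = 0, 1, ..., n-1."""
    return [math.cos(t + y_phase) for t in range(n)]

rng = random.Random(1)
# Each realisation uses a single draw of Y ~ Uniform(0, 2*pi].
draws = [realisation(rng.uniform(0.0, 2.0 * math.pi)) for _ in range(3)]
```

Each of the three realisations is a deterministic cosine; the randomness lives entirely in the single phase draw Y.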

Partition Noise

Partition noise consists of statistical fluctuations in the division or merging of current in vacuum tubes or in solid-state devices.

Flicker noise

Flicker noise, also known as 1/f noise, is a signal or process with a frequency spectrum that falls off steadily into the higher frequencies, with a pink spectrum. It occurs in almost all electronic devices, and results from a variety of effects, though always related to a direct current.
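One common way to generate an approximately 1/f spectrum, not discussed in the text above, is the Voss-McCartney algorithm: several independent random sources are summed, with source k refreshed only every 2^k samples, so slower sources contribute more low-frequency power. A rough sketch under that stated assumption:

```python
import random

def voss_pink_noise(n_samples, n_sources=8, rng=random.Random(0)):
    """Approximate 1/f (pink) noise via the Voss-McCartney algorithm.

    Source k is redrawn every 2**k samples; the output is the running sum,
    which gives roughly equal power per octave over n_sources octaves.
    """
    sources = [rng.uniform(-1.0, 1.0) for _ in range(n_sources)]
    out = []
    for i in range(n_samples):
        for k in range(n_sources):
            if i % (2 ** k) == 0:
                sources[k] = rng.uniform(-1.0, 1.0)
        out.append(sum(sources))
    return out

samples = voss_pink_noise(1024)
```

The approximation only holds over as many octaves as there are sources; beyond that the spectrum flattens out.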

White Noise

White noise can be defined simply as noise whose power spectral density is independent of the operating frequency. Equivalently, it is a random signal (or process) with a flat power spectral density: the signal contains equal power within any fixed bandwidth at any centre frequency. White noise draws its name from white light, in which the power spectral density is distributed over the visible band in such a way that the eye's three colour receptors (cones) are approximately equally stimulated. In other words, the adjective "white" is used in the sense that white light contains equal amounts of all frequencies within the visible band of electromagnetic radiation. We express the power spectral density of white noise, with a sample function denoted by w(t), as

S_W(f) = N_0 / 2

where the parameter N_0 is customarily referred to the input stage of the receiver of a communication system and has units of watts per hertz.
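A flat power spectral density is equivalent to an autocorrelation that is (ideally) zero at every non-zero lag. A quick numerical check of this, a minimal sketch using Gaussian samples as the white-noise stand-in:

```python
import random

def autocorr(x, lag):
    """Biased sample autocorrelation: the mean of x[n] * x[n + lag]."""
    n = len(x) - lag
    return sum(x[i] * x[i + lag] for i in range(n)) / n

rng = random.Random(0)
w = [rng.gauss(0.0, 1.0) for _ in range(20000)]

r0 = autocorr(w, 0)  # approaches the noise variance (here 1.0)
r5 = autocorr(w, 5)  # approaches 0 for white noise
```

The near-delta autocorrelation is exactly what makes the spectrum flat, and it previews the autocorrelation review noted below.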

Make a review of autocorrelation and cross-correlation.