
EXPERIMENT 1

AIM:

To plot a sinc wave.

To plot the Fourier Transform of a sinc wave.

THEORY: In digital signal processing and information theory, the normalized sinc function is commonly defined by

sinc(x) = sin(pi*x)/(pi*x)

The value at x = 0, where the expression above is 0/0, is defined to be the limiting value sinc(0) = 1.
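The limiting value can be checked by evaluating the definition directly; a minimal sketch (the grid and the patched zero sample are illustrative choices, not part of the experiment code):

x = (-500:500)/100;          % grid with an exact zero at the midpoint
y = sin(pi*x)./(pi*x);       % 0/0 gives NaN at x = 0
y(x == 0) = 1;               % patch with the limiting value sinc(0) = 1
plot(x, y), grid on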


A sinc-shaped spectrum (strictly, a Dirichlet kernel) can be obtained by taking the Discrete Fourier Transform of a rectangular pulse.
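A minimal sketch of this relation (the DFT length and pulse width are arbitrary illustrative choices):

N = 256;                              % DFT length (arbitrary)
p = [ones(1,16) zeros(1,N-16)];       % rectangular pulse of width 16 (arbitrary)
P = fftshift(abs(fft(p)));            % centred magnitude spectrum
plot(P), grid on                      % sinc-shaped (Dirichlet kernel) curve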

CODE:
t = 0:0.5:100;    % time axis, sampled every 0.5
y = sinc(t);      % normalized sinc, sin(pi*t)./(pi*t) (Signal Processing Toolbox)
plot(t, y)

OUTPUT: (figure: plot of the sinc wave against t)

CODE:
t = 0:0.5:100;
y = sinc(t);
q = fftshift(real(fft(y)));   % DFT of the sampled sinc, centred about zero frequency
plot(q)                       % roughly rectangular shape

OUTPUT: (figure: centred DFT of the sampled sinc)

EXPERIMENT 2.1
AIM: To verify the convolution property of the Discrete Fourier Transform (DFT).
THEORY: In mathematics, the discrete Fourier transform (DFT) converts a finite list of equally spaced samples of
a function into the list of coefficients of a finite combination of complex sinusoids, ordered by their frequencies, that has
those same sample values. It can be said to convert the sampled function from its original domain (often time or position
along a line) to the frequency domain.
The convolution theorem for the discrete-time Fourier transform indicates that a convolution of two infinite sequences can
be obtained as the inverse transform of the product of the individual transforms. An important simplification occurs when
the sequences are of finite length, N. In terms of the DFT and inverse DFT, it can be written as follows:

ifft( fft(x) .* fft(y) )

which is the circular convolution of the x sequence with the y sequence. If two sequences of lengths N1 and N2 are first zero-padded to length N1 + N2 - 1, the circular convolution coincides with the ordinary linear convolution; the code below verifies this.

CODE:
x = [ones(1,3) zeros(1,2)];   % first sequence, length 5
y = [zeros(1,4) ones(1,1)];   % second sequence, length 5
w = conv(x, y)                % direct linear convolution, length 9
x1 = [x zeros(1,4)];          % zero-pad both sequences to length 9
y1 = [y zeros(1,4)];
s = fft(y1);
n = fft(x1);
q = s .* n;                   % pointwise product of the DFTs
r = ifft(q)                   % inverse DFT: should reproduce w

OUTPUT:
w =
     0     0     0     0     1     1     1     0     0

x1 =
     1     1     1     0     0     0     0     0     0

y1 =
     0     0     0     0     1     0     0     0     0

r =
   -0.0000    0.0000    0.0000    0.0000    1.0000    1.0000    1.0000    0.0000    0.0000
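As a quick numerical check (a sketch that assumes the variables r and w from the code above are still in the workspace):

err = max(abs(r - w))    % should be on the order of machine precision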

EXPERIMENT 2.2
AIM: To verify the time shifting property of the Discrete Fourier Transform (DFT).
THEORY: The Discrete Fourier Transform (DFT) is the equivalent of the continuous Fourier Transform for signals known only at N instants separated by sample times T (i.e. a finite sequence of data). According to the time shifting property of the DFT, if h(k) is circularly shifted by an integer i, then

h(k - i) => H(n).exp(-j*(2*pi/P)*n*i)

where H(n) is the P-point DFT of h(k).

CODE:
x = [zeros(1,3) ones(1,3)]        % original sequence (displayed)
s = fft(x);                       % its P-point DFT
P = 6;
x2 = zeros(1,P);                  % phase-ramp vector
for n = 1:P
    x2(n) = exp(2*2*pi*1i*(n-1)/P);   % exp(+j*(2*pi/P)*2*(n-1)): circular advance of 2
end
l = s .* x2;                      % multiply the DFT by the phase ramp
q = ifft(l)                       % the circularly shifted sequence (displayed)

OUTPUT:
x =
     0     0     0     1     1     1

q =
  Columns 1 through 5
  -0.0000 + 0.0000i   1.0000 - 0.0000i   1.0000 + 0.0000i   1.0000 - 0.0000i   0.0000 + 0.0000i
  Column 6
   0.0000 - 0.0000i
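The same shift can be produced in the time domain with circshift; a sketch of the check, assuming the workspace from the code above (the positive exponent corresponds to a circular advance by 2 samples):

err = max(abs(q - circshift(x, [0 -2])))   % should be on the order of machine precision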

EXPERIMENT 2.3
AIM: To verify the frequency shifting property of the Discrete Fourier Transform (DFT).
THEORY: The Discrete Fourier Transform (DFT) is the equivalent of the continuous Fourier Transform for signals known only at N instants separated by sample times T (i.e. a finite sequence of data). According to the frequency shifting property of the DFT, multiplying h(k) by a complex exponential circularly shifts its DFT by an integer i:

h(k).exp(j*(2*pi/P)*i*k) => H(n - i)

where H(n) is the P-point DFT of h(k).

CODE:
x = [zeros(1,3) ones(1,3)];       % original sequence
s = fft(x)                        % DFT of the unmodulated sequence (displayed)
P = 6;
x2 = zeros(1,P);                  % modulating exponential
for n = 1:P
    x2(n) = exp(2*2*pi*1i*(n-1)/P);   % exp(+j*(2*pi/P)*2*(n-1)): frequency shift of 2 bins
end
l = x .* x2;                      % modulate in the time domain
u = fft(l)                        % spectrum circularly shifted by 2 bins (displayed)

OUTPUT:
s =
  Columns 1 through 5
   3.0000 + 0.0000i  -1.0000 + 1.7321i   0.0000 + 0.0000i  -1.0000 + 0.0000i   0.0000 + 0.0000i
  Column 6
  -1.0000 - 1.7321i

u =
  Columns 1 through 5
   0.0000 + 0.0000i  -1.0000 - 1.7321i   3.0000 - 0.0000i  -1.0000 + 1.7321i  -0.0000 + 0.0000i
  Column 6
  -1.0000 - 0.0000i
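A sketch of the corresponding check with circshift, assuming the workspace from the code above:

err = max(abs(u - circshift(s, [0 2])))    % spectrum shifted by 2 bins; err ~ eps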

EXPERIMENT 3.1
AIM: To plot and verify the PDF of Gaussian Random Numbers.
THEORY: In probability theory, the normal (or Gaussian) distribution is a very commonly occurring continuous probability distribution: a function that gives the probability that an observation in some context will fall between any two real numbers. Normal distributions are extremely important in statistics and are often used in the natural and social sciences for real-valued random variables whose distributions are not known. The normal probability density function, with mean mu and standard deviation sigma, is

f(x) = (1/(sigma*sqrt(2*pi))) . exp(-(x - mu)^2/(2*sigma^2))
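For a visual verification, the histogram can be scaled to a density and overlaid with this pdf; a sketch using the same parameters as the code below (the bin count is an arbitrary choice):

a = 1; b = 2;                          % mean a and standard deviation (b-a)
c = a + (b-a)*randn(1,50000);
[cnt, ctr] = hist(c, 50);              % 50 bins (arbitrary)
bw = ctr(2) - ctr(1);                  % bin width
bar(ctr, cnt/(sum(cnt)*bw)); hold on   % histogram scaled to a density
f = exp(-(ctr-a).^2/(2*(b-a)^2))/((b-a)*sqrt(2*pi));   % theoretical pdf
plot(ctr, f, 'r', 'LineWidth', 2); hold off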

CODE:
a = 1;                           % mean
b = 2;
c = a + (b-a)*randn(1,50000);    % Gaussian samples: mean a, standard deviation (b-a)
hist(c)                          % bell-shaped histogram

OUTPUT: (figure: bell-shaped histogram of the Gaussian samples)

EXPERIMENT 3.2
AIM: To plot and verify the PDF of Uniform Random Numbers.
THEORY: In probability theory and statistics, the uniform distribution or rectangular distribution is a family of symmetric probability distributions such that for each member of the family, all intervals of the same length on the distribution's support are equally probable. The support is defined by the two parameters, a and b, which are its minimum and maximum values. The distribution is often abbreviated U(a,b). The probability density function of the uniform distribution is

f(x) = 1/(b - a) for a <= x <= b, and 0 otherwise.
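As in the Gaussian case, the density-scaled histogram can be overlaid with the theoretical pdf; a sketch (bin count arbitrary):

a = 1; b = 2;
c = a + (b-a)*rand(1,5000);
[cnt, ctr] = hist(c, 20);              % 20 bins (arbitrary)
bw = ctr(2) - ctr(1);
bar(ctr, cnt/(sum(cnt)*bw)); hold on   % histogram scaled to a density
plot([a b], [1 1]/(b-a), 'r', 'LineWidth', 2); hold off   % theoretical pdf = 1/(b-a)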

CODE:
a = 1;                          % lower limit
b = 2;                          % upper limit
c = a + (b-a)*rand(1,5000);     % uniform samples on [a, b]
hist(c)                         % approximately flat histogram

OUTPUT: (figure: approximately flat histogram of the uniform samples)

EXPERIMENT 3.3
AIM: To plot and verify the PDF of Speech samples.
THEORY:
y= wavread(filename) loads a WAVE file specified by the string filename, returning the sampled data in y. If filename
does not include an extension, wavread appends .wav.
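wavread has been removed from newer MATLAB releases; a sketch of the equivalent call with its replacement, audioread (using the same filename as the experiment code):

[P, Fs] = audioread('bbm_tone.wav');  % samples and sampling rate
hist(P(:,1))                          % histogram of the first channel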

CODE:
P = wavread('bbm_tone.wav');    % read the speech samples (filename must be a string)
hist(P)                         % histogram approximates the pdf of the samples

OUTPUT: (figure: histogram of the speech samples)

EXPERIMENT 4
AIM: To compute the mean and variance of given random variables.
THEORY: Let X1, ..., Xn denote random variables from the n component distributions, and let X denote a random variable from the mixture distribution. Then, for any function H(.) for which E[H(Xi)] exists, and assuming that the component densities pi(x) exist,

E[H(X)] = w1.E[H(X1)] + ... + wn.E[H(Xn)]

where wi denotes the weight of the ith component. The relation holds more generally even when the component densities do not exist.

It is a trivial matter to note that the jth moment about zero (i.e. choosing H(x) = x^j) is simply a weighted average of the jth moments of the components. Moments about the mean, H(x) = (x - mu)^j, involve a binomial expansion:

E[(X - mu)^j] = sum over i of wi . sum over k = 0..j of C(j,k) . (mui - mu)^(j-k) . E[(Xi - mui)^k]

where mui denotes the mean of the ith component.


In the case of a mixture of one-dimensional normal distributions with weights wi, means mui and variances sigmai^2, the total mean and variance are:

mu = sum over i of wi.mui
sigma^2 = sum over i of wi.(sigmai^2 + mui^2) - mu^2
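These formulas can be checked by simulation; a minimal sketch with arbitrarily chosen weights, means and variances:

w1 = 0.3; mu1 = 0; s1 = 1;       % component 1 (arbitrary illustrative values)
w2 = 0.7; mu2 = 2; s2 = 0.5;     % component 2
N = 100000;
pick = rand(1,N) < w1;           % true -> draw from component 1
x = pick.*(mu1 + s1*randn(1,N)) + (~pick).*(mu2 + s2*randn(1,N));
mu_th = w1*mu1 + w2*mu2          % theoretical mixture mean
var_th = w1*(s1^2 + mu1^2) + w2*(s2^2 + mu2^2) - mu_th^2   % theoretical variance
mu_sim = mean(x)                 % simulated mean
var_sim = var(x)                 % simulated variance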

CODE:
x1 = randn(1,100);       % 100 standard Gaussian samples
x2 = 0;
for i = 1:100
    x2 = x1(i)^2 + x2;   % accumulate the sum of squares
end
x2 = x2/100;             % estimate of E[X^2]
y = var(x1)              % built-in variance (uses 1/(N-1) normalisation)
y1 = mean(x1)            % sample mean
y2 = x2 - y1^2           % variance as E[X^2] - (E[X])^2 (uses 1/N), hence slightly smaller

OUTPUT:
y=
0.8081

y1 =
0.2460

y2 =
0.8000

EXPERIMENT 5
AIM:

To plot the pdf of the sum of two Random Variables, and verify the result.
To verify that the sum of 2 Gaussian Random Variables is also Gaussian.
To verify the Central Limit Theorem.

THEORY:
a) It is known that the pdf of the sum of two independent random variables is the convolution of the individual pdfs. Let X, Y be two independent random variables with pdfs fX(a) and fY(a) respectively. We define a random variable Z as
Z = X + Y
then
fZ(a) = fX(a) * fY(a)
where * denotes convolution.
b) It is known that the sum of two independent Gaussian random variables is also Gaussian. Let X, Y be two independent Gaussian random variables with means muX, muY and variances sigmaX^2, sigmaY^2 respectively. We define a random variable Z as
Z = pX + qY
then
Z ~ Gaussian(p.muX + q.muY, p^2.sigmaX^2 + q^2.sigmaY^2)
c) Let Xi for i = 1, 2, ..., N be N zero-mean, independent identically distributed random variables with finite variance. The central limit theorem states that the random variable Y, defined as

Y = (X1 + X2 + ... + XN)/sqrt(N)

tends to have a Gaussian density function as N grows.
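Part (c) can also be demonstrated with more than the two variables used below; a sketch with N = 12 zero-mean uniform terms (N and the bin count are arbitrary choices):

N = 12; M = 100000;              % N terms per sum, M realisations (arbitrary)
X = rand(N,M) - 0.5;             % zero-mean uniform components
Y = sum(X,1)/sqrt(N);            % Y = (X1 + ... + XN)/sqrt(N)
hist(Y,30)                       % histogram is close to a Gaussian bell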

CODE:
m = 20;                     % number of histogram bins
x1 = rand(1,100000);        % uniform on [0,1]
x2 = rand(1,100000);        % independent uniform on [0,1]
y = x1 + x2;                % sum: its pdf is the convolution of two rectangles
hist(y,m)                   % triangular histogram

OUTPUT: (figure: triangular histogram of the sum of two uniform variables)

CODE:
m = 20;
x1 = randn(1,100000);       % standard Gaussian
x2 = randn(1,100000);       % independent standard Gaussian
y = x1 + x2;                % Gaussian with mean 0 and variance 2
hist(y,m)                   % bell-shaped histogram

OUTPUT: (figure: bell-shaped histogram of the sum of two Gaussian variables)

CODE:
m = 20;
x1 = rand(1,100000);        % uniform on [0,1]
x2 = randn(1,100000);       % independent standard Gaussian
y = x1 + x2;                % sum of a uniform and a Gaussian variable
hist(y,m)                   % smooth, nearly bell-shaped histogram

OUTPUT: (figure: histogram of the sum of a uniform and a Gaussian variable)
