I. INTRODUCTION
The CMA(p,q) family of blind equalizers minimizes the dispersion of the equalizer output y(n) about the constant modulus A of the transmitted constellation. For CMA(2,2) (Godard's algorithm) the cost function is

J = E\left[\left(|y(n)|^2 - A^2\right)^2\right]  (1)

and its instantaneous gradient with respect to the equalizer weights is

\nabla J = 4\,y(n)\left(|y(n)|^2 - A^2\right)x^*(n)  (2)

Collecting the equalizer taps in the vector W_n = [w_n(0)\ w_n(1)\ \cdots\ w_n(N-1)]^T, the stochastic-gradient update of the k-th tap is

w_{n+1}(k) = w_n(k) + 4\mu\,y_n\left(A^2 - |y_n|^2\right)x^*(n-k)  (3)

where the CMA(2,2) error term is

K = 4\,y(n)\left(A^2 - |y(n)|^2\right)  (4)

so that the update can be written in vector form as

W_{n+1} = W_n + 4\mu\,y(n)\left(A^2 - |y(n)|^2\right)X_n^*  (5)

W_{n+1} = W_n + \mu K X_n^*  (6)

where X_n = [x(n)\ x(n-1)\ \cdots\ x(n-N+1)]^T. The other members of the family differ only in the error term:

K_{1,1} = \frac{y_n}{|y_n|}\,\mathrm{sign}\left(A - |y_n|\right)  (7)

K_{1,2} = 2\,\frac{y_n}{|y_n|}\left(A - |y_n|\right)  (8)

K_{2,1} = 2\,\mathrm{sign}\left(A^2 - |y_n|^2\right)y_n  (9)

For comparison, the trained LMS algorithm updates the equalizer with the error e(n) between the output and the desired signal:

W_{n+1} = W_n + \mu\,e(n)X_n^*  (10)

w_{n+1}(k) = w_n(k) + \mu\,e(n)x^*(n-k)  (11)

A common rule of thumb for choosing the step size is

\mu = \frac{1}{5(2N+1)P_R}  (12)

where P_R is the power of the received signal. The two channels used in the simulations are

H_1(z) = \frac{1}{1 - 0.9z^{-1}}  (13)

H_2(z) = \frac{1}{1 + 0.9z^{-1}}  (14)
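The updates above can be sketched in a few lines of NumPy. This is only a minimal illustration of Eqs. (3)-(9) on channel (13), not the paper's original simulation code: the tap count, step size, QPSK source, and center-spike initialization are all assumed choices.

```python
import numpy as np

def cma_gain(y, A, p, q):
    """Error term K of Eqs. (4) and (7)-(9); assumes y != 0 for p = 1."""
    if (p, q) == (1, 1):
        return (y / abs(y)) * np.sign(A - abs(y))
    if (p, q) == (1, 2):
        return 2.0 * (y / abs(y)) * (A - abs(y))
    if (p, q) == (2, 1):
        return 2.0 * np.sign(A**2 - abs(y)**2) * y
    if (p, q) == (2, 2):
        return 4.0 * (A**2 - abs(y)**2) * y
    raise ValueError("unsupported (p, q)")

def cma_equalize(x, n_taps=11, mu=1e-5, A=1.0, p=2, q=2):
    """Blind equalizer driven by Eq. (6): W_{n+1} = W_n + mu K X_n^*."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                    # center-spike initialization (assumed)
    y = np.array(x, dtype=complex)          # first n_taps outputs left unequalized
    for n in range(n_taps, len(x)):
        X = x[n - n_taps + 1:n + 1][::-1]   # regressor [x(n), ..., x(n-N+1)]
        y[n] = w @ X                        # equalizer output y(n)
        w = w + mu * cma_gain(y[n], A, p, q) * np.conj(X)
    return y, w

# Unit-modulus QPSK source (A = 1) through channel 1 of Eq. (13):
# x(n) = s(n) + 0.9 x(n-1)
rng = np.random.default_rng(0)
s = (rng.choice([-1.0, 1.0], 4000) + 1j * rng.choice([-1.0, 1.0], 4000)) / np.sqrt(2)
x = np.empty_like(s)
acc = 0j
for n in range(len(s)):
    acc = s[n] + 0.9 * acc
    x[n] = acc

y, w = cma_equalize(x)                       # CMA(2,2)
mod_err = (1.0 - np.abs(y) ** 2) ** 2        # instantaneous (A^2 - |y|^2)^2
```

Passing other values of p and q selects the error terms of Eqs. (7)-(9), e.g. `cma_equalize(x, p=1, q=2)` for CMA(1,2).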
[Figures (channel 1): Transmitted Signals and Received Signals constellations and the Equalized Signals constellations for the LMS algorithm, CMA(1,1), CMA(1,2), CMA(2,1), and CMA(2,2) (Real Part vs. Imaginary Part), each with a Convergence Plot of Error (dB) vs. n; an "LMS vs CMA(1,1)" comparison plot is also shown.]
In the figures above, the blue curves represent the LMS algorithm and the red curves represent the CMA algorithms. As the figures show, the LMS algorithm yields the lowest error for channel 1. Among the CMA algorithms, CMA(1,2) gives the best results. A comparison of the least mean square error of the different algorithms for channel 1 is provided in Table I:
[Figure: "LMS vs CMA(1,2)" convergence comparison, Error (dB) vs. n.]

TABLE I
COMPARISON OF LEAST MEAN SQUARE ERROR OF DIFFERENT METHODS FOR CHANNEL 1

Method      Error (dB)
LMS         -150
CMA(1,1)    -50
CMA(1,2)    -100
CMA(2,1)    -80
CMA(2,2)    -100
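The Error (dB) entries of Table I can be read from the tail of a convergence curve. A small helper, under the assumption that the curve is stored as a NumPy array of instantaneous squared errors (the function name and the 500-sample tail are illustrative choices):

```python
import numpy as np

def steady_state_db(sq_err, tail=500):
    """Average the last `tail` samples of a squared-error curve and
    express the result in dB, as in the Error (dB) column of Table I."""
    return 10.0 * np.log10(np.mean(sq_err[-tail:]))

# A curve that has settled at 1e-15 corresponds to the -150 dB
# reported for LMS on channel 1:
print(steady_state_db(np.full(4000, 1e-15)))   # approximately -150.0
```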
[Figures: remaining constellation and convergence plots for channels 1 and 2: Transmitted Signals, Received Signals, and Equalized Signals by the LMS algorithm, CMA(1,1), CMA(1,2), CMA(2,1), and CMA(2,2) (Real Part vs. Imaginary Part), together with "LMS vs CMA(1,1)", "LMS vs CMA(1,2)", "LMS vs CMA(2,1)", and "LMS vs CMA(2,2)" Convergence Plots of Error (dB) vs. n.]
IV. CONCLUSION
Blind channel equalization is used whenever no training data is available and nothing is known about the incoming data. From the results it can be concluded that CMA(1,1), CMA(1,2), CMA(2,1), and CMA(2,2) settle at errors roughly 100, 50, 70, and 50 dB higher than the LMS algorithm for channel 1, and roughly 70, 30, 70, and 60 dB higher for channel 2, respectively. It is also noted that the initialization of the CMA algorithms has a significant effect on the convergence rate.