
Wireless Pers Commun (2013) 69:1719–1733
DOI 10.1007/s11277-012-0659-6

A Comparative Study of Different Entropies for Spectrum Sensing Techniques

Wanjing Zhu · Jianguo Ma · Oliver Faust

Published online: 19 May 2012
© Springer Science+Business Media, LLC. 2012

Abstract In this paper, we propose entropy-based spectrum sensing schemes to detect the existence of a primary user in Cognitive Radio (CR). To support this proposal, we have studied four types of entropies [Approximate Entropy (ApEn), Bispectral Entropy (BispEn), Sample Entropy (SamEn) and Rényi Entropy (RenyiEn)] and their applications for spectrum sensing. The reason for investigating these entropies comes from the fact that different types of entropies have different characteristics which make them more or less suitable for specific applications. Monte Carlo simulations were executed to find out how suitable these measures are for CR. Averaged value curves, boxplots and Analysis of Variance (ANOVA) tests document the performance of these types of entropies. The results show that BispEn outperformed the other three entropy measures. The ANOVA test shows that BispEn can sense modulated signals when the Signal to Noise Ratio (SNR) is as low as -15 dB. This is at least a 5 dB improvement compared to the other entropies studied in this paper.
Keywords Entropy · Cognitive radio · Spectrum sensing

W. Zhu (B)
School of Electronic Engineering, University of Electronic Science and Technology of China, Chengdu, China
e-mail: wjzhu@uestc.edu.cn

J. Ma
School of Electronic Information Engineering, Tianjin University, Tianjin, China
e-mail: majg@tju.edu.cn

O. Faust
School of Engineering, Ngee Ann Polytechnic, Singapore, Singapore
e-mail: fol2@np.edu.sg

Abbreviations


AWGN     Additive White Gaussian Noise
ANOVA    Analysis of Variance
ApEn     Approximate Entropy
ASK      Amplitude-shift keying
BispEn   Bispectral Entropy
CR       Cognitive Radio
EEG      Electroencephalography
FCC      Federal Communications Commission
HRV      Heart Rate Variability
RenyiEn  Rényi Entropy
RRC      Root-raised Cosine
SamEn    Sample Entropy
SNR      Signal to Noise Ratio
OFDM     Orthogonal Frequency-division Multiplexing

1 Introduction
The radio spectrum used for wireless communication is by nature a limited resource. Transmitters which use a specific spectrum band need to be licensed by governments. Most of this precious resource is licensed to a single user with a static control scheme. The rapid growth of communication services and prosperous communication markets demands a high level of
spectrum efficiency. Haykin has shown that this requirement cannot be satisfied with a static
frequency allocation policy [1].
Cognitive Radio (CR) is a possible solution to the problem of spectrum shortage [2],
because it uses dynamic spectrum access techniques. The parameters of a CR are reconfigurable and they can be set such that they adapt to the surrounding radio scenario [3]; therefore, it is possible to increase the spectrum resource efficiency in both the time and space domains. In
CR terminology, primary users have higher priority or legacy rights on the usage of a specific
spectrum band, while secondary users have lower priority and they need to exploit the spectrum in a way which does not interfere with primary users [4]. Hence, secondary users have to
sense the spectrum in order to detect whether or not a particular band is occupied by primary
users. In order to know the surrounding radio scenario, CR systems have one important task
which is called spectrum sensing. To be specific, spectrum sensing means the detection of
unoccupied bands, the so called spectrum holes, that can be used for data transmission [5].
One of the challenges in spectrum sensing arises from the hidden primary user problem [6].
Figure 1 shows a scenario where the hidden primary user problem occurs. The CR system
cannot detect the presence of a signal which was transmitted by a primary user. The dashed
circles indicate the transmission areas. When the primary user receiver is in the intersection, the cognitive radio device cannot detect the existence of the primary user transmitter and therefore will cause interference to the primary user receiver. Secondary users, such as CRs, should have a sensitive detection method, which can sense the modulated signal at low Signal to Noise Ratio (SNR), to achieve high performance spectrum sensing. Having this ability means that the detection area of the CR device is expanded. Hence, there is less possibility for the hidden primary user problem to occur. Nagaraj proposed an entropy-based algorithm for spectrum sensing when the noise and interference level is unknown [7], which has a better performance than a cyclostationarity based detector. Another entropy-based detector, working in the frequency domain, was introduced in [8,9]. It can detect the primary user at a lower SNR than the scheme proposed in reference [7]. The results presented in these papers
show that entropy is an effective feature for spectrum sensing. It is useful to investigate
different entropies and their performance in spectrum sensing.


Fig. 1 Hidden primary user scenario

In this paper, we study approximate entropy, sample entropy, Rényi entropy and bispectral entropy and analyze their performance in detecting the presence of modulated signals at low SNR. Additive White Gaussian Noise (AWGN) is selected to model the background noise for this spectrum sensing application and Amplitude-shift keying (ASK) [10] is chosen as the modulation scheme for the primary users. Analysis of Variance (ANOVA) [11] tests are used to evaluate the performance of the four different entropies. Our results indicate that bispectral entropy outperforms approximate, sample and Rényi entropy. We support this claim with appropriate graphs and tables. The Bispectral Entropy (BispEn) can detect the modulated signal when the SNR is as low as -15 dB.
Section 2 introduces the methods we used in this paper. A MATLAB [12] simulation provides the data series at different SNRs. The detailed setup of the simulation is stated in this part. Then we introduce the definitions of the entropies used in this paper. A brief introduction of ANOVA is given in this part as well. The simulation results are presented in Sect. 3 and discussed in Sect. 4. This paper concludes with Sect. 5.

2 Methods Used
This section illustrates the simulation setup in detail. Monte Carlo simulations [13] in MATLAB generate groups of data series for spectrum sensing. A brief introduction of the spectrum
sensing model is given. We define the spectrum sensing task as identifying whether or not
the received signal contains a modulated signal. After that, the four types of entropies, used
in this paper, are presented as well as their mathematical representations. Then we introduce
the boxplot and ANOVA methods, which were used for the entropy data analysis.
The mathematical notation which describes the signals in this paper starts with the received wireless communication signal. We assume that the received signal is a time-discrete sequence y[n], where n is the discrete time variable which ranges from 1 to N and N is the length of the sequence. This signal is partitioned into individual sample sequences, which are modeled as m-dimensional vectors x_i^m:

x_i^m = (y[i], y[i+1], \ldots, y[i+m-1])    (1)

where i = 1, 2, \ldots, N - m + 1. The sample sequences x_i^m are used for the calculation of Approximate Entropy (ApEn) and Sample Entropy (SamEn).

Fig. 2 Simulation process
2.1 Simulation Setup
Figure 2 shows the simulation procedure. In this simulation, the primary users employed ASK
as the modulation scheme to create the transmission signal. Random binary data was chosen as the source information and a Root-raised Cosine (RRC) filter was used to reduce the occupied bandwidth. AWGN channels were used to simulate the transmission environment and different SNR values were set to get different groups of received signals. The power of the modulated signal with noise was normalized. All the data series were produced through simulations with
MATLAB. Four groups of entropy values, one group for each of the four entropy measures
introduced in Sect. 2.3, were calculated from the time series. Then the entropy values were
analyzed with boxplots and ANOVA tests.
The frequency values in the simulation were normalized to the symbol rate f_s:

\bar{f} = f / f_s    (2)

The carrier frequency of each modulation scheme was 5 and the sample rate was 200. Every stream of the modulation signal contained 200 symbols, which means that each data series contains 40,000 samples.
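A compact sketch of one such simulation trial is given below. The roll-off factor, the filter span and the use of the toolbox functions rcosdesign, upfirdn and awgn are assumptions made for illustration, since the paper does not list these implementation details.

```matlab
% One Monte Carlo trial of the simulation chain in Fig. 2 (assumed parameters).
Nsym  = 200;                          % symbols per stream
sps   = 200;                          % samples per symbol -> 40,000 samples per series
fc    = 5;                            % carrier frequency, normalized to the symbol rate
snrdB = -10;                          % one of the simulated SNR values

bits = randi([0 1], Nsym, 1);         % random binary source data
rrc  = rcosdesign(0.35, 6, sps);      % RRC pulse; roll-off 0.35 and span 6 are assumed
bb   = upfirdn(bits, rrc, sps);       % pulse-shaped binary ASK baseband
n    = (0:numel(bb) - 1).';
tx   = bb .* cos(2*pi*(fc/sps)*n);    % modulate onto the normalized carrier

rx = awgn(tx, snrdB, 'measured');     % AWGN channel at the chosen SNR
rx = rx ./ sqrt(mean(rx.^2));         % normalize the power of signal plus noise
```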
2.2 Spectrum Sensing Model
When the primary user is transmitting, the received signal for the CR is modeled as:
y(t) = s(t) + w(t)    (3)

In Eq. 3, s(t) is the modulated signal transmitted by the primary user while w(t) is the
noise from the channel. The following equation defines the discrete-time representation of
this relationship:


y[n] = s[n] + w[n],  n = 1, 2, \ldots, N    (4)

Mathematically, the spectrum sensing problem can be presented as the following two
hypotheses:
H0 : y[n] = w[n]
H1 : y[n] = s[n] + w[n]    (5)

The null hypothesis H0 means that there is no signal transmitted by primary users while
H1 denotes the presence of the primary signal.
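The paper evaluates the entropy features statistically with boxplots and ANOVA (Sects. 2.4 and 2.5) rather than with a fixed decision threshold. Purely as an illustration of how an entropy feature could drive the binary decision in Eq. (5), the following sketch uses a simple spectral Shannon entropy as a stand-in feature and an arbitrary threshold; none of these choices are taken from the paper.

```matlab
% Illustrative decision rule only (not the detector evaluated in this paper):
% a spectral Shannon entropy of the received block, compared against a
% threshold calibrated on noise-only data. Lower entropy suggests H1.
specEnt = @(y) -sum((abs(fft(y)).^2 / sum(abs(fft(y)).^2)) .* ...
                    log(abs(fft(y)).^2 / sum(abs(fft(y)).^2) + eps));

eta = 0.98 * specEnt(randn(40000, 1));   % assumed threshold from a noise reference
rx  = randn(40000, 1);                   % received block y[n] of Eq. (4); noise-only here
decideH1 = specEnt(rx) < eta;            % true -> decide H1, false -> decide H0
```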
2.3 Entropies
Entropy is widely used for measuring randomness, uncertainty and diversity of a signal.
There are different types of entropies for time series data. The modulated signal contains predictable structures which allow the receiver to access the transmitted information. Therefore, the entropy values of white Gaussian noise would be larger than those of the modulated signal.

2.3.1 Approximate Entropy (ApEn)


ApEn was proposed by Pincus as a measure of system complexity [14]. It is a quantitative regularity statistic which measures the predictability of time series fluctuations. We start the mathematical discussion of ApEn by defining:

C_i^m(r) = \frac{\sum_{j=1}^{N-m+1} \Theta(i, j, m, r)}{N - m + 1}    (6)

where \Theta(i, j, m, r) is defined below:

\Theta(i, j, m, r) = \begin{cases} 1, & \text{if } \| x_i^m - x_j^m \| < r \\ 0, & \text{otherwise} \end{cases}    (7)

where x_i^m is a vector which represents a sample sequence of the received signal, as shown in Eq. 1.
Let us define \Phi^m(r) as:

\Phi^m(r) = \frac{\sum_{i=1}^{N-m+1} \ln C_i^m(r)}{N - m + 1}    (8)

Finally, the definition of ApEn follows as:

ApEn(m, r, N) = \Phi^m(r) - \Phi^{m+1}(r)    (9)

If the time series is predictable, the past and current values enable us to predict future values and the ApEn decreases [15]. Due to the structured patterns introduced by modulation schemes, wireless communication signals are, to some extent, predictable, hence ApEn can be used as a feature for signal detection in spectrum sensing. It has also been used in biomedical engineering to identify abnormalities in biomedical signals, such as Electroencephalography (EEG) [16].
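A direct MATLAB sketch of Eqs. (6)–(9) is given below; the function name, the use of the maximum norm as the distance between sample sequences and the typical choice r = 0.2·std(y) are assumptions, since the paper does not state its implementation. A typical call would be apen_est(rx, 2, 0.2*std(rx)).

```matlab
function apen = apen_est(y, m, r)
% Approximate entropy of the sequence y, following Eqs. (6)-(9).
% m: embedding dimension, r: tolerance, e.g. 0.2*std(y) (assumed choice).
apen = phi_m(y, m, r) - phi_m(y, m + 1, r);
end

function p = phi_m(y, m, r)
% Phi^m(r) of Eq. (8), built from the correlation sums C_i^m(r) of Eq. (6).
y = y(:).';
M = numel(y) - m + 1;
X = zeros(M, m);
for i = 1:M
    X(i, :) = y(i:i + m - 1);                        % sample sequences of Eq. (1)
end
C = zeros(M, 1);
for i = 1:M
    d = max(abs(bsxfun(@minus, X, X(i, :))), [], 2); % maximum-norm distances
    C(i) = sum(d < r) / M;                           % C_i^m(r); self-match included
end
p = mean(log(C));
end
```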


2.3.2 Sample Entropy (SamEn)


SamEn is the negative natural logarithm of the conditional probability that two sequences which are similar for m points remain similar at the next point [17]. It has the advantage that the estimation accuracy does not depend on the time series length. Similar to the discussion of ApEn, we define preliminary functions before we combine them to form the definition of SamEn. We start with B_i^m(r):

B_i^m(r) = \frac{1}{N-m-1} \sum_{j=1, j \neq i}^{N-m} \Theta(i, j, m, r)    (10)

where \Theta is defined in Eq. (7). B^m(r) is defined as [17]:

B^m(r) = \frac{1}{N-m} \sum_{i=1}^{N-m} B_i^m(r)    (11)

A^m(r) is the corresponding value of B^m(r) computed with the (m+1)-dimensional vectors x_i^{m+1}. The equation for SamEn is:

SamEn(m, r, N) = -\ln \left( \frac{A^m(r)}{B^m(r)} \right)    (12)

SamEn is used as a measure of signal complexity. Al-Angari and Sahakian used it to evaluate the behavior of Heart Rate Variability (HRV) in the obstructive sleep apnea syndrome [18]. The SamEn-based scheme had an accuracy of 70.3 % in a minute-by-minute classification and proved to be a useful tool to detect apnea episodes during sleep.
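Following Eqs. (10)–(12), the sample entropy can be sketched as below; the function name and the maximum-norm distance are again assumptions, and self-matches are excluded as the definition requires.

```matlab
function se = samen_est(y, m, r)
% Sample entropy, Eqs. (10)-(12): -ln(A^m(r) / B^m(r)), without self-matches.
y  = y(:).';
B  = pair_count(y, m, r, numel(y) - m);
A  = pair_count(y, m + 1, r, numel(y) - m);
se = -log(A / B);
end

function c = pair_count(y, m, r, M)
% Number of template pairs (i, j), i < j <= M, of length m that are closer than r.
X = zeros(M, m);
for i = 1:M
    X(i, :) = y(i:i + m - 1);
end
c = 0;
for i = 1:M - 1
    d = max(abs(bsxfun(@minus, X(i + 1:end, :), X(i, :))), [], 2);
    c = c + sum(d < r);                              % similar pairs; no self-matches
end
end
```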
2.3.3 Rényi Entropy (RenyiEn)

RenyiEn is a generalization of Shannon entropy named after Alfréd Rényi [19]. RenyiEn extends Shannon entropy to a family of entropy measures, which is flexible in its applications. The definition of the Rényi entropy is:

RenyiEn_q = \frac{1}{1-q} \log_2 \sum_i [p_r(i)]^q    (13)

where p_r(i) represents the probabilities of {x_i}, q > 0 and q \neq 1. In the limit q \to 1, the Rényi entropy converges to the Shannon entropy. In this paper, we choose the q value as 2.5.
RenyiEn values reflect differences between modulation schemes; therefore, Tabatabaei et al. have used these values as one of four features for modulation recognition [20].
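The paper does not state how the probabilities p_r(i) are estimated; a common choice is an amplitude histogram of the received sequence, which the following sketch assumes (the bin count is likewise an assumption).

```matlab
function re = renyien_est(y, q, nbins)
% Renyi entropy of order q, Eq. (13), with pr(i) estimated from an amplitude
% histogram of the received sequence (the histogram estimate is an assumption).
if nargin < 2, q = 2.5; end       % order used in this paper
if nargin < 3, nbins = 128; end   % assumed discretization
counts = hist(y(:), nbins);       % coarse probability mass function estimate
p  = counts / sum(counts);
p  = p(p > 0);                    % drop empty bins before exponentiation
re = log2(sum(p.^q)) / (1 - q);
end
```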
2.3.4 Bispectral Entropy (BispEn)
BispEn is a kind of spectral entropy which is derived from the so-called bispectrum. The bispectrum is a statistic used to search for nonlinear interactions. We start the discussion of BispEn by defining the bispectrum [21]:

B(f_1, f_2) = E\left[ X(f_1) X(f_2) X^*(f_1 + f_2) \right]    (14)

where E[\cdot] denotes mathematical expectation and X(f) is the Fourier transform of the signal.


Fig. 3 Region of Ω

The BispEn is defined as [22]:

BispEn = -\sum_{\Omega} p_i \log p_i    (15)

where p_i is defined as follows:

p_i = \frac{|B(f_1, f_2)|}{\sum_{\Omega} |B(f_1, f_2)|}    (16)

Figure 3 shows the region \Omega, which is the area where f_1 > 0, f_2 > 0, f_1 > f_2 and f_1 + f_2 < 1.
BispEn has been used for acoustic vehicle detection [23], where it proved to be an effective method: the bispectral entropy detector can detect vehicles which are more than 1,000 m away, while a time domain detector only works when the vehicle is within 200 m and a frequency domain detector within 500 m. The results of applying BispEn to spectrum sensing are shown in Sect. 3.
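A direct, FFT-based estimate of Eqs. (14)–(16) is sketched below. The segment averaging, the FFT length and the exact discretization of the region Ω are assumptions, since the paper defines the region only qualitatively. A call such as bispen_est(rx, 512, 100) would, for example, split the 40,000-sample series into 100 segments; these values are illustrative only.

```matlab
function be = bispen_est(y, nfft, nseg)
% Bispectral entropy, Eqs. (14)-(16): the bispectrum is estimated by averaging
% X(f1)X(f2)X*(f1+f2) over nseg non-overlapping segments, then normalized over
% the non-redundant region of Fig. 3.
y    = y(:);
L    = floor(numel(y) / nseg);
half = floor(nfft / 2);
B    = zeros(half, half);
for k = 1:nseg
    seg = y((k - 1)*L + 1 : k*L);
    X   = fft(seg - mean(seg), nfft);
    for k1 = 1:half                       % frequency bins with f1 > 0
        for k2 = 1:k1                     % region: 0 < f2 <= f1
            if k1 + k2 <= half            % keep f1 + f2 inside the band
                B(k1, k2) = B(k1, k2) + X(k1 + 1)*X(k2 + 1)*conj(X(k1 + k2 + 1));
            end
        end
    end
end
p  = abs(B(:));
p  = p / sum(p);                          % p_i of Eq. (16)
p  = p(p > 0);
be = -sum(p .* log(p));                   % BispEn of Eq. (15)
end
```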
2.4 Boxplots
Boxplots are an excellent way to study data characteristics, because they depict the underlying statistical distribution of the data. The method was proposed by Tukey [24]. In a basic boxplot, five parameters visualize the data statistics: the 0.25 quartile, the median, the 0.75 quartile, the minimum value and the maximum value. These parameters characterize the statistical distribution of the data. Figure 4 shows an example boxplot. The boxplots of the different entropies are shown in Sect. 3.3.
2.5 Analysis of Variance (ANOVA)
Analysis of Variance (ANOVA) is a statistical method to test whether or not the means of several groups are all equal. The formal analysis of variance was proposed by Fisher [25]. The null hypothesis is that the means of all testing groups are statistically the same. An ANOVA test yields a p value which indicates whether or not the difference of the means is statistically significant. When the p value is small, the null hypothesis is rejected, which means that the testing groups have different mean values.
In this paper, ANOVA tests have been used to evaluate the effectiveness of the four different entropy measures for spectrum sensing. From the p values, we can identify which type of entropy has the best performance. This will help us to build better spectrum sensing algorithms.

Fig. 4 Boxplot example (the box spans the 0.25 and 0.75 quartiles around the median; the whiskers mark the minimum and maximum values)
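For completeness, a sketch of how both analyses could be run in MATLAB is shown below; boxplot and anova1 are Statistics Toolbox functions, and the entropy vectors are placeholders standing in for the precomputed groups described in Sect. 3.4.

```matlab
% Hypothetical analysis of one entropy measure at one SNR: 200 noise-only
% entropy values against 200 entropy values from the modulated signal.
noiseEntropy  = randn(200, 1);           % placeholder for the H0 group
signalEntropy = randn(200, 1) - 0.1;     % placeholder for the H1 group

values = [noiseEntropy; signalEntropy];
groups = [zeros(200, 1); ones(200, 1)];  % 0 = noise only, 1 = modulated signal

boxplot(values, groups, 'Labels', {'H0', 'H1'});  % distributions, as in Fig. 6
p = anova1(values, groups, 'off');                % p value, as reported in Table 2
```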

3 Simulation Results
This section presents the performance results of the four different entropy measures. The
time series data, which models the received signal for Cognitive Radio (CR), comes from
the simulation and all the detailed information about it is stated in Sect. 2.1. The entropy
values were calculated from the time series in MATLAB. We present the values of all four
types of entropies introduced in Sect. 2.3. To present the results, we have plotted the averaged
entropy value curves and boxplots. These plots show the performance of different entropies for spectrum sensing. The ANOVA test results show the significance of the difference
between signals with low SNR and pure white noise.
3.1 Entropy Values
Through Monte Carlo simulations, we extracted the individual entropy values from the signals. Table 1 shows means and standard deviations for each set of 200 raw entropy values.
The raw entropy values were used for the following analysis.
3.2 Averaged Entropy Values
In this part, we investigate averaged entropy values to see how sensitive they are to the Signal
to Noise Ratio (SNR). Figure 5 shows these values for the four types of entropies. Each result
was obtained by averaging 200 Monte Carlo trials. All the values are normalized by the mean
of the entropy value of white Gaussian noise. As the difference between ApEn, SamEn and RenyiEn cannot be seen clearly in Fig. 5a, Fig. 5b shows these three entropies in detail.
Figure 5a depicts the excellent performance of BispEn. The BispEn values are more clearly distinguishable from the noise reference than those of the other entropies at low SNRs. Figure 5b indicates that SamEn has a better performance than ApEn. Table 1 indicates that the standard deviation of SamEn is larger than that of ApEn.


Table 1 Entropy values (mean ± standard deviation over 200 trials)

SNR (dB)      ApEn              SamEn             RenyiEn           BispEn
              2.3179 ± 0.0011   2.1625 ± 0.0025   6.9905 ± 0.0010   0.7150 ± 0.0432
              2.3218 ± 0.0010   2.1698 ± 0.0025   6.9913 ± 0.0010   0.7369 ± 0.0486
              2.3248 ± 0.0007   2.1745 ± 0.0026   6.9917 ± 0.0011   0.7719 ± 0.0344
              2.3269 ± 0.0006   2.1785 ± 0.0025   6.9922 ± 0.0011   0.7930 ± 0.0292
              2.3282 ± 0.0006   2.1805 ± 0.0025   6.9923 ± 0.0010   0.8165 ± 0.0262
              2.3292 ± 0.0006   2.1822 ± 0.0023   6.9925 ± 0.0010   0.8378 ± 0.0162
              2.3297 ± 0.0005   2.1833 ± 0.0023   6.9927 ± 0.0011   0.8519 ± 0.0149
              2.3301 ± 0.0005   2.1841 ± 0.0022   6.9927 ± 0.0010   0.8616 ± 0.0138
              2.3304 ± 0.0005   2.1843 ± 0.0024   6.9927 ± 0.0010   0.8729 ± 0.0109
              2.3306 ± 0.0005   2.1847 ± 0.0021   6.9928 ± 0.0010   0.8792 ± 0.0091
-10           2.3307 ± 0.0004   2.1848 ± 0.0025   6.9927 ± 0.0010   0.8836 ± 0.0069
White noise   2.3309 ± 0.0005   2.1851 ± 0.0024   6.9928 ± 0.0010   0.8981 ± 0.0041

Fig. 5 Averaged entropy values, normalized to the mean entropy of white Gaussian noise, versus SNR (dB): a all four entropies (ApEn, BispEn, SamEn, RenyiEn), b ApEn, SamEn and RenyiEn in detail


Fig. 6 Boxplots of entropies (normalized entropy value versus SNR in dB). a ApEn, b SamEn, c RenyiEn, d BispEn

3.3 Boxplots
Figure 6 shows the boxplots of four types of entropies with each result characterized by 200
trials. Each entropy value, shown in the plots, is normalized to the corresponding averaged
entropy value of white Gaussian noise.
Figure 6 indicates that different entropies have different statistical distributions. As depicted in Fig. 6a, b, ApEn has a smaller variance than SamEn. According to Fig. 6c, the RenyiEn values are insensitive to the modulated signal when the SNR is below -5 dB. The BispEn values are below 1, which means they are smaller than the mean BispEn value of white Gaussian noise; this indicates its good performance at low SNR.
3.4 ANOVA Results
The ANOVA tests in this part indicate whether or not the entropy measures are able to discriminate between a modulated signal in noise and pure noise. To conduct this test we have
formed two groups of data. One group contains 200 sets of entropy values which result from
white Gaussian noise. The other group is composed of 200 sets of entropy values which were extracted from modulated signals with SNRs in the range from -16 to -5 dB.


Table 2 Entropy ANOVA test results (p value)

SNR (dB)   ApEn             SamEn            RenyiEn        BispEn
-16        0.9134           0.8558           0.3593         0.0698
-15        0.1473           0.5804           0.1547         1.151 × 10^-6
-14        0.7281           0.8990           0.3891         2.581 × 10^-13
-13        0.6116           0.7256           0.6007         2.431 × 10^-25
-12        0.5629           0.2881           0.8630         1.113 × 10^-33
-11        0.0576           0.4141           0.5786         7.871 × 10^-56
-10        2.108 × 10^-5    0.1228           0.1111         3.353 × 10^-86
-9         1.255 × 10^-10   0.0483           0.6812         4.534 × 10^-91
-8         3.012 × 10^-27   3.192 × 10^-4    0.1537         4.726 × 10^-111
-7         6.732 × 10^-44   4.496 × 10^-6    0.0345         4.252 × 10^-127
-6         2.454 × 10^-78   1.977 × 10^-13   0.1017         2.422 × 10^-149
-5         8.294 × 10^-127  2.144 × 10^-29   1.150 × 10^-4  8.658 × 10^-177

Table 2 presents the ANOVA test results for the entropy values. Two groups are used for each ANOVA test: one contains the entropy values of the noise-only signal from the channel and the other the corresponding values of the signal that contains the modulated signal. Each group contains 200 samples. The p values in the table denote whether the difference between the mean values of the two groups is statistically significant. BispEn achieves a sharp detection at -15 dB, while RenyiEn only shows the difference at -5 dB. The same results can be seen in Fig. 6.

4 Discussion
Entropy is a quantified measure of randomness, uncertainty and diversity within a signal.
Shannon introduced the concept of entropy into information theory [26] and since then it
became an important signal feature. Of all possible signals, white Gaussian noise has the
highest entropy value [27]. When a signal contains regular patterns its entropy decreases.
This property was utilized to detect radar signals in the presence of white noise [28]. As the
signal transmitted by primary users contains structure, its entropy value should be different from that of white Gaussian noise. Therefore, entropy can be used as a feature for spectrum
sensing.
Since Shannon first came up with the idea of information entropy, the science of signal
processing has made tremendous progress. Now, there is a whole range of different entropy measures, such as approximate entropy, spectral entropy, state entropy and, of course, Shannon entropy [15]. Different entropies have different properties which make them more
or less suitable for specific applications. A comparative investigation of these entropies is
helpful for us to build better spectrum sensing algorithms.
In this paper, we have conducted a comparative study which investigates the performance
of ApEn, SamEn, RenyiEn and BispEn for spectrum sensing. Table 1 gives the entropy values extracted from Monte Carlo simulations. Figure 5 depicts the averaged entropy values. The boxplots of each entropy for different SNR values are shown in Fig. 6. Table 2 presents


ANOVA test results. Through all these results, we can identify which type of entropy is more
suitable for spectrum sensing.
The information we can extract from Fig. 5 and Table 2 is that BispEn is the most effective
method for spectrum sensing. Related research has shown that entropy measures extracted
from frequency domain signals outperform similar measures from time domain signals.
Therefore, Zhang et al. [8,9] developed a frequency domain detector which has a better performance than a corresponding time domain system. With a detection probability of 0.9 and a false alarm rate of 0.08, the frequency domain detector has a 5 dB performance improvement over the time domain one proposed in reference [7]. BispEn is a kind of spectral entropy, so it shares this advantage in spectrum sensing. This indicates that the structure of modulated signals can be measured better by entropy in the frequency domain than in the time domain.
RenyiEn has a poor performance when the SNR is low and therefore it is not fit for this application. As mentioned before, it was used as a feature for modulation classification [20]. Although it can classify three types of digital modulation schemes, it is not sensitive enough for spectrum sensing when the SNR is low. Therefore, it is not a good feature for spectrum sensing.
Figure 6 presents the variance within the entropy value groups. Compared to SamEn and BispEn, ApEn has a relatively small variation, which means that it is less sensitive to the information source than the other entropies. This helps if a single entropy value is calculated to do the classification. Figure 6d shows that the variance of BispEn is related to the SNR. This relationship needs further investigation.
Shannon entropy has been used to detect an IEEE 802.11a Orthogonal Frequency-division Multiplexing (OFDM) waveform in noise [29]. The specific waveform features are identifiable when the SNR is 5 dB. Chen and Nagaraj investigated the estimated Shannon entropy for an entropy-based detector [30]; the entropy values cannot show the difference between H0 and H1, defined in Eq. (5), when the SNR is as low as 0 dB. A cross entropy based spectrum sensing technique was introduced in [31] and has a better performance than the Shannon entropy. However, compared to the curves we obtained in Fig. 6, the cross entropy curves show only a smaller difference between the values of white Gaussian noise and the low SNR modulated signal.
BispEn has outperformed the other three types of entropies which were investigated in this paper. It can detect the modulated signal when the SNR is as low as -15 dB, which means it is suitable for spectrum sensing. Detectors for spectrum sensing which are based on this feature promise a good performance.

5 Conclusion
This paper presents a comparative study of four types of entropies. The results show us which
type of entropy is more suitable for spectrum sensing. The work in this paper is helpful for building a well-performing spectrum sensing algorithm. Monte Carlo methods and ANOVA tests show that BispEn has the best performance among the four types of entropies. According to Table 2, BispEn has at least a 5 dB improvement over the other three types of entropies. This shows that BispEn is more suitable for spectrum sensing.
The divergence in performance motivates us to carry out further investigations on the detector. Based on the results we have obtained in this paper, we can study more spectral entropies and build spectrum sensing schemes with better performance. So far we have only


carried out a comparative study on the feature itself. As future work, we need to analyze
different classification algorithms with the promising entropy features.
Acknowledgments We are grateful for the help from U. Rajendra Acharya in designing the entropy
calculation programs. Finally, we would like to acknowledge that this work was supported by Tianjin
Science and Technology Support Key Project Plan (10ZCKFGX01500) and National Key Special Project
(2012ZX03004008).

References
1. Haykin, S. (2005). Cognitive radio: Brain-empowered wireless communications. Selected Areas in Communications, IEEE Journal on, 23(2), 201–220.
2. Sridhara, K., Chandra, A., & Tripathi, P. (2008). Spectrum challenges and solutions by cognitive radio: An overview. Wireless Personal Communications, 45, 281–291.
3. Federal Communications Commission. (2005). Notice of proposed rule making and order: Facilitating opportunities for flexible, efficient, and reliable spectrum use employing cognitive radio technologies. ET Docket No. 03-108.
4. Gavrilovska, L., & Atanasovski, V. (2011). Spectrum sensing framework for cognitive radio networks. Wireless Personal Communications, 59, 447–469.
5. Budiarjo, I., Lakshmanan, M., & Nikookar, H. (2008). Cognitive radio dynamic access techniques. Wireless Personal Communications, 45, 293–324.
6. Yucek, T., & Arslan, H. (2009). A survey of spectrum sensing algorithms for cognitive radio applications. Communications Surveys Tutorials, IEEE, 11(1), 116–130.
7. Nagaraj, S. V. (2009). Entropy-based spectrum sensing in cognitive radio. Signal Processing, 89(2), 174–180.
8. Zhang, Y., Zhang, Q., & Wu, S. (2010). Entropy-based robust spectrum sensing in cognitive radio. Communications, IET, 4(4), 428–436.
9. Zhang, Y. L., Zhang, Q. Y., & Melodia, T. (2010). A frequency-domain entropy-based detector for robust spectrum sensing in cognitive radio networks. Communications Letters, IEEE, 14(6), 533–535.
10. Proakis, J. G., & Salehi, M. (2008). Digital communications (5th edn.). NY: McGraw Hill.
11. Harris, R. J. (1994). ANOVA: An analysis of variance primer. Itasca: F E Peacock Pub.
12. The MathWorks Inc. (2007). MATLAB version 7.4.0 (R2007a).
13. Fishman, G. S. (1995). Monte Carlo: Concepts, algorithms, and applications. New York: Springer.
14. Pincus, S. M. (1995). Approximate entropy (ApEn) as a complexity measure. Chaos, 5(1), 110–117.
15. Bein, B. (2006). Entropy. Best Practice and Research Clinical Anaesthesiology, 20(1), 101–109.
16. Faust, O., Acharya, U. R., Molinari, F., Chattopadhyay, S., & Tamura, T. (2012). Linear and non-linear analysis of cardiac health in diabetic subjects. Biomedical Signal Processing and Control, 7(3), 295–302.
17. Richman, J. S., & Moorman, J. R. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology. Heart and Circulatory Physiology, 278(6), H2039–2049.
18. Al-Angari, H. M., & Sahakian, A. V. (2007). Use of sample entropy approach to study heart rate variability in obstructive sleep apnea syndrome. Biomedical Engineering, IEEE Transactions on, 54(10), 1900–1904.
19. Rényi, A. (1960). On measures of entropy and information. In Proceedings of the 4th Berkeley symposium on mathematics, statistics and probability (pp. 547–561).
20. Tabatabaei, T. S., Krishnan, S., & Anpalagan, A. (2010). SVM-based classification of digital modulation signals. In Systems, man and cybernetics (SMC), 2010 IEEE international conference on (pp. 277–280).
21. Nikias, C. L., & Raghuveer, M. R. (1987). Bispectrum estimation: A digital signal processing framework. Proceedings of the IEEE, 75(7), 869–891.
22. Mohebbi, M., & Ghassemian, H. (2012). Prediction of paroxysmal atrial fibrillation based on non-linear analysis and spectrum and bispectrum features of the heart rate variability signal. Computer Methods and Programs in Biomedicine, 105(1), 40–49.
23. Bao, M., Zheng, C., Li, X., Yang, J., & Tian, J. (2009). Acoustical vehicle detection based on bispectral entropy. Signal Processing Letters, IEEE, 16(5), 378–381.
24. Tukey, J. W. (1977). Exploratory data analysis. New York: Addison-Wesley.


25. Fisher, R. A. (1918). The correlation between relatives on the supposition of Mendelian inheritance. Philosophical Transactions of the Royal Society of Edinburgh, 52, 399–433.
26. Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423; 623–656.
27. Renevey, P., & Drygajlo, A. (2001). Entropy based voice activity detection in very noisy conditions. In Proceedings of 7th European conference on speech communication and technology, EUROSPEECH 2001 (pp. 1887–1890).
28. Zhang, Q. T. (1989). An entropy-based receiver for the detection of random signals and its application to radar. Signal Processing, 18(4), 387–396.
29. Rehm, C. R., Temple, M. A., Raines, R. A., & Mills, R. F. (2006). Entropy-based spectral processing on the IEEE 802.11a OFDM waveform. In Military communications conference, 2006. MILCOM 2006. IEEE (pp. 1–5).
30. Chen, X., & Nagaraj, S. (2008). Entropy based spectrum sensing in cognitive radio. In Wireless telecommunications symposium, 2008. WTS 2008 (pp. 57–61).
31. Gu, J., Liu, W., Jang, S. J., & Kim, J. M. (2010). Cross entropy based spectrum sensing. In Communication technology (ICCT), 2010 12th IEEE international conference on (pp. 373–376).

Author Biographies
Wanjing Zhu received his B.Sc. and B.Ec. degrees from the University
of Electronic Science and Technology of China, Chengdu, China. He
is currently a Master's student at the same university under Prof. Jianguo
Ma. He is now doing his internship as a Research Assistant under the
supervision of Dr. Oliver Faust in the Digital Signal Processing Centre, Ngee Ann Polytechnic, Singapore. His research interests are Digital
Communication, Cognitive Radio (CR), Modulation Recognition and
Digital Signal Processing techniques for them.

Jianguo Ma (M'96, SM'97) received the B.Sc. degree from Lanzhou University, Lanzhou, China, in 1982, and the doctoral degree in engineering from Duisburg University, Duisburg, Germany. He
was with Technical University of Nova Scotia (TUNS), Halifax, NS,
Canada from April 1996 to September 1997 as a postdoctoral fellow.
He was with Nanyang Technological University (NTU), Singapore,
from October 1997 to November 2005 as a faculty member, where he
was also the founding director of the Center for Integrated Circuits
and Systems, NTU. From December 2005 to Oct. 2009, he was with
University of Electronic Science and Technology of China (UESTC),
Chengdu, China. He has been the Technical Director of the Tianjin IC Design Center since November 2008 and, since October 2009, he serves as the Dean of the School of Electronic Information Engineering at Tianjin University. He has also been the founding director of the Center for IC & Computing Systems of Tianjin since May 2010. His research
interests are: RFICs and RF integrated systems for wireless, RF device
characterization modeling, MMIC, RF/Microwave Circuits & Systems,
EMI in wireless, RFID & Wireless sensing network. In these areas, he has published about 269 technical
papers (120 are in SCI cited journals), six U.S. patents granted and 15 filed/granted China patents, and two
books. Dr. Ma served as the Associate Editor of IEEE Microwave and Wireless Components Letters from
January 2004 to December 2005. He was one of the founding members for IEEE Chengdu Section and
served as the External Liaison Chair and Technical Activity Chair for IEEE Chengdu Section; He founded


the IEEE EDS Chengdu Chapter. Dr. Ma is the Changjiang Professor awarded by the Ministry of Education
of China, he is also the Distinguished Young Investigator awarded by National Natural Science Foundation
of China. He is the member for IEEE University Program ad hoc Committee.

Oliver Faust In 2001, Oliver Faust completed his first degree in


Germany at a private university of applied science. Subsequently, he
received his PhD from the University of Aberdeen (Scotland UK) in
2006. After that he worked for 2 years as a postdoctoral researcher
at the same institution. Subsequently, he moved to Belgium to work
for 1 year in a private company as a hardware researcher. Currently,
he is visiting faculty at Ngee Ann Polytechnic in Singapore. There he
pursues his research interests in digital communication, nonlinear and cognitive signal processing.
