
Applied Mathematics and Computation 216 (2010) 2687–2699

Contents lists available at ScienceDirect

Applied Mathematics and Computation


journal homepage: www.elsevier.com/locate/amc

Chaotic harmony search algorithms


Bilal Alatas
Firat University, Faculty of Engineering, Department of Computer Engineering 23119, Elazig, Turkey

a r t i c l e   i n f o

Keywords:
Harmony search
Chaos
Performance

a b s t r a c t
Harmony Search (HS) is one of the newest and easiest-to-code music-inspired heuristics for optimization problems. Inspired by the use of chaos in adjusting note parameters such as pitch, dynamics, rhythm, duration, tempo, instrument selection, attack time, etc. in real music and in sound synthesis and timbre construction, this paper proposes new HS algorithms that use chaotic maps for parameter adaptation in order to improve the convergence characteristics and to prevent HS from getting stuck at local solutions. This has been done by using chaotic number generators each time a random number is needed by the classical HS algorithm. Seven new chaotic HS algorithms have been proposed and different chaotic maps have been analyzed on benchmark functions. It has been observed that coupling emergent results from different areas, such as HS and complex dynamics, can improve the quality of results in some optimization problems. It has also been shown that some of the proposed methods somewhat increase solution quality; that is, in some cases they improve the global searching capability by escaping local solutions.
© 2010 Elsevier Inc. All rights reserved.

1. Introduction
Classical optimization algorithms are inflexible when adapting the solution procedure to an optimization problem. Generally a given problem is modeled in such a way that a classical algorithm can handle it. This usually requires making several assumptions and/or modifications which might not be easy to validate in many situations. These modifications and/or assumptions on the original problem parameters (rounding variables, softening constraints, etc.) certainly affect the solution quality [1]. They are insufficient if integer and/or discrete decision variables are required in optimization models [1]. Solution strategies of classical optimization algorithms generally depend on the type of objective and constraint functions (linear, nonlinear, etc.) and the type of variables used in the problem model (integer, real, etc.). Their efficiency also depends strongly on the size of the solution space, the number of variables and constraints used in the problem model, and the structure of the solution space (convex, non-convex, etc.). Briefly, they do not offer general solution strategies that can be applied to problem formulations where different types of variables, objective, and constraint functions are simultaneously required by the optimization problems [1].
The inefficiency of classical optimization algorithms in solving large-scale and/or highly nonlinear problems has forced researchers to find more flexible and adaptable general-purpose novel algorithms. Problem- and model-independent heuristic optimization algorithms have been proposed by researchers to overcome the drawbacks of classical optimization procedures. These algorithms are efficient and flexible, and they can be modified and/or adapted to suit specific problem requirements. Research on these algorithms is still continuing all around the globe. Fig. 1 shows the classification of the heuristic algorithms.

E-mail addresses: balatas@firat.edu.tr, bilalalatas@yahoo.com


0096-3003/$ - see front matter © 2010 Elsevier Inc. All rights reserved.
doi:10.1016/j.amc.2010.03.114


[Fig. 1 is a classification tree of heuristics along several dimensions: physics based, social based, musical based, or biology based; single point or multi point; stable or dynamic objective function; single or dynamic neighborhood; without memory or with memory; and hybrid.]

Fig. 1. Heuristic algorithms.

A meta-heuristic algorithm mimicking the improvisation process of music players has been developed and named Harmony Search (HS) [2–7]. HS has several advantages with respect to traditional optimization techniques, such as the following:
(a) HS imposes fewer mathematical requirements.
(b) HS is free from divergence.
(c) HS does not require initial value settings for the decision variables; thus it may escape local optima. Furthermore, it may be easily adapted for multi-modal problems [8].
(d) As HS uses stochastic random searches, derivative information is unnecessary. HS has a novel stochastic derivative [9].
(e) HS can handle both discrete and continuous variables.
(f) The HS algorithm can overcome the drawback of the genetic algorithm's building-block theory by considering the relationships among decision variables using its ensemble operation. HS generates a new vector after considering all of the existing vectors, whereas the genetic algorithm considers only the two parent vectors [10].
These features increase the flexibility of the HS algorithm and produce better solutions.
HS is good at identifying the high-performance regions of the solution space in a reasonable time, but it gets into trouble when performing local search in numerical applications. Researchers are still trying to improve the fine-tuning characteristic and convergence rate of the HS algorithm [4,5,11,12].
Nonlinear dynamic systems have been iteratively used to generate chaotic sequences of numbers that are then mapped to
various note parameters (pitch, dynamic, rhythm, duration, tempo, instrument selection, attack time, etc.) in real music. Four
pioneers of these methods are Jeff Pressing, Michael Gogins, Rick Bidlack, and Jeremy Leach [13]. Chaos has also been used in
sound synthesis and timbre construction [13].
Many chaotic maps in the literature possess certainty, ergodicity, and the stochastic property. Recently, chaotic sequences have been adopted instead of random sequences, and very interesting and somewhat good results have been reported in many applications [14–16]. They have also been used together with some heuristic optimization algorithms [17–19] to express optimization variables. The choice of chaotic sequences is justified theoretically by their unpredictability, i.e., by their spread-spectrum characteristic, non-periodic, complex temporal behavior, and ergodic properties.
In this paper, sequences generated from different chaotic systems substitute for random numbers in different parameters of HS, a music-inspired heuristic algorithm, wherever a random-based choice is necessary. For this purpose, different HS methods that use chaotic maps as efficient alternatives to pseudorandom sequences have been proposed. In this way, it is intended to enhance the global convergence and to prevent sticking at a local solution. However, in general, it is hard to


estimate by statistical tests how good most chaotic random number generators are, as they do not follow the uniform distribution. The simulation results show that applying deterministic chaotic signals instead of random sequences may be a possible strategy to improve the performance of HS.
The remainder of this paper is organized as follows. A review of HS is given in Section 2. Section 3 offers a short introduction to improvements of HS. Section 4 describes the proposed methods, Chaotic Harmony Search Algorithms (CHSAs). Section 5 describes the benchmark problems used for comparison of the proposed methods. In Section 6, the proposed methods are tested on the benchmark problems, and the simulation results are compared with those obtained via other algorithms that have been reported to have good performance. Finally, in Section 7, the conclusion is drawn based on the reported comparison analysis.
2. Harmony search algorithm
Harmony Search (HS) algorithm, originated by Geem et al. [2], is based on the natural musical performance processes that occur when a musician searches for a better state of harmony [2–7]. The resemblance, for example, between jazz improvisation, which seeks to find musically pleasing harmony, and optimization is that the optimum design process seeks to find the optimum solution as determined by the objective function. The pitch of each musical instrument determines the aesthetic quality, just as the objective function value is determined by the set of values assigned to each design variable. Aesthetic sound quality can be improved practice by practice, just as the objective function value can be improved iteration by iteration [2,3].
The analogy between improvisation and optimization is shown in Fig. 2. Each musician (double bassist, guitarist, and saxophonist) has some notes in memory, and they correspond to the decision variables x1, x2, and x3. The range of each music instrument (double bass = {Do, Re, Mi}; guitar = {Mi, Fa, Sol}; and saxophone = {La, Si, Do}) corresponds to each variable's value set (x1 = {1.2, 2.2, 3.1}; x2 = {3.2, 2.4, 1.8}; and x3 = {1.7, 2.8, 2.3}). If the double bassist plucks the note Mi, the guitarist plucks Sol, and the saxophonist toots Do, their notes together make a new harmony (Mi, Sol, Do). They intend to improvise a new harmony by considering in their minds which note to play. If the new harmony is better than the existing worst harmony, the new harmony is kept. Likewise, the new solution vector (1.2, 3.0, 1.6) generated in the optimization process is kept if it is better than the existing worst harmony in terms of objective function value. Just as the harmony quality is enhanced practice by practice, the solution quality is enhanced iteration by iteration. The HS algorithm was originally developed for discrete optimization and later expanded to continuous optimization [2,3].
According to the above algorithmic concept, the HS algorithm, whose optimization procedure is shown in Fig. 3, consists of the following five steps:

Step 1: Problem and algorithm parameter initialization
Step 2: Harmony memory initialization and evaluation
Step 3: New harmony improvisation
Step 4: Harmony memory update
Step 5: Termination criterion check
[Fig. 2 depicts the three musicians' memories ({Do, Re, Mi}, {Mi, Fa, Sol}, {La, Si, Do}) alongside the variable value sets (x1 = {1.2, 2.2, 3.1}, x2 = {3.2, 2.4, 1.8}, x3 = {1.7, 2.8, 2.3}). The new harmony is formed by HM considering (Do considered, Do chosen), pitch adjusting (Mi considered, Mi Sharp chosen), and random selection (none considered, Sol chosen), yielding the new solution vector f(1.2, 3.0, 1.6).]

Fig. 2. Analogy between improvisation and optimization (adapted from [3]).


[Fig. 3 is a block diagram of HS:
Step 1: Initialize the problem parameters (objective function f(x); decision variables xi; number of decision variables N) and the algorithm parameters (harmony memory size HMS; harmony memory considering rate HMCR; pitch adjustment rate PAR; number of improvisations NI; distance bandwidth bw).
Step 2: Initialize the harmony memory (HM) with as many random vectors as HMS, and evaluate HM.
Step 3: Improvise a new harmony: with probability HMCR, select a new value for a variable from HM, then with probability PAR choose a neighboring value (with probability 1 − PAR do nothing); with probability 1 − HMCR, select a new value from the possible value set.
Step 4: If the new harmony vector is better than the existing harmony vectors in the HM, update HM.
Step 5: If the termination criteria are met, output HM; otherwise return to Step 3.]

Fig. 3. Block diagram of HS.

The pseudo-code for the HS algorithm is shown in Fig. 4. The five steps of HS are described in the next five subsections.
2.1. Problem and algorithm parameter initialization
The optimization problem is specified as follows:

Minimize f(x) subject to x_j ∈ X_j, j = 1, 2, ..., N,

where f(x) is an objective function; x is the set of decision variables x_j; N is the number of decision variables; X_j is the set of the possible range of values for each decision variable, where x_j^min and x_j^max are the lower and upper bounds of the jth


Fig. 4. Pseudo-code of HS.

decision parameter, respectively. The HS algorithm parameters are also specified in this step. These are the harmony memory size (HMS), or the number of solution vectors in the harmony memory; harmony memory considering rate (HMCR); pitch adjusting rate (PAR); bandwidth distance (bw); and the number of improvisations (NI), or stopping criterion. Recently the bandwidth (bw) term has been renamed fret width (fw), as it is a more musical term [20]. However, in this paper bw will be used. The harmony memory (HM) is a memory location where all the solution vectors (sets of decision variables) are stored. This HM is similar to the gene pool in the genetic algorithm [21]. Here, HMCR, PAR, and bw are used to improve the solution vector. They are defined in Step 2.3.
2.2. Harmony memory initialization and evaluation
In Step 2, the HM matrix is filled with as many randomly generated solution vectors as the HMS. This matrix has N columns, where N is the total number of decision variables, and HMS rows, which are selected in the first step. This initial memory is created by assigning to each decision parameter of each vector random values that lie inside the lower and upper bounds of the decision variable, as shown in (1):

x^0_{i,j} = x_j^min + r_j · (x_j^max − x_j^min),   i = 1, ..., HMS,   j = 1, ..., N,   (1)

where x_j^min and x_j^max are the lower and upper bounds of the jth decision parameter, respectively, and r_j ∈ (0, 1) is a uniformly distributed random number generated anew for each value of j. Pseudo-code of the memory initialization is shown in Fig. 5. Thus, the HM form can be shown as in Fig. 6.
The candidate solution vectors in HM shown in Fig. 6 are then analyzed and their objective function values are calculated, f(x_{i,:}), i = 1, 2, ..., HMS.
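As a minimal sketch (not the author's code), the initialization of Eq. (1) can be written in Python; the function and parameter names are illustrative:

```python
import random

def init_harmony_memory(hms, lower, upper, rng=random):
    """Fill HM with HMS random vectors inside the variable bounds, Eq. (1):
    x0[i][j] = lower[j] + r * (upper[j] - lower[j]), with r ~ U(0, 1)."""
    n = len(lower)
    return [[lower[j] + rng.random() * (upper[j] - lower[j]) for j in range(n)]
            for _ in range(hms)]

# Example: a 5-vector memory for a 2-variable problem bounded by [-50, 50]
hm = init_harmony_memory(5, [-50.0, -50.0], [50.0, 50.0])
```

Each row of the returned list is one candidate solution vector, matching the HM form of Fig. 6.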
2.3. New harmony improvisation
In Step 3, a new harmony vector, x'_i = (x'_{i,1}, x'_{i,2}, ..., x'_{i,N}), is generated based on three rules: memory consideration, pitch adjustment, and random selection. The value of a design variable can be selected from the values stored in HM with a probability of the harmony memory considering rate (HMCR). It can be further adjusted by moving to a neighbor of a value selected from the HM with a probability of the pitch adjusting rate (PAR). Or, it can be selected randomly from the set of all candidate values, without considering the stored values in HM, with probability (1 − HMCR). The improvisation process


Fig. 5. Memory initialization.

Fig. 6. HM form.

of the HS algorithm is depicted in Fig. 2. The for...do–end do loop in Fig. 4 depicts this step of HS. Memory consideration, pitch adjustment, or random selection is applied to each variable of the new harmony vector in turn.
2.3.1. Memory consideration
In Fig. 2, the first musician has a memory containing three notes, {Do, Re, Mi}. He decides to choose Do from his memory and plays it directly. Likewise, if the first decision variable represents the first musician, the value 1.2 can be chosen from memory. In the memory consideration part of HS, the values of the decision variables x'_{i,j} for the new vectors are chosen from any of the values in the specified HM range. The HMCR, which varies between 0 and 1, is the rate of choosing one value from the historical values stored in the HM, while (1 − HMCR) is the rate of randomly selecting one value from the possible range of values, as shown in Fig. 7.
2.3.2. Pitch adjustment
In Fig. 2, the second musician also has a three-note memory, {Mi, Fa, Sol}. Unlike the first musician, he first chooses the note Mi. Then, he can play a neighboring pitch such as Mi Sharp. Likewise, the value 3.2 is chosen from the memory and then adjusted to a neighboring value, 3.0, in this step.
Likewise, in the pitch adjustment step of the algorithm, every component obtained by memory consideration is examined to determine whether it should be pitch-adjusted. The second if–end if part of the pseudo-code shown in Fig. 4 explains this step of HS. This operation uses the PAR parameter, which is the rate of pitch adjustment; its pseudo-code can be seen in Fig. 8. Here r is a random number between 0 and 1, and bw is an arbitrary distance bandwidth.
2.3.3. Random selection
In Fig. 2, the last musician has some notes in his memory as well, {La, Si, Do}. Although this memory was used during the past improvisations, thanks to his musical knowledge he can also play all possible pitches, {Do, Re, Mi, Fa, Sol, La, Si, Do+}. Thus,
Fig. 7. Memory consideration.


Fig. 8. Pitch adjustment.

when he decides to play a note randomly, he can choose any of these notes, e.g. Sol, as shown in Fig. 2. Being in the possible data set, 1.6 can be chosen randomly in this step, even if it does not exist in the memory. After each musician has decided what to play, the new harmony is composed of (Do, Mi#, Sol). Similarly, a new solution vector is determined as (1.2, 3.0, 1.6).
This step of HS is shown in the else part of the memory consideration step. The value of the decision variable x'_{·,j} is randomly chosen within the value range X_j.
2.4. Harmony memory update
If the newly generated harmony vector gives a better function value than the worst one, the new harmony vector is included in the HM and the worst harmony is excluded.
2.5. Termination criterion check
The HS algorithm is terminated when the termination criterion (e.g. maximum number of improvisations) has been met.
Otherwise, Steps 2.3 and 2.4 are repeated.
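The five steps above can be combined into a compact sketch. This is an illustrative Python implementation under the notation of Section 2, not the paper's exact pseudo-code; the symmetric bw perturbation and the test function are assumptions:

```python
import random

# Illustrative HS sketch following Steps 1-5 of Section 2; the parameter names
# (HMS, HMCR, PAR, bw, NI) follow the paper.
def harmony_search(f, lower, upper, hms=10, hmcr=0.9, par=0.3, bw=0.2,
                   ni=2000, seed=1):
    rng = random.Random(seed)
    n = len(lower)
    # Step 2: harmony memory initialization and evaluation
    hm = [[lower[j] + rng.random() * (upper[j] - lower[j]) for j in range(n)]
          for _ in range(hms)]
    costs = [f(x) for x in hm]
    for _ in range(ni):                                  # Step 5: run NI improvisations
        new = []
        for j in range(n):                               # Step 3: improvisation
            if rng.random() < hmcr:                      # memory consideration
                xj = hm[rng.randrange(hms)][j]
                if rng.random() < par:                   # pitch adjustment
                    xj += bw * (2.0 * rng.random() - 1.0)
                    xj = min(max(xj, lower[j]), upper[j])
            else:                                        # random selection
                xj = lower[j] + rng.random() * (upper[j] - lower[j])
            new.append(xj)
        c = f(new)
        worst = max(range(hms), key=costs.__getitem__)
        if c < costs[worst]:                             # Step 4: HM update
            hm[worst], costs[worst] = new, c
    best = min(range(hms), key=costs.__getitem__)
    return hm[best], costs[best]

sphere = lambda x: sum(v * v for v in x)                 # assumed test function
best, val = harmony_search(sphere, [-5.0, -5.0], [5.0, 5.0])
```

On this 2-variable sphere function, the best vector in HM approaches the origin as the improvisations proceed.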
3. Improvements on convergence of HS
Some work has been performed to improve the convergence of HS. Mahdavi et al. [4] proposed a new variant of HS, called the improved harmony search (IHS). IHS dynamically updates PAR according to (2):

PAR(t) = PAR_min + ((PAR_max − PAR_min) / NI) · t,   (2)

where PAR(t) is the pitch adjusting rate for iteration t, PAR_min is the minimum adjusting rate, PAR_max is the maximum adjusting rate, NI is the number of improvisations, and t is the iteration number.
In addition, bw is dynamically updated as in (3):

bw(t) = bw_max · exp( (ln(bw_min / bw_max) / NI) · t ),   (3)

where bw(t) is the bandwidth for iteration t, bw_min is the minimum bandwidth, and bw_max is the maximum bandwidth.
In the IHS algorithm, four extra parameters (PAR_min, PAR_max, bw_min, and bw_max) should be adjusted for different problems, and it is not easy to initialize these terms. Selecting the best values is very difficult and may itself be another optimization problem.
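For illustration, the IHS schedules (2) and (3) can be coded directly; the parameter defaults mirror Table 3 and are otherwise assumptions, not values fixed by [4]:

```python
import math

def ihs_par(t, ni, par_min=0.10, par_max=0.99):
    # PAR(t) = PAR_min + ((PAR_max - PAR_min) / NI) * t   (Eq. (2))
    return par_min + (par_max - par_min) / ni * t

def ihs_bw(t, ni, bw_min=0.10, bw_max=2.0):
    # bw(t) = bw_max * exp((ln(bw_min / bw_max) / NI) * t)   (Eq. (3))
    return bw_max * math.exp(math.log(bw_min / bw_max) / ni * t)
```

PAR grows linearly from PAR_min to PAR_max over the NI improvisations, while bw decays exponentially from bw_max down to bw_min.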
Another work, called global-best harmony search (GHS), is inspired by the concept of particle swarm optimization (PSO) [5]. It modifies the pitch adjustment step of HS such that the new harmony can mimic the best harmony in the HM. Thus, this approach replaces the bw parameter altogether and adds a social dimension to HS [5]. GHS has exactly the same steps as IHS, with the exception of the new harmony improvisation step, as depicted in Fig. 9.
However, another research result has shown that this PSO feature works well only for small-sized problems [22]. According to the results obtained in that work, HS outperformed PSO-HS, GA, SA, and SA + TS on a large-scale real-world problem with 454 variables.
In [11], a more rational PAR function using the idea of simulated annealing was proposed, which increases the robustness of the algorithm and therefore leads to a highly reliable algorithm. Their PAR function is:

PAR(t) = PAR_max − ((PAR_max − PAR_min) / NI) · t.   (4)

In [12], some improvements on the convergence of HS have also been made.


Fig. 9. New harmony improvisation step of GHS.

4. Chaotic harmony search algorithms


In simulating complex phenomena, sampling, numerical analysis, decision making, and especially heuristic optimization need random sequences with a long period and good uniformity [23]. Chaos is a deterministic, random-like process found in nonlinear dynamical systems, which is non-periodic, non-converging, and bounded. Moreover, it has a very sensitive dependence on its initial conditions and parameters [19,23]. The nature of chaos is apparently random and unpredictable, yet it also possesses an element of regularity. Mathematically, chaos is randomness of a simple deterministic dynamical system, and chaotic systems may be considered sources of randomness [19,23].
A chaotic map is a discrete-time dynamical system

x_{k+1} = f(x_k),   0 < x_k < 1,   k = 0, 1, 2, ...,   (5)

running in a chaotic state. The chaotic sequence

{x_k : k = 0, 1, 2, ...}

can be used as a spread-spectrum sequence or as a random number sequence.


Chaotic sequences have been proven easy and fast to generate and store; there is no need to store long sequences [24]. Merely a few functions (chaotic maps) and a few parameters (initial conditions) are needed even for very long sequences. In addition, an enormous number of different sequences can be generated simply by changing the initial condition. Moreover, these sequences are deterministic and reproducible.
Recently, chaotic sequences have been adopted instead of random sequences, and very interesting and somewhat good results have been reported in many applications such as secure transmission [25,26], nonlinear circuits [27], DNA computing [28], and image processing [29]. The choice of chaotic sequences is justified theoretically by their unpredictability, i.e., by their spread-spectrum characteristic and ergodic properties.
One of the drawbacks of HS is its premature convergence, especially when handling problems with many local optima [30,31].
The classical HS algorithm uses fixed values for HMCR, PAR, and bw. These values, which are the key factors affecting the convergence of HS, are set in the initialization step (Step 1) and cannot be changed during later iterations. The main drawback of this method appears in the number of iterations needed to find an optimal solution.
Small PAR values with large bw values can cause poor performance of the algorithm, with too many iterations needed to find the optimum solution. Small bw values in the final iterations increase the fine-tuning of solution vectors by local exploitation, while in the early iterations a larger bw value can increase the diversity of solution vectors for global exploration. Furthermore, large PAR values with small bw values usually improve the best solutions in the final iterations, by which point the algorithm has converged to the optimal solution vector. With these considerations, IHS has been proposed [4].
However, for example, a decreased bw value is liable to trap the algorithm in local optima and slows the convergence speed when it is near a minimum. Furthermore, the HMCR parameter and the random initialization of HM may affect the convergence speed. In fact, however, these parameters cannot entirely ensure the ergodicity of the optimization in phase space, because they are random in classical HS. That is why these parameters may be selected chaotically by using chaotic maps. In this paper, sequences generated from chaotic systems substitute for random numbers in the HS parameters wherever a random-based choice is necessary. In this way, it is intended to improve the global convergence and to prevent sticking at a local solution.
This paper provides new approaches, introducing chaotic maps with ergodicity, irregularity, and the stochastic property into HS to improve the global convergence by escaping local solutions. The use of chaotic sequences in HS can be helpful to


escape more easily from local minima than is possible with the classical HS. When a random number is needed by the classical HS algorithm, it is generated by iterating one step of the chosen chaotic map, which has been started from a random initial condition at the first iteration of HS. The chaotic maps selected for the experiments are listed in the following subsections.
4.1. Used chaotic maps
The chaotic maps used in the experiments to generate chaotic sequences in the HS steps are listed below:
4.1.1. Logistic map
The logistic map, whose equation is given in Eq. (6), was brought to the attention of scientists by Sir Robert May in 1976 [32]. It appears in the nonlinear dynamics of biological populations evidencing chaotic behavior:

X_{n+1} = a · X_n · (1 − X_n).   (6)

In this equation, X_n is the nth chaotic number, where n denotes the iteration number. Obviously, X_n ∈ (0, 1) under the conditions that the initial X_0 ∈ (0, 1) and that X_0 ∉ {0.0, 0.25, 0.5, 0.75, 1.0}. a = 4 has been used in the experiments.
4.1.2. Tent map
The tent map [33] resembles the logistic map. It generates chaotic sequences in (0, 1), assuming the following form:

X_{n+1} = X_n / 0.7,                X_n < 0.7,
X_{n+1} = (10/3) · X_n · (1 − X_n), otherwise.   (7)

4.1.3. Sinusoidal iterator


The third chaotic sequence generator used in this paper is the so-called sinusoidal iterator [32], represented by

X_{n+1} = a · X_n² · sin(π · X_n).   (8)

When a = 2.3 and X_0 = 0.7, it has the simplified form

X_{n+1} = sin(π · X_n).   (9)

It generates a chaotic sequence in (0, 1).


4.1.4. Gauss map
The Gauss map is used for testing purposes in the literature [33] and is represented by:

X_{n+1} = 0,               X_n = 0,
X_{n+1} = (1/X_n) mod 1,   X_n ∈ (0, 1),   (10)

(1/X_n) mod 1 = 1/X_n − ⌊1/X_n⌋,   (11)

where ⌊z⌋ denotes the largest integer not exceeding z and acts as a shift on the continued-fraction representation of numbers. This map also generates chaotic sequences in (0, 1).
4.1.5. Circle map
The circle map [34] is represented by:

X_{n+1} = (X_n + b − (a/2π) · sin(2π · X_n)) mod 1.   (12)

With a = 0.5 and b = 0.2, it generates a chaotic sequence in (0, 1).


4.1.6. Sinus map
The sinus map is defined as follows:

X_{n+1} = 2.3 · X_n^(2 · sin(π · X_n)).   (13)

4.1.7. Henon map
The Henon map is a nonlinear 2-dimensional map most frequently employed for testing purposes. It is represented by:

X_{n+1} = 1 − a · X_n² + b · Y_n,   (14)
Y_{n+1} = X_n.   (15)


It is sometimes written as a 2-step recurrence relation:

X_{n+1} = 1 − a · X_n² + b · X_{n−1}.   (16)

The suggested parameter values are a = 1.4 and b = 0.3.
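The seven maps of Eqs. (6)–(16) can be collected as one-step update functions. This is an illustrative sketch; the parameter defaults follow the values stated above:

```python
import math

def logistic(x, a=4.0):                     # Eq. (6)
    return a * x * (1.0 - x)

def tent(x):                                # Eq. (7)
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * x * (1.0 - x)

def sinusoidal(x):                          # simplified form, Eq. (9)
    return math.sin(math.pi * x)

def gauss(x):                               # Eqs. (10)-(11)
    return 0.0 if x == 0.0 else (1.0 / x) % 1.0

def circle(x, a=0.5, b=0.2):                # Eq. (12)
    return (x + b - (a / (2.0 * math.pi)) * math.sin(2.0 * math.pi * x)) % 1.0

def sinus(x):                               # Eq. (13), as written in the text
    return 2.3 * x ** (2.0 * math.sin(math.pi * x))

def henon(x, y, a=1.4, b=0.3):              # Eqs. (14)-(15); returns (X_{n+1}, Y_{n+1})
    return 1.0 - a * x * x + b * y, x
```

Iterating any of these from a suitable X_0 yields the chaotic number stream that replaces the pseudorandom draws in the CHS variants below.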


4.2. Proposed chaotic harmony search algorithms
The new chaotic harmony search (CHS) algorithms may be simply classified and described as follows:
4.2.1. CHS1
The initial HM is generated by iterating the selected chaotic maps until the HMS is reached, as shown in Fig. 10.
4.2.2. CHS2
In this algorithm, the PAR value is not fixed in the algorithm parameter initialization step of HS; instead, it is modified by the selected chaotic map as follows:

PAR(t + 1) = f(PAR(t)),   0 < PAR(t) < 1,   t = 0, 1, 2, ...,   (17)

where f(·) is a selected chaotic map beginning with a value that takes the constraints into account.
4.2.3. CHS3
In this algorithm, the bw value is not fixed in the algorithm parameter initialization step of HS; instead, it is modified by the selected chaotic map as follows:

bw(t + 1) = f(bw(t)),   0 < bw(t) < 1,   t = 0, 1, 2, ...,   (18)

where f(·) is a selected chaotic map beginning with a value that takes the constraints into account.
4.2.4. CHS4
In this algorithm, the PAR and bw values are not fixed in HS; instead, they are modified by the selected chaotic maps as follows:

PAR(t + 1) = f(PAR(t)),   0 < PAR(t) < 1,   t = 0, 1, 2, ...,
bw(t + 1) = f(bw(t)),     0 < bw(t) < 1,    t = 0, 1, 2, ....   (19)

4.2.5. CHS5
CHS1 and CHS2 are combined; that is, the initial HM is generated by iterating the selected chaotic maps, and the PAR value is modified by the selected chaotic maps when needed.
4.2.6. CHS6
CHS1 and CHS3 are combined; that is, the initial HM is generated by iterating the selected chaotic maps, and the bw value is modified by the selected chaotic maps when needed.

Fig. 10. Pseudo-code of CHS1.


4.2.7. CHS7
CHS1, CHS2, and CHS3 are combined. In this approach:
• HM is generated by iterating the selected chaotic maps.
• The PAR value is modified by the selected chaotic maps.
• The bw value is modified by the selected chaotic maps.
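As an illustrative sketch of the CHS2–CHS4 idea (not the author's code), the chaotic updates (17)–(19) simply replace the fixed PAR and/or bw with iterates of a chosen map; the logistic map and the initial values below are assumptions:

```python
def logistic(x, a=4.0):
    return a * x * (1.0 - x)

def chaotic_schedule(x0, steps, chaotic_map=logistic):
    # Produce the parameter value used at each improvisation t = 1..steps,
    # per PAR(t+1) = f(PAR(t)) / bw(t+1) = f(bw(t)).
    seq, x = [], x0
    for _ in range(steps):
        x = chaotic_map(x)
        seq.append(x)
    return seq

par_values = chaotic_schedule(0.31, 60)   # chaotic PAR stream (CHS2)
bw_values = chaotic_schedule(0.27, 60)    # chaotic bw stream (CHS3)
```

In CHS4, both streams are consumed together; in CHS5–CHS7, the same iteration also seeds the initial HM as in CHS1.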
5. Test problems
Well-defined benchmark functions based on mathematical functions can be used as objective functions to measure and test the performance of optimization methods. The nature, complexity, and other properties of these benchmark functions can easily be obtained from their definitions. The difficulty level of most benchmark functions is adjustable by setting their parameters. From the standard set of benchmark problems available in the literature, two important functions which are multi-modal (containing many local optima, but only one global optimum) are considered to test the efficacy of the proposed methods. Table 1 shows the main properties of the selected benchmark functions used in the experiments.
6. Experimental results
The two selected benchmark problems are solved by simulating the HS, IHS, and GHS algorithms. Two criteria are applied to terminate the simulation of the algorithms: reaching the maximum number of iterations, which is set to a constant number, and obtaining a minimum error.
All HMs were initialized in regions that include the global optimum, for a fair evaluation. The algorithms were run 100 times to capture their stochastic properties. In this experiment, the maximum iteration number was set to 500, and the goal is not to find the global optimum values but to find out the potential of the algorithms. The algorithm success rate defined in Eq. (20) has been used for comparison of the results obtained from the different HS algorithms.

S = 100 · (NT_successful / NT_all) |_{Q_level}.   (20)
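The success-rate computation of Eq. (20) can be sketched with a hypothetical helper (not the paper's code):

```python
def success_rate(nt_successful, nt_all):
    # S = 100 * (NT_successful / NT_all), evaluated at a given Q_level (Eq. (20))
    return 100.0 * nt_successful / nt_all

s = success_rate(44, 100)   # e.g. 44 successful runs out of 100 at Q_level = 1.e-5
```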

Table 1
Properties of test problems; lb indicates lower bound, ub indicates upper bound, opt indicates optimum point.

Function no.  Function name  Definition                                                     lb      ub     opt  Property
f1            Griewangk      f1(x) = Σ_{i=1..N} x_i²/4000 − Π_{i=1..N} cos(x_i/√i) + 1      −50     50     0    Multi-modal
f2            Rastrigin      f2(x) = 10·N + Σ_{i=1..N} (x_i² − 10·cos(2π·x_i))              −5.12   5.12   0    Multi-modal

Table 2
The parameters used for the HS and GHS algorithms.

        HMS   HMCR   PAR   bw
HS      40    0.9    0.8   0.2
GHS     40    0.9    0.8   0.2

Table 3
The parameters used for the IHS algorithm.

HMS   HMCR   PARmin   PARmax   bwmin   bwmax
40    0.9    0.10     0.99     0.10    2.0

Table 4
Success rates of HS algorithms for the Rastrigin function (N = 2).

Q_level   HS   IHS   GHS
1.e−5     44   51    47
1.e−6     10   14    11


NT_successful is the number of trials which found the solution at the Q_level within the allowable maximum number of iterations. NT_all is the number of all trials. Q_level is the end condition to stop the algorithm, i.e. the algorithm stops when it converges to within the Q_level tolerance.
The parameters used for HS and GHS are shown in Table 2, while the parameters used for IHS are shown in Table 3. Table 4 depicts the success rates of the HS algorithms for the Rastrigin function for N = 2. Success rates of the CHS algorithms
Table 5
Success rates of CHS algorithms using different chaotic maps for the Rastrigin function (N = 2).

Q_level   CHS1   CHS2   CHS3   CHS4   CHS5   CHS6   CHS7
Logistic map
1.e−5     56     48     66     79     48     99     59
1.e−6     18     10     21     53     13     52     41
Tent map
1.e−5     49     58     54     52     54     55     84
1.e−6     15     29     19     12     22     16     44
Sinusoidal iterator
1.e−5     48     53     87     52     52     99     54
1.e−6     15     21     28     18     29     80     17
Gauss map
1.e−5     45     47     55     55     47     55     60
1.e−6     10     13     18     35     14     39     33
Circle map
1.e−5     52     71     59     57     75     64     84
1.e−6     16     33     19     37     28     37     41
Sinus map
1.e−5     52     74     71     57     64     52     62
1.e−6     19     31     41     45     24     23     37
Henon map
1.e−5     50     49     52     52     47     52     52
1.e−6     19     14     31     45     12     26     30

Table 6
Success rates of HS algorithms for the Griewangk function (N = 2).

Q_level   HS   IHS   GHS
1.e−5     62   68    68
1.e−6     55   62    56

Table 7
Success rates of CHS algorithms using different chaotic maps for the Griewangk function (N = 2).

Q_level   CHS1   CHS2   CHS3   CHS4   CHS5   CHS6   CHS7
Logistic map
1.e−5     64     64     75     72     67     84     78
1.e−6     62     62     67     64     64     74     68
Tent map
1.e−5     67     72     91     71     60     73     71
1.e−6     62     69     72     65     56     64     64
Sinusoidal iterator
1.e−5     63     62     91     74     60     86     73
1.e−6     60     58     71     69     48     66     67
Gauss map
1.e−5     66     55     76     63     65     75     75
1.e−6     60     51     72     57     63     69     69
Circle map
1.e−5     84     77     74     71     88     93     72
1.e−6     84     67     58     66     78     85     63
Sinus map
1.e−5     84     73     81     68     80     80     64
1.e−6     59     63     63     62     70     68     62
Henon map
1.e−5     72     65     79     62     63     78     73
1.e−6     69     62     66     60     60     68     67


using different chaotic maps for the Rastrigin function with N = 2 are shown in Table 5. The CHS algorithms have shown somewhat better performance than the other HS algorithms, and the CHS3, CHS4, and CHS6 algorithms perform better than the others. From the results demonstrated in these tables, it can be concluded that adjusting the bw values with chaotic maps has improved the convergence properties of the HS algorithm.
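The chaotic adjustment of bw can be sketched as follows: a chaotic sequence (here the logistic map, one of the maps tested) drives the bandwidth inside the standard HS improvisation step instead of a fixed value. This is an illustrative reconstruction of the idea, not the paper's exact CHS implementation; the function names and update order are assumptions:

```python
import random

def logistic(x, r=4.0):
    """Logistic map x_{k+1} = r * x_k * (1 - x_k); chaotic on (0, 1) for r = 4."""
    return r * x * (1.0 - x)

def improvise(harmony_memory, hmcr, par, bw_chaos, lb, ub):
    """One HS improvisation step in which the bandwidth is taken from a
    chaotic sequence (the CHS idea, sketched) rather than kept constant."""
    n = len(harmony_memory[0])
    new_harmony = []
    for i in range(n):
        bw_chaos = logistic(bw_chaos)          # next chaotic value in (0, 1)
        if random.random() < hmcr:             # memory consideration
            value = random.choice(harmony_memory)[i]
            if random.random() < par:          # pitch adjustment, chaotic bw
                value += bw_chaos * (ub - lb) * random.uniform(-1.0, 1.0)
        else:                                  # random selection
            value = random.uniform(lb, ub)
        new_harmony.append(min(max(value, lb), ub))  # clip to the search range
    return new_harmony, bw_chaos
```

The chaotic state is threaded through the calls so that consecutive improvisations continue the same orbit rather than restarting it.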
Table 6 depicts the success rates of the HS algorithms for the second benchmark function, the Griewangk function, with N = 2. The same parameter values selected for the Rastrigin function have also been used for this function. The success rates of the CHS algorithms using different chaotic maps for the Griewangk function with N = 2 are shown in Table 7. As with the Rastrigin function, the CHS algorithms have shown somewhat better performance than the other HS algorithms, and the CHS3, CHS4, and CHS6 algorithms perform better than the others.
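For reference, the two benchmarks can be written out in their standard forms (assuming the usual definitions, with global minimum 0 at the origin):

```python
import math

def rastrigin(x):
    """Multi-modal; usually searched on [-5.12, 5.12]^N."""
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def griewangk(x):
    """Multi-modal; usually searched on [-600, 600]^N."""
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return s - p + 1.0

print(rastrigin([0.0, 0.0]), griewangk([0.0, 0.0]))  # prints: 0.0 0.0
```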
7. Conclusions
In this paper, different chaotic maps have been embedded into the music-inspired HS algorithm to adapt its parameters. This has been done by using chaotic number generators each time a random number is needed by the classical HS algorithm. Seven new chaotic HS algorithms have been proposed, and different chaotic maps have been analyzed on the benchmark functions. It has been observed that coupling emergent results from different areas, such as HS and complex dynamics, can improve the quality of results in some optimization problems, and also that chaos may be a desired process, as in real music. It has also been shown that these methods, especially the CHS3, CHS4, and CHS6 algorithms, have somewhat increased the solution quality; that is, in some cases they improved the global search capability by escaping local solutions. These proposed methods are new, and more elaborate experiments may be performed with parallel or distributed implementations.
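As a reference for reproducing such experiments, several of the one-dimensional chaotic maps named in Tables 5 and 7 can be iterated as simple number generators; the parameter values below are common choices from the chaos literature (assumptions), not necessarily the exact constants used in the paper:

```python
import math

def logistic(x):
    """Logistic map, chaotic at r = 4."""
    return 4.0 * x * (1.0 - x)

def tent(x):
    """Tent map with slope 2; maps [0, 1] onto itself."""
    return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

def gauss_map(x):
    """Gauss map: fractional part of 1/x (0 maps to 0 by convention)."""
    return 0.0 if x == 0.0 else (1.0 / x) % 1.0

def circle(x, a=0.5, b=0.2):
    """Circle map, kept in [0, 1) by the mod-1 operation."""
    return (x + b - (a / (2.0 * math.pi)) * math.sin(2.0 * math.pi * x)) % 1.0

def orbit(f, x0, n):
    """First n iterates of map f starting from seed x0."""
    seq = []
    for _ in range(n):
        x0 = f(x0)
        seq.append(x0)
    return seq

print(orbit(logistic, 0.7, 3))  # first three iterates of the logistic orbit
```

In a CHS-style algorithm, such an orbit is sampled in place of a uniform random generator, with the seed chosen away from the map's fixed points.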
References
[1] A. Baykasoglu, L. Ozbakir, P. Tapkan, Artificial bee colony algorithm and its application to generalized assignment problem, in: Felix T.S. Chan, Tiwari Manoj Kumar (Eds.), Chapter 8 of Swarm Intelligence: Focus on Ant and Particle Swarm Optimization, Itech Education and Publishing, 2007. p. 532.
[2] Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2) (2001) 60–68.
[3] K.S. Lee, Z.W. Geem, A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice, Comput. Meth. Appl. Mech. Eng. 194 (2005) 3902–3933.
[4] M. Mahdavi, M. Fesanghary, E. Damangir, An improved harmony search algorithm for solving optimization problems, Appl. Math. Comput. 188 (2007) 1567–1579.
[5] M.G.H. Omran, M. Mahdavi, Global-best harmony search, Appl. Math. Comput. 198 (2008) 643–656.
[6] M. Mahdavi, M. Haghir, C.H. Abolhassani, R. Forsati, Novel meta-heuristic algorithms for clustering web documents, Appl. Math. Comput. 201 (2008) 441–451.
[7] Z.W. Geem, Novel derivative of harmony search algorithm for discrete design variables, Appl. Math. Comput. 199 (1) (2008) 223–230.
[8] X.Z. Gao, X. Wang, S.J. Ovaska, Uni-modal and multi-modal optimization using modified harmony search methods, Int. J. Innov. Comput. Inform. Control 5 (10(A)) (2009) 2985–2996.
[9] Z.W. Geem, Global optimization using harmony search: theoretical foundations and applications, in: A. Abraham, A.E. Hassanien, P. Siarry, A. Engelbrecht (Eds.), Foundations of Computational Intelligence, vol. 3, Springer, 2009, pp. 57–73.
[10] Z.W. Geem, Improved harmony search from ensemble of music players, Lect. Notes Artif. Int. 4251 (2006) 86–93.
[11] N. Taherinejad, Highly reliable harmony search algorithm, Eur. Conf. Circuit Theory Des. (2009) 818–822.
[12] Z.W. Geem, W.E. Roper, Various continuous harmony search algorithms for web-based hydrologic parameter optimization, Int. J. Math. Model. Numer. Optim. 1 (3) (2010) 213–226.
[13] J.A. Maurer, The influence of chaos on computer-generated music, <http://ccrma.stanford.edu/blackrse/chaos.html>, 1999 (accessed 04.03.2009).
[14] M. Suneel, Chaotic sequences for secure CDMA, Ramanujan Inst. Adv. Study Math. (2006) 1–4.
[15] H. Gao, Y. Zhang, S. Liang, D.A. Li, New chaotic algorithm for image encryption, Chaos Solitons Fractals 29 (2006) 393–399.
[16] P. Arena, R. Caponetto, L. Fortuna, A. Rizzo, M. La Rosa, Self organization in non recurrent complex system, Int. J. Bifur. Chaos 10 (5) (2000) 1115–1125.
[17] B. Alatas, E. Akin, B. Ozer, Chaos embedded particle swarm optimization algorithms, Chaos Solitons Fractals 40 (4) (2009) 1715–1734.
[18] R. Caponetto, L. Fortuna, S. Fazzino, M.G. Xibilia, Chaotic sequences to improve the performance of evolutionary algorithms, IEEE Trans. Evol. Comput. 7 (3) (2003) 289–304.
[19] B. Alatas, Chaotic bee colony algorithms for global numerical optimization, Expert Syst. Appl. (2010). <http://dx.doi.org/10.1016/j.eswa.2010.02.042>.
[20] Z.W. Geem, Recent Advances in Harmony Search Algorithm, Springer, Berlin, 2010.
[21] Z.W. Geem, J.H. Kim, G.V. Loganathan, Harmony search optimization: application to pipe network design, Int. J. Model. Simul. 22 (2) (2002) 125–133.
[22] Z.W. Geem, Particle-swarm harmony search for water network design, Eng. Optim. 41 (4) (2009) 297–311.
[23] H.G. Schuster, Deterministic Chaos: An Introduction, 2nd Revised Ed., Physik-Verlag GmbH, D-6940 Weinheim, Federal Republic of Germany, 1988.
[24] G. Heidari-Bateni, C.D. McGillem, A chaotic direct-sequence spread spectrum communication system, IEEE Trans. Commun. 42 (24) (1994) 1524–1527.
[25] K. Wong, K.P. Man, S. Li, X. Liao, More secure chaotic cryptographic scheme based on dynamic look-up table, Circuits Syst. Signal Process. 24 (5) (2005) 571–584.
[26] M. Suneel, Chaotic sequences for secure CDMA, Ramanujan Inst. Adv. Study Math. (2006) 1–4.
[27] P. Arena, R. Caponetto, L. Fortuna, A. Rizzo, M. La Rosa, Self organization in non recurrent complex system, Int. J. Bifur. Chaos 10 (5) (2000) 1115–1125.
[28] G. Manganaro, G.J. de Pineda, DNA computing based on chaos, in: Proc. IEEE International Conference on Evolutionary Computation, IEEE Press, Piscataway, NJ, 1997, pp. 255–260.
[29] F. Han, J. Hu, X. Yu, Y. Wang, Fingerprint images encryption via multi-scroll chaotic attractors, Appl. Math. Comput. 185 (2) (2007) 931–939.
[30] P. Chakraborty, G.G. Roy, S. Das, D. Jain, A. Abraham, An improved harmony search algorithm with differential mutation operator, Fundamenta Informaticae 95 (4) (2009) 401–426.
[31] S.O. Degertekin, Optimum design of steel frames via harmony search algorithm, in: Z.W. Geem (Ed.), Harmony Search Algorithms for Structural Design Optimization, Springer, 2009, pp. 51–78.
[32] R.M. May, Simple mathematical models with very complicated dynamics, Nature 261 (1976) 459.
[33] H. Peitgen, H. Jurgens, D. Saupe, Chaos and Fractals, Springer-Verlag, Berlin, Germany, 1992.
[34] W.M. Zheng, Kneading plane of the circle map, Chaos Solitons Fractals 4 (1994) 1221.
