2008 IEEE Conference on Soft Computing in Industrial Applications (SMCia/08), June 25-27, 2008, Muroran, JAPAN

Sharing Evolution Genetic Algorithm for Global Numerical Optimization

Sheng-Ta Hsieh, Tsung-Ying Sun and Chan-Cheng Liu

Abstract: In this paper, a sharing evolution genetic algorithm (SEGA) is proposed to solve global numerical optimization problems. In the SEGA, three strategies are proposed, namely a population manager, sharing cross-over and sharing mutation, for effectively increasing the solution-searching ability of newborn offspring. Experiments were conducted on CEC-05 benchmark problems, which include unimodal, multimodal, expanded, and hybrid composition functions. The results show that the proposed method exhibits better performance on these benchmark problems than recent variants of the genetic algorithm.

Index Terms: numerical optimization, population manager, sharing crossover, sharing evolution genetic algorithm (SEGA), sharing mutation, survival rate.

I. INTRODUCTION

As more and more real-world optimization problems become increasingly complex, algorithms with stronger optimization capabilities are in increasing demand. Unconstrained optimization problems can be formulated as an N-dimensional minimization problem as follows:

Min f(x), x = [x1, x2, ..., xN]

where N is the number of parameters to be optimized.


The genetic algorithm (GA) moves from one population of chromosomes to a new population by employing a principle similar to Darwin's natural selection together with the genetics-inspired operators of selection, cross-over, mutation, and inversion. The basic principles of GA were first introduced by John Holland in 1975 [1]-[2]. Holland's GA was the first evolutionary computation (EC) paradigm developed and applied.
The fitter chromosomes are more likely to be selected for reproduction. The objective of cross-over is to randomly choose loci and exchange the subparts of two chromosomes to create offspring. Mutation randomly flips the allele values of some locations in the chromosome, and inversion reverses the order of a contiguous section of the chromosome [3]. The term chromosome typically refers to a candidate solution to a problem. Through the genetic evolution process, genetic algorithms can search for solutions efficiently without derivative information; an optimal solution will be represented by the final winning chromosome in the genetic competition.
Many researchers have devoted themselves to developing new operations that enhance the optimization capacity of the original GA. Several strategies for the cross-over operation, such as two-point, multi-point, and uniform cross-over [4], have been proposed to improve the efficiency of binary-coded GA. Extrapolation, interpolation [5], and multi-cross-over [6] were proposed to enhance the performance of real-coded GA. Mutation is one of the most significant and promising areas of investigation in evolutionary computation, since it can prevent the GA from falling into local minima. Hong and Wang proposed a dynamic mutation scheme to tune the rates of different types of mutation [7]. Zhang et al. proposed a mutation GA with adaptive probability, using k-means clustering and a fuzzy-based system to update that probability [8]. Leung and Wang proposed OGA/Q [9], which utilizes the characteristics of orthogonal matrices and quantizes the search space into several subspaces for generating offspring. A hybrid Taguchi genetic algorithm (HTGA) [10] was proposed by Tsai et al.; the HTGA efficiently generates better offspring and performs better than OGA/Q. Recently, Liu et al. proposed a fuzzy-based method to generate offspring [11].
Different from other computational intelligence methods, such as fuzzy theory and artificial neural networks, GA has the ability to avoid purely local search and can increase the probability of finding the global best. It has been successfully applied in machine learning [2][12], numerical optimization [13]-[15], signal processing [16]-[19], and other fields.
Although numerous variants of GA have been empirically shown to perform well on many optimization problems over the last three decades, premature convergence when solving complex problems is still a prominent issue in GA. In order to drive the population efficiently and improve GA's performance on complex multimodal problems, the sharing evolution genetic algorithm (SEGA) is proposed. The SEGA includes a population manager and two sharing strategies, sharing cross-over and sharing mutation, for enhancing the solution-searching and convergence abilities.
II. GENETIC ALGORITHM
The traditional GA (TGA) has the following features: (1) a bit-string representation, (2) proportional selection, (3) cross-over as the primary method to produce new individuals, (4) mutation as a disturbance to the evolution that keeps solutions from being confined to a local search, and (5) elitism policies. In this section, a brief introduction to the genetic algorithm is given.
A. Chromosome Representation
Consider a problem presented as f(x1, x2, ..., xN) which consists of N tunable parameters to be optimized. In GA, a candidate solution is encoded as a vector (i.e., a chromosome) Cm = [x1, x2, ..., xN]m, m = 1, 2, ..., p, where p denotes the population size. For high-dimensional or complex problems, GA requires a larger population to distribute the chromosomes uniformly over the search space; otherwise it may be unable to explore all possible solutions. The value of p is usually chosen experimentally.

B. Initial Population
For most optimization techniques, the final solutions are usually restricted by the initialization. GA is able to overcome this drawback with the cross-over and mutation operations, so the chromosomes can simply be scattered over an area in the first generation. Initialization generates p chromosomes distributed uniformly over the search space.

C. Cross-over
The cross-over operation produces new chromosomes (offspring) by choosing two random parent chromosomes, but it does not guarantee that all offspring are better than their parents. However, balancing exploration and exploitation while performing cross-over tends to obtain good results, because offspring are generated around the better parents. The number of individuals that join the cross-over is based on a predefined parameter rc, called the cross-over rate; thus round(p * rc) individuals (parents) join the cross-over.

D. Mutation
The mutation operator randomly changes some subparts of a chromosome. While GA is learning, the chromosomes move toward the optimal solution nearest to them, which may not be the global optimum. Therefore, disturbances that extend the search range are quite important. In general, a mutated offspring Om is generated randomly inside the search space as

Om = β  (1)

where β denotes a mutation vector whose components are uniformly distributed over the search space. The number of parents that join the mutation is based on a predefined parameter rm, called the mutation rate; thus round(p * rm) individuals (parents) join the mutation.

In general, the fitness of a mutated offspring may be better or worse than that of its parents and/or any cross-over offspring. On the other hand, adopting the mutation operation extends the search range in order to explore unsearched areas of the search space and find potential optimal solutions.

E. Selection
After the cross-over and mutation operations, the population, now containing both parents and offspring, is larger than at initialization. In order to keep the better offspring, the elitism operation selects the p better chromosomes, which survive into the next generation.

The GA optimization combines the operations mentioned above and repeats the evolution process until predefined termination conditions are reached.

III. SHARING EVOLUTION GENETIC ALGORITHM

In genetic algorithms, new offspring are produced by the cross-over and mutation operations. After selection, the new population consists of the selected better chromosomes, called the parents of the next generation, which keep generating new offspring to carry on the solution search. The quality of the generated offspring greatly affects GA's solution-exploration ability. If the newborn offspring, which usually carry inferior information, are eliminated by selection, the solution search may slow down or even come to a standstill. It is therefore an important issue in GA to give offspring generated by the cross-over or mutation operation a higher survival rate: a higher offspring survival rate means that more (superior) solutions are found in each generation, while a lower one decreases GA's searching ability. Considering these concerns, this paper proposes the sharing evolution genetic algorithm (SEGA), which can be separated into three parts: population manager, sharing cross-over and sharing mutation. The following sections describe the three elements in detail.

A. Chromosomes' Evolution Behavior
A population in GA consists of a number of chromosomes, each representing a potential solution of the optimization task. All of the chromosomes iteratively discover probable solutions. In the cross-over operation, each chromosome is generated based on its parents. This combines the information of two chromosomes, which were themselves generated by chromosomes in previous generations and evaluated by the cost function; finally, the better chromosomes are kept in the population. If a chromosome discovers a new probable solution, its offspring will move closer to it during cross-over to explore the region more completely.
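The per-generation cycle of the traditional GA described in Section II, on which SEGA builds, can be sketched as follows. This is a minimal illustration with our own parameter choices and a sphere-function example, not the authors' implementation:

```python
import random

def tga_step(population, fitness, rc=0.7, rm=0.1, bounds=(-100.0, 100.0)):
    """One generation of a traditional GA: one-cut-point cross-over,
    uniform random mutation, then elitism selection back to size p."""
    p = len(population)
    n = len(population[0])
    offspring = []
    # Cross-over: round(p * rc) parents exchange subparts at a random cut point.
    for _ in range(round(p * rc) // 2):
        a, b = random.sample(population, 2)
        cut = random.randint(1, n - 1)
        offspring.append(a[:cut] + b[cut:])
        offspring.append(b[:cut] + a[cut:])
    # Mutation (Eq. (1)): round(p * rm) offspring drawn uniformly from the space.
    for _ in range(round(p * rm)):
        offspring.append([random.uniform(*bounds) for _ in range(n)])
    # Elitism: keep the p best of parents plus offspring (minimization).
    return sorted(population + offspring, key=fitness)[:p]

# Usage: minimize the sphere function in 5 dimensions.
sphere = lambda x: sum(v * v for v in x)
pop = [[random.uniform(-100, 100) for _ in range(5)] for _ in range(20)]
for _ in range(200):
    pop = tga_step(pop, sphere)
best = min(pop, key=sphere)
```

Because of elitism, the best fitness in the population never worsens from one generation to the next.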
B. Population Manager
The first step in applying GA to a particular problem is to decide several of GA's initial parameters, such as the population size. More chromosomes can extend the searching coverage and increase the probability of finding the global optimal solution, but they require more time in each generation. The problem is that, until now, there has been no way to know what population size is suitable for the problem at hand.

Here, a population manager (PM) is introduced into GA to enhance its searching ability. The population manager increases or decreases the population size according to the solution-searching status; the population size in the proposed GA is therefore variable. If the chromosomes cannot find a better solution to update the global best solution (gbest) in several consecutive generations, they may be trapped in a local minimum during the evolving process, or they may need a competent guide to lead them toward a potential area. In that case, the information (experience) of the existing chromosomes may be too little to handle the current solution search. To keep gbest updated with better solutions, new potential chromosomes should be added to the population to share their up-to-date information and speed up the solution search. In order to avoid an unlimited increase in chromosomes, a boundary for the population size should be predefined.

New chromosomes might be eliminated instantly because they are born with a poor solution, which would counteract the purpose of the PM. To prevent this situation, the value p should be set to more than 1, which ensures that newborn chromosomes live for more than one generation. To prevent preemptive removal of newborn chromosomes while securing a higher sensitivity for the PM, the p value is set as 2 in this paper.
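The population-manager rule described above can be sketched as follows. The structure and names (the stall and improvement counters, the helper for fresh chromosomes) are our own reading of the text, not the authors' code:

```python
import random

def manage_population(population, fitness, stall, improve_run,
                      n, bounds, k=2, max_ps=20):
    """Sketch of the PM rule: 'stall' counts consecutive generations without
    a gbest update, 'improve_run' counts consecutive generations with one.
    Returns the adjusted population (minimization assumed)."""
    new_chrom = lambda: [random.uniform(*bounds) for _ in range(n)]
    if stall >= k:
        if len(population) < max_ps:
            # Stagnation with room left: add two fresh chromosomes that
            # share up-to-date information with the population.
            population = population + [new_chrom(), new_chrom()]
        else:
            # Stagnation at the size limit: evict the two worst chromosomes
            # to make room for new potential ones.
            population = sorted(population, key=fitness)[:-2]
            population = population + [new_chrom(), new_chrom()]
    elif improve_run >= k and len(population) > 2:
        # Sustained improvement: drop the two worst and search with
        # fewer chromosomes in the following generations.
        population = sorted(population, key=fitness)[:-2]
    return population
```

In SEGA the threshold k is set as 2, matching the activating threshold reported in Section III.E.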

C. Cross-over Operations
In the generic cross-over operation, each chromosome is generated based on its parents. This combines the information of two chromosomes, which were themselves generated by chromosomes in a previous generation and evaluated by the cost function; finally, the better chromosomes are kept in the population. If a chromosome discovers a new probable solution, its offspring will move closer to it during cross-over to explore the region more completely.

In order to improve the efficiency of the cross-over, a sharing cross-over (SC) is introduced to prevent some dimensions of chromosomes from falling into a local optimum. There are two cross-over operations in SEGA. The main cross-over operation is the one-cut-point cross-over, and the activation of the SC is decided by the sharing rate Rs. Because different sharing probabilities affect the results for the same problem, the same value should not be used for all chromosomes in the population. The following expression defines the sharing rate of each chromosome:

Rs_i = 0.5 * ((N - 1) / N) * exp(-(i - 1) / (p - 1))  (2)

where N denotes the dimension of the problem and p is the population size. Each chromosome has its own Rs value, ranging from 0.5 down toward 0.

A unique sharing probability is assigned to each chromosome. Before the cross-over of the ith parent, a random number is generated. If this number is larger than or equal to Rs_i, the new offspring is generated entirely by cross-over. If it is smaller than Rs_i, the new offspring is generated by sharing cross-over, which is performed by the following steps:
1) Randomly generate k integers d1, d2, ..., dk such that 1 <= d1 < d2 < ... < dk <= N for chromosome i, and randomly generate another k integers d'1, d'2, ..., d'k such that 1 <= d'1 < d'2 < ... < d'k <= N for chromosome j, where 1 <= k <= 0.5N.
2) Perform cross-over on the corresponding dimensions of chromosomes i and j to generate the new offspring.
The SC effectively allows experience exchange between different dimensions of chromosomes.

D. Sharing Mutation
In general, mutation is adopted to generate new chromosomes by mutating one or more genes, to rescue chromosomes that have fallen into a local minimum by means of a random process, and to explore other potential search spaces. It allows more potential solutions to be produced in an area during the solution exploration and finds unsearched solution space. Such a process is reasonable and efficient. In order to improve the efficiency of mutation, the sharing mutation (SM) is introduced for exploring the solution space. The proposed sharing mutation comes in two versions, local sharing and global sharing, each activated with probability 0.5. The main difference between them is the dimension selection.

For the local version of sharing mutation, one dimension is picked randomly; the mutating chromosome's corresponding dimension is perturbed and restricted to the solution range of this dimension over all chromosomes. This ignores the other dimensions but can fine-tune the solutions of specific dimensions one by one in the chromosome.

The same principle applies to the global version of sharing mutation: two different dimensions d1 and d2 are picked randomly; the current chromosome's dimension d1 is perturbed and restricted to the solution boundary of dimension d2 over all chromosomes. The global version of sharing mutation can prevent the solutions of a particular dimension from being trapped in a local optimum.

Whether the local or the global version of sharing mutation is used, the other dimensions are ignored, but the solutions of the chosen dimensions can be fine-tuned one by one in the chromosomes.

E. Sharing Evolution Genetic Algorithm
After initializing a new population and evaluating its fitness values, new offspring are generated by either one-cut-point cross-over or sharing cross-over; which cross-over strategy is chosen depends on Rs. For each chromosome i, a random number is generated; if this number is larger than Rs_i, the SEGA performs one-cut-point cross-over, otherwise it performs sharing cross-over. After that, offspring are generated by either local or global sharing mutation, each activated with probability 0.5. The fitness evaluations are then updated to pick the p better chromosomes to survive into the next generation. If no better solution is found in k consecutive generations and the population size has not reached the maximum population size max_ps, two new chromosomes join the population in the following generation. If better solutions are found in k consecutive generations and the population holds more than two chromosomes, the two worst chromosomes are eliminated from the population; the population thus drives fewer chromosomes for solution searching in the following generations. Otherwise, if no better solution is found in k consecutive generations and the population size has reached the maximum, the two worst chromosomes are eliminated from the population to make room for new potential chromosomes. In this paper, the activating threshold k of the population manager was set as 2.
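The two-step sharing cross-over above can be sketched as follows. The paper does not spell out the per-dimension recombination operator, so the arithmetic mean used here and the pairing of the two index lists are our own assumptions:

```python
import random

def sharing_crossover(parent_i, parent_j):
    """Sketch of the sharing cross-over (SC): pick k random dimension
    indices independently for each parent and cross the corresponding
    dimensions (pairing rule and mean recombination are assumptions)."""
    n = len(parent_i)
    k = random.randint(1, max(1, n // 2))        # 1 <= k <= 0.5N
    dims_i = sorted(random.sample(range(n), k))  # d1 < d2 < ... < dk
    dims_j = sorted(random.sample(range(n), k))  # d'1 < d'2 < ... < d'k
    child = list(parent_i)
    for di, dj in zip(dims_i, dims_j):
        # Gene di of parent i is recombined with gene dj of parent j,
        # allowing experience exchange between different dimensions.
        child[di] = 0.5 * (parent_i[di] + parent_j[dj])
    return child
```

Because the index lists are drawn independently, a gene can be recombined with a gene from a different dimension of the other parent, which is what distinguishes SC from ordinary positionwise cross-over.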

IV. EXPERIMENTS

A. Test Functions
To test the proposed SEGA and compare it with other GA approaches, several test functions from the CEC 2005 benchmark [20] were chosen, covering four groups: unimodal, multimodal, expanded, and hybrid composition functions. The test functions are listed as follows.

Unimodal Functions
1) Shifted Sphere Function

f1 = Σ_{i=1..N} z_i^2 + f_bias1,  z = x - o, x = [x1, x2, ..., xN]  (3)

2) Shifted Schwefel Problem 1.2 with Noise in Fitness

f2 = ( Σ_{i=1..N} ( Σ_{j=1..i} z_j )^2 ) * (1 + 0.4|N(0,1)|) + f_bias4,  z = x - o, x = [x1, x2, ..., xN]  (4)

Multimodal Functions
3) Shifted Rotated Rastrigin Function

f3 = Σ_{i=1..N} ( z_i^2 - 10 cos(2π z_i) + 10 ) + f_bias10,  z = (x - o) * M, x = [x1, x2, ..., xN]  (5)

where M is the linear transformation matrix, condition number = 2.

4) Shifted Rotated Weierstrass Function

f4 = Σ_{i=1..N} Σ_{k=0..kmax} [ a^k cos(2π b^k (z_i + 0.5)) ] - N Σ_{k=0..kmax} [ a^k cos(2π b^k * 0.5) ] + f_bias11  (6)

a = 0.5, b = 3, kmax = 20, z = (x - o) * M, x = [x1, x2, ..., xN]

where M is the linear transformation matrix, condition number = 5.

5) Schwefel Problem 2.13

f5 = Σ_{i=1..N} ( A_i - B_i(x) )^2 + f_bias12,  x = [x1, x2, ..., xN]  (7)

A_i = Σ_{j=1..N} ( a_ij sin α_j + b_ij cos α_j ),  B_i(x) = Σ_{j=1..N} ( a_ij sin x_j + b_ij cos x_j ), for i = 1, ..., N

where A and B are two N-by-N matrices, a_ij and b_ij are random integers in the range [-100, 100], α = [α1, α2, ..., αN], and the α_j are random numbers in the range [-π, π].

Expanded Functions
6) Shifted Expanded Griewank plus Rosenbrock Function

Griewank function:  fa = Σ_{i=1..N} x_i^2 / 4000 - Π_{i=1..N} cos( x_i / sqrt(i) ) + 1

Rosenbrock function:  fb = Σ_{i=1..N-1} [ 100 ( x_i^2 - x_{i+1} )^2 + ( x_i - 1 )^2 ]

f6 = fa(fb(z1, z2)) + fa(fb(z2, z3)) + ... + fa(fb(z_{N-1}, z_N)) + fa(fb(z_N, z1)) + f_bias13  (8)

z = x - o + 1, x = [x1, x2, ..., xN]

7) Shifted Rotated Expanded Scaffer F6 Function

f(x, y) = 0.5 + ( sin^2( sqrt(x^2 + y^2) ) - 0.5 ) / ( 1 + 0.001 (x^2 + y^2) )^2

f7 = f(z1, z2) + f(z2, z3) + ... + f(z_{N-1}, z_N) + f(z_N, z1) + f_bias14  (9)

z = (x - o) * M, x = [x1, x2, ..., xN]

where M is the linear transformation matrix, condition number = 3.

Hybrid Composition Functions
8) Rotated Hybrid Composition Function 3 [20]

f8 is composed of ten different rotated benchmark functions: two rotated expanded Scaffer F6 functions, two Rastrigin functions, two expanded Griewank plus Rosenbrock functions, two Weierstrass functions and two Griewank functions.

9) Rotated Hybrid Composition Function 4 [20]

f9 is composed of ten different benchmark functions: the Weierstrass function, a rotated expanded Scaffer F6 function, the expanded Griewank plus Rosenbrock function, the Ackley function, the Rastrigin function, the Griewank function, a non-continuous expanded Scaffer F6 function, a non-continuous Rastrigin function, a high conditioned elliptic function and a sphere function with noise in fitness.

For all functions, o = [o1, o2, ..., oN] is used to shift the global optimum, x* = o, f_biasn = fn(x*), and n is the function number.

B. Parameter Settings and Initialization
The experiments compared three GA approaches, including the proposed SEGA, on the nine test functions with 30 dimensions. The three GA approaches and their parameters were as follows:
- HTGA [10]: the cross-over rate is 0.1 and the mutation rate is 0.2.
- OGA/Q [9]: the cross-over rate is 0.1 and the mutation rate is 0.02.
- SEGA: the cross-over rate is 0.7 and the mutation rate varies linearly from 0.1 to 0.3.
The parameters of HTGA and OGA/Q follow their original settings [9][10]. In the experiments on 30-D problems, the maximum number of fitness evaluations (FEs) was set as 150000. For example, running a GA for 100 generations (including population initialization) with a population size of 50, where each generation generates 50 offspring, consumes a total of 5000 FEs.

Each benchmark problem was run 30 times, and the mean values and standard deviations of the results were calculated. The population size of all GA approaches was set as 20, except for SEGA, whose initial population size is set as two with a maximum population size of 20, because the population manager of SEGA adjusts the population size according to the current status of the solution search. The global optimum, search range, initialization range and function bias of each test function are presented in Table I.

TABLE I
GLOBAL OPTIMUM, SEARCH RANGE, INITIALIZATION RANGE AND FUNCTION BIAS OF THE TEST FUNCTIONS

f    Global Optimum   Initialization Range   Search Range     Function Bias
f1   -450             [-100, 100]^N          [-100, 100]^N    -450
f2   -450             [-100, 100]^N          [-100, 100]^N    -450
f3   -330             [-5, 5]^N              [-5, 5]^N        -330
f4   90               [-0.5, 0.5]^N          [-0.5, 0.5]^N    90
f5   -460             [-100, 100]^N          [-100, 100]^N    -460
f6   -130             [-3, 1]^N              [-3, 1]^N        -130
f7   -300             [-100, 100]^N          [-100, 100]^N    -300
f8   360              [-5, 5]^N              [-5, 5]^N        360
f9   260              [-5, 5]^N              [-5, 5]^N        260
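The shifted functions summarized in Table I share a common pattern: shift the input by the vector o, evaluate a base function, and add the bias, so that f(x*) equals the function bias at x* = o. A small sketch of f1 and f6 (the bias values follow Table I; the shift vector here is random, not the official CEC 2005 data file):

```python
import math
import random

def shifted_sphere(x, o, bias=-450.0):
    """f1: z = x - o, f1 = sum(z_i^2) + f_bias1."""
    return sum((xi - oi) ** 2 for xi, oi in zip(x, o)) + bias

def expanded_griewank_rosenbrock(x, o, bias=-130.0):
    """f6 sketch: z = x - o + 1, then Griewank-of-Rosenbrock over the
    cyclic pairs (z1, z2), ..., (zN, z1)."""
    z = [xi - oi + 1.0 for xi, oi in zip(x, o)]

    def fb(a, b):  # 2-D Rosenbrock
        return 100.0 * (a * a - b) ** 2 + (a - 1.0) ** 2

    def fa(y):     # 1-D Griewank
        return y * y / 4000.0 - math.cos(y) + 1.0

    pairs = zip(z, z[1:] + z[:1])
    return sum(fa(fb(a, b)) for a, b in pairs) + bias

# At the shifted optimum x* = o the value equals the function bias.
o = [random.uniform(-100, 100) for _ in range(30)]
print(shifted_sphere(o, o))  # -450.0
```

At x = o, z is the all-ones vector for f6, every 2-D Rosenbrock term is zero, and the Griewank of zero is zero, so f6(o) = f_bias13 = -130.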

TABLE II
RESULTS OF 30-D PROBLEMS

Functions   Proposed Method            HTGA                       OGA/Q
f1          -4.49e+002 ± 1.34e-002     -4.41e+002 ± 4.92e+000     6.92e+004 ± 2.26e-001
f2          1.94e+004 ± 7.87e+003      3.15e+004 ± 3.30e+003      6.58e+004 ± 9.34e+003
f3          -2.73e+001 ± 9.60e+001     2.65e+001 ± 6.75e+001      1.95e+000 ± 3.34e+001
f4          1.24e+002 ± 3.42e+000      1.24e+002 ± 2.24e+000      1.33e+002 ± 1.48e+000
f5          2.01e+004 ± 1.37e+004      2.87e+004 ± 1.57e+004      9.34e+005 ± 2.27e+005
f6          -1.29e+002 ± 2.13e-001     -1.28e+002 ± 3.82e-001     -9.78e+001 ± 4.92e+000
f7          -2.87e+002 ± 2.40e-001     -2.87e+002 ± 2.97e-001     -2.86e+002 ± 2.85e-001
f8          1.14e+003 ± 3.40e+002      1.71e+003 ± 3.04e+002      1.75e+003 ± 8.01e+000
f9          8.17e+002 ± 5.06e+002      9.86e+002 ± 5.24e+002      1.49e+003 ± 1.92e+001

C. Experimental Results
The results of 30 runs of the three GA approaches on the nine test functions with 30-D problems are presented in Table II. The best results among the three approaches are shown in bold. From the results it can be observed that, for the group of unimodal problems (functions 1 and 2), SEGA showed better results than OGA/Q and HTGA. For the other three groups, the SEGA surpasses the other algorithms on functions 3, 5, 6, 8 and 9, and achieved results very similar to those of the HTGA on functions 4 and 7. The HTGA also performed well on complex problems. The proposed method can successfully prevent solutions from falling into deep local minima far from the global optimum. The SEGA performs the best on most test functions: it largely avoids chromosomes becoming trapped in local minima and can find better solutions than the other algorithms within the same number of FEs.
V. CONCLUSIONS
In this paper, the SEGA has been presented to solve global numerical optimization problems. The proposed population manager strategy adjusts the population size according to the current solution-searching state to ensure that better (potential) chromosomes join the evolution of GA. Also, the sharing cross-over and sharing mutation significantly increase the generation of potential offspring and improve the chromosomes' searching abilities, aiding the search for the global optimal solution. They also make GA more robust, prevent chromosomes from falling into local minima, and drive chromosomes more efficiently. Nine test functions with 30 dimensions, selected from the CEC 2005 benchmarks, were used in the experiments. The experiments show that the proposed SEGA gets closer to the optimal solutions and is more efficient than HTGA and OGA/Q on the problems studied.

REFERENCES
[1] J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, 1975.
[2] L. B. Booker, D. E. Goldberg, and J. H. Holland, "Classifier Systems and Genetic Algorithms," Technical Report No. 8, University of Michigan, 1978.
[3] M. Mitchell, An Introduction to Genetic Algorithms, MIT Press, 1996.
[4] G. Syswerda, "Uniform cross-over in genetic algorithms," in Proc. 3rd International Conference on Genetic Algorithms, pp. 2-9, 1989.
[5] Z. Michalewicz, T. Logan, and S. Swaminathan, "Evolutionary operations for continuous convex parameter spaces," in Proc. 3rd Annual Conference on Evolutionary Programming, pp. 84-97, 1994.
[6] W. D. Chang, "Application of a novel multi-cross-over genetic algorithm to FIR filters estimation," in Proc. Joint Conference on AI, Fuzzy System, and Gray System, pp. A066:1-6, Taipei, Taiwan, Dec. 2003.
[7] T. P. Hong and H. S. Wang, "A dynamic mutation genetic algorithm," in Proc. IEEE International Conference on Systems, Man, and Cybernetics, vol. 3, pp. 2000-2005, 1996.
[8] J. Zhang, H. S. H. Chung, and B. J. Hu, "Adaptive probabilities of cross-over and mutation in genetic algorithms based on clustering technique," in Proc. CEC2004 Congress on Evolutionary Computation, vol. 2, pp. 2280-2287, 2004.
[9] Y. W. Leung and Y. Wang, "An orthogonal genetic algorithm with quantization for global numerical optimization," IEEE Trans. Evolutionary Computation, vol. 5, pp. 41-53, Feb. 2001.
[10] J. T. Tsai, T. K. Liu, and J. H. Chou, "Hybrid Taguchi-genetic algorithm for global numerical optimization," IEEE Trans. Evolutionary Computation, vol. 8, issue 4, pp. 365-377, 2004.
[11] H. Liu, Z. Xu and A. Abraham, "Hybrid fuzzy-genetic algorithm approach for crew grouping," in Proc. 5th International Conference on Intelligent Systems Design and Applications (ISDA '05), pp. 332-337, 2005.
[12] D. E. Goldberg, Genetic Algorithms in Search, Optimization & Machine Learning, Addison Wesley, 1989.
[13] Z. Tu and Y. Lu, "A robust stochastic genetic algorithm (StGA) for global numerical optimization," IEEE Trans. Evolutionary Computation, vol. 8, issue 5, pp. 456-470, 2004.
[14] K. A. De Jong, "Adaptive system design: a genetic approach," IEEE Trans. Systems, Man and Cybernetics, vol. 10, pp. 566-574, 1980.
[15] H. Muhlenbein, M. Schomisch, and J. Born, "The parallel genetic algorithm as function optimizer," Parallel Computing, vol. 17, pp. 619-632, 1991.
[16] T. Guo and C. Mu, "Blind separation of instantaneous mixed Gaussian sources via genetic algorithms," in Proc. 4th World Congress on Intelligent Control and Automation, vol. 3, pp. 1849-1852, 2002.
[17] Y. Yue and J. Mao, "Blind separation of sources based on genetic algorithm," in Proc. 4th World Congress on Intelligent Control and Automation, vol. 3, pp. 2099-2103, 2002.
[18] M. Li and J. Mao, "A new algorithm of evolutional blind source separation based on genetic algorithm," in Proc. WCICA 5th World Congress on Intelligent Control and Automation, vol. 3, pp. 2240-2244, 2004.
[19] P. Zheng, Y. Liu, L. Tian, and Y. Cao, "A blind source separation method based on diagonalization of correlation matrices and genetic algorithm," in Proc. WCICA 5th World Congress on Intelligent Control and Automation, vol. 3, pp. 2127-2131, 2004.
[20] http://www.ntu.edu.sg/home/EPNSugan/index_files/CEC-05/CEC05.htm

