
GENETIC ALGORITHM

INTRODUCTION:
A genetic algorithm (GA) is a search heuristic that mimics the process of natural
evolution. This heuristic is routinely used to generate useful solutions to optimization and
search problems. Genetic algorithms belong to the larger class of evolutionary algorithms
(EA), which generate solutions to optimization problems using techniques inspired by
natural evolution, such as inheritance, mutation, selection, and crossover.
The basic concept of GAs is to simulate the processes in natural systems necessary
for evolution, specifically those that follow the principle of survival of the fittest first
laid down by Charles Darwin. As such, they represent an intelligent exploitation of a
random search within a defined search space to solve a problem.
First pioneered by John Holland in the 1960s, genetic algorithms have been widely studied,
experimented with, and applied in many engineering fields. Not only do GAs provide an
alternative method of solving problems, they often outperform traditional methods on the
problems to which they are applied. Many real-world problems involve finding optimal
parameters, which can prove difficult for traditional methods but is well suited to GAs.

BRIEF OVERVIEW:
GAs were introduced as a computational analogy of adaptive systems. They are modelled
loosely on the principles of evolution via natural selection, employing a population of
individuals that undergo selection in the presence of variation-inducing operators such as
mutation and recombination (crossover). A fitness function is used to evaluate
individuals, and reproductive success varies with fitness.
The algorithm:
• Randomly generate an initial population M(0)
• Compute and save the fitness u(m) for each individual m in the current
population M(t)
• Define selection probabilities p(m) for each individual m in M(t) so that p(m) is
proportional to u(m)
• Generate M(t+1) by probabilistically selecting individuals from M(t) to produce
offspring via genetic operators
• Repeat from step 2 until a satisfactory solution is obtained.

The paradigm of GAs described above is the one usually applied to most of the problems
presented to GAs. Though it might not find the best solution, more often than not it will
come up with a near-optimal solution.
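
As an illustration, here is a minimal runnable sketch of this loop in Python. The toy problem (maximising the number of ones in a bit string), the parameter values and the use of fitness-proportional selection via random.choices are all illustrative choices, not prescribed by the algorithm itself.

import random

def run_ga(fitness, length=20, pop_size=30, generations=100, mutation_rate=0.02):
    # Step 1: randomly generate the initial population M(0) of bit strings.
    population = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Step 2: compute the fitness u(m) of every individual m in M(t).
        scores = [fitness(m) for m in population]
        # Step 3: selection probabilities p(m) proportional to u(m)
        # (a tiny offset avoids a zero total weight).
        parents = random.choices(population, weights=[s + 1e-9 for s in scores], k=pop_size)
        # Step 4: generate M(t+1) via single-point crossover and bit-flip mutation.
        next_population = []
        for a, b in zip(parents[::2], parents[1::2]):
            point = random.randint(1, length - 1)
            for child in (a[:point] + b[point:], b[:point] + a[point:]):
                next_population.append([1 - g if random.random() < mutation_rate else g
                                        for g in child])
        population = next_population
    return max(population, key=fitness)

# Toy usage: sum() counts the ones, so the optimum is the all-ones string.
best = run_ga(fitness=sum)
print(best, sum(best))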
METHODOLOGY:
In a genetic algorithm, a population of strings (called chromosomes or the genotype of
the genome), which encode candidate solutions (called individuals, creatures, or
phenotypes) to an optimization problem, evolves toward better solutions. Traditionally,
solutions are represented in binary as strings of 0s and 1s, but other encodings are also
possible. The evolution usually starts from a population of randomly generated
individuals and happens in generations. In each generation, the fitness of every individual
in the population is evaluated, multiple individuals are stochastically selected from the
current population (based on their fitness), and modified (recombined and possibly
randomly mutated) to form a new population. The new population is then used in the next
iteration of the algorithm. Commonly, the algorithm terminates when either a maximum
number of generations has been produced, or a satisfactory fitness level has been reached
for the population. If the algorithm has terminated due to a maximum number of
generations, a satisfactory solution may or may not have been reached.
Genetic algorithms find application in bioinformatics, phylogenetics, computational
science, engineering, economics, chemistry, manufacturing, mathematics, physics and
other fields.
A typical genetic algorithm requires:
• a genetic representation of the solution domain,
• a fitness function to evaluate the solution domain.
A standard representation of the solution is as an array of bits. Arrays of other types and
structures can be used in essentially the same way. The main property that makes these
genetic representations convenient is that their parts are easily aligned due to their fixed
size, which facilitates simple crossover operations. Variable length representations may
also be used, but crossover implementation is more complex in this case. Tree-like
representations are explored in genetic programming and graph-form representations are
explored in evolutionary programming.
The fitness function is defined over the genetic representation and measures the quality of
the represented solution. The fitness function is always problem dependent. For instance,
in the knapsack problem one wants to maximize the total value of objects that can be put
in a knapsack of some fixed capacity. A representation of a solution might be an array of
bits, where each bit represents a different object, and the value of the bit (0 or 1)
represents whether or not the object is in the knapsack. Not every such representation is
valid, as the size of objects may exceed the capacity of the knapsack. The fitness of the
solution is the sum of values of all objects in the knapsack if the representation is valid,
or 0 otherwise. In some problems, it is hard or even impossible to define the fitness
expression; in these cases, interactive genetic algorithms are used.
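
By way of example, a minimal sketch of such a knapsack fitness function; the item weights, values and capacity below are made-up numbers used purely for illustration.

# Hypothetical item data: (weight, value) pairs, plus a knapsack capacity.
ITEMS = [(12, 4), (2, 2), (1, 2), (1, 1), (4, 10)]
CAPACITY = 15

def knapsack_fitness(bits):
    # bits[i] == 1 means item i is packed into the knapsack.
    weight = sum(w for bit, (w, _) in zip(bits, ITEMS) if bit)
    value = sum(v for bit, (_, v) in zip(bits, ITEMS) if bit)
    # Overweight (invalid) representations score 0, as described above.
    return value if weight <= CAPACITY else 0

print(knapsack_fitness([0, 1, 1, 1, 1]))  # total weight 8 <= 15, so fitness 15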
Once the genetic representation and the fitness function are defined, a GA proceeds to
initialize a population of solutions randomly, and then improves it through repeated
application of the mutation, crossover, inversion and selection operators.

SELECTION:
During each successive generation, a proportion of the existing population is selected to
breed a new generation. Individual solutions are selected through a fitness-based process,
where fitter solutions (as measured by a fitness function) are typically more likely to be
selected. Certain selection methods rate the fitness of each solution and preferentially
select the best solutions. Other methods rate only a random sample of the population,
since evaluating every solution can be very time-consuming.
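
As one concrete possibility, a sketch of fitness-proportional (roulette-wheel) selection over the whole population, assuming non-negative fitness values:

import random

def roulette_select(population, fitnesses, k):
    # Draw k individuals with probability proportional to their fitness.
    # If every fitness is zero, fall back to uniform random selection.
    if sum(fitnesses) == 0:
        return random.choices(population, k=k)
    return random.choices(population, weights=fitnesses, k=k)

Tournament or rank-based selection could be substituted here without changing the rest of the algorithm.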

REPRODUCTION:
The next step is to generate a second generation population of solutions from those
selected through genetic operators: crossover (also called recombination), and/or
mutation.
For each new solution to be produced, a pair of "parent" solutions is selected for breeding
from the pool selected previously. By producing a "child" solution using the above
methods of crossover and mutation, a new solution is created which typically shares
many of the characteristics of its "parents". New parents are selected for each new child,
and the process continues until a new population of solutions of appropriate size is
generated. Although reproduction methods based on two parents are more "biology
inspired", some research suggests that using more than two "parents" can produce
higher-quality chromosomes.
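
A minimal sketch of these two operators on bit-string parents; single-point crossover and a 1% per-bit mutation rate are illustrative choices rather than the only options.

import random

def single_point_crossover(parent_a, parent_b):
    # Cut both parents at the same random point and swap the tails.
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:], parent_b[:point] + parent_a[point:]

def mutate(individual, rate=0.01):
    # Flip each bit independently with a small probability.
    return [1 - bit if random.random() < rate else bit for bit in individual]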
These processes ultimately result in a next-generation population of chromosomes that
differs from the initial generation. Generally, the average fitness of the population will
have increased by this procedure, since mainly the best organisms from the first
generation are selected for breeding, along with a small proportion of less fit solutions,
which helps maintain genetic diversity.
Although Crossover and Mutation are known as the main genetic operators, it is possible
to use other operators such as regrouping, colonization-extinction, or migration in genetic
algorithms.

TERMINATION:
This generational process is repeated until a termination condition has been reached.
Common terminating conditions are:
• A solution is found that satisfies minimum criteria
• Fixed number of generations reached
• Allocated budget (computation time/money) reached
• The highest ranking solution's fitness is reaching or has reached a plateau such
that successive iterations no longer produce better results
• Manual inspection
• Combinations of the above
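
A sketch of how several of these conditions might be combined into one termination test; the generation limit, fitness target and plateau window are illustrative parameters, and a maximisation problem with a per-generation record of best fitness is assumed.

def should_terminate(generation, best_history, max_generations=200,
                     target_fitness=None, plateau_window=20):
    # Stop on a generation budget, a satisfactory fitness, or a fitness plateau.
    if generation >= max_generations:
        return True
    if target_fitness is not None and best_history and best_history[-1] >= target_fitness:
        return True
    if len(best_history) > plateau_window and \
            best_history[-1] <= best_history[-plateau_window - 1]:
        return True  # no improvement over the last plateau_window generations
    return False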
Simple generational genetic algorithm pseudocode:
1. Choose the initial population of individuals
2. Evaluate the fitness of each individual in that population
3. Repeat on this generation until termination (time limit, sufficient fitness achieved, etc.):
   1. Select the best-fit individuals for reproduction
   2. Breed new individuals through crossover and mutation operations to give birth to offspring
   3. Evaluate the individual fitness of the new individuals
   4. Replace the least-fit individuals of the population with the new individuals

APPLICATIONS:
The advantage of the GA approach is the ease with which it can handle arbitrary kinds of
constraints and objectives; all such things can be handled as weighted components of the
fitness function, making it easy to adapt a GA to the particular requirements of a very
wide range of possible overall objectives.
GAs have been used for problem-solving and for modelling. They have been applied to
many scientific and engineering problems, as well as in business and entertainment,
including:

• Optimization: GAs have been used in a wide variety of optimisation tasks,
including numerical optimisation, and combinatorial optimisation problems such
as the traveling salesman problem (TSP), circuit design [Louis 1993], job shop
scheduling [Goldstein 1991] and video & sound quality optimisation.

• Automatic Programming: GAs have been used to evolve computer programs for
specific tasks, and to design other computational structures, for example, cellular
automata and sorting networks.

• Machine and robot learning: GAs have been used for many machine-learning
applications, including classification and prediction, and protein structure
prediction. GAs have also been used to design neural networks, to evolve rules for
learning classifier systems or symbolic production systems, and to design and
control robots.

• Economic models: GAs have been used to model processes of innovation, the
development of bidding strategies, and the emergence of economic markets.

• Immune system models: GAs have been used to model various aspects of the
natural immune system, including somatic mutation during an individual's
lifetime and the discovery of multi-gene families during evolutionary time.

• Ecological models: GAs have been used to model ecological phenomena such as
biological arms races, host-parasite co-evolution, symbiosis and resource flow in
ecologies.

• Population genetics models: GAs have been used to study questions in population
genetics, such as "under what conditions will a gene for recombination be
evolutionarily viable?"

• Interactions between evolution and learning: GAs have been used to study how
individual learning and species evolution affect one another.
• Models of social systems: GAs have been used to study evolutionary aspects of
social systems, such as the evolution of cooperation [Chughtai 1995], the
evolution of communication, and trail-following behaviour in ants.

CONCLUSION:
If the idea of a computer algorithm based on the evolution of organisms is surprising,
the extent to which this algorithm is applied in so many areas is no less astonishing.
These applications, be they commercial, educational or scientific, are increasingly
dependent on this algorithm, the genetic algorithm. Its usefulness and elegance in
solving problems have made it a favoured choice over traditional methods such as
gradient search and random search. GAs are very helpful when the developer does not
have precise domain expertise, because GAs possess the ability to explore and learn
from their domain.
In this report, we have placed particular emphasis on explaining the use of GAs in many
areas of engineering and commerce. We believe that, by working through these examples,
one can grasp the idea of GAs with greater ease. We have also discussed the uncertainty
about whether computer-generated life could exist as a real life form. That discussion is
far from conclusive, and whether artificial life will become real life remains to be seen.
In the future, we expect to see variants of GAs tailored to very specific tasks. This might
seem to defy the very principle of GAs, namely that they remain ignorant of the problem
domain when used to solve a problem, but such specialisation could make GAs even
more powerful.
