
Advanced Optimization Techniques (M.Tech)

UNIT 5
Genetic Algorithms (GA): Differences and similarities between conventional and evolutionary algorithms, working principle, reproduction, crossover, mutation, termination criteria, different reproduction and crossover operators, GA for constrained optimization, drawbacks of GAs.

INTRODUCTION

The introduction of computers has been the most revolutionary development in the history of science and technology. This ongoing revolution is profoundly increasing our ability to predict and control nature in many ways. For many, the greatest achievement of this revolution will be the creation, in the form of computer programs, of new species of intelligent beings and even new forms of life. The earliest computer scientists - Alan Turing, John von Neumann, Norbert Wiener, and others - were as much interested in biology and psychology as in electronics, and they looked to natural systems for guidance in achieving their visions. From the earliest days, computers were applied not only to calculating missile trajectories and decoding military codes but also to modelling the brain, mimicking human learning, and simulating biological evolution. These biologically motivated computing activities have grown into three areas: (i) neural networks, (ii) machine learning, and (iii) evolutionary computation. Genetic Algorithms are the most prominent example of evolutionary computation.

GENETIC ALGORITHMS

A genetic algorithm (GA) is a search heuristic that mimics the process of natural evolution. This heuristic is routinely used to generate useful solutions to optimization and search problems. Genetic algorithms belong to the larger class of evolutionary algorithms (EA), which generate solutions to optimization problems using techniques inspired by natural evolution, such as inheritance, mutation, selection, and crossover. Genetic Algorithms were invented by John Holland in the 1960s and were developed by Holland and his students and colleagues at the University of Michigan in the 1960s and 1970s (i) to understand the adaptive processes of natural systems and (ii) to design artificial systems software that retains the robustness of natural systems. GAs provide efficient, effective techniques for discrete optimization and machine learning applications and are widely used today in business, scientific and engineering circles. Applications of Genetic Algorithms include aircraft design, communication networks, manufacturing, facility scheduling, resource allocation, design of gas pipelines, etc.


BIOLOGICAL BACKGROUND OF GENETIC ALGORITHMS

All living organisms consist of cells. In each cell there is the same set of chromosomes (strings of DNA) that serve as a blueprint for the whole organism. A chromosome consists of genes, blocks of DNA. Each gene encodes a trait, such as eye colour. Possible settings for a trait (e.g. blue, brown) are called alleles. Each gene occupies a particular position in the chromosome; this position is called its locus. The complete set of genetic material (all chromosomes) is called the genome. A particular set of genes in the genome is called a genotype. The genotype gives rise, through later development, to the organism's phenotype - its physical and mental characteristics, such as eye colour, height, brain size and intelligence. During reproduction, recombination (or crossover) occurs: in each parent, genes are exchanged between each pair of chromosomes to form a gamete (a single chromosome), and then gametes from the two parents pair up to create a full set of chromosomes. The offspring are also subject to mutation, in which elements of the DNA are slightly changed; these changes are mainly caused by errors in copying genes from the parents. The fitness of an organism is measured by the success of the organism in its life. In Genetic Algorithms, the term chromosome typically refers to a candidate solution to a problem, often encoded as a bit string; each locus in the chromosome then has two possible alleles, 0 and 1.

GENETIC ALGORITHM PROCEDURE

Step 1:
(i) Choose a coding (usually an l-bit string) to represent the problem parameters, a selection operator, a crossover operator, and a mutation operator.
(ii) Choose the population size n, crossover probability pc, and mutation probability pm.
(iii) Initialize a random population of n strings of length l.
(iv) Choose a maximum allowable generation number tmax.
(v) Set t = 0.

Step 2: Evaluate each string in the population.

Step 3: If t > tmax or another termination criterion is satisfied, terminate.




Step 4: Perform reproduction on the population.
Step 5: Perform crossover on random pairs of strings.
Step 6: Perform mutation on every string.
Step 7: Evaluate the strings in the new population. Set t = t + 1 and go to Step 3.
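The loop above can be sketched in Python as follows. The concrete operator choices here (roulette wheel selection, one-point crossover, bit-flip mutation) and the example fitness function are illustrative assumptions, not part of the procedure itself; they are discussed in more detail in the sections that follow.

import random

def genetic_algorithm(fitness, l=20, n=50, pc=0.8, pm=0.01, t_max=100):
    # Step 1: random initial population of n binary strings of length l
    pop = [[random.randint(0, 1) for _ in range(l)] for _ in range(n)]
    for t in range(t_max):                      # Step 3: stop after t_max generations
        scores = [fitness(s) for s in pop]      # Steps 2 and 7: evaluate every string
        total = sum(scores)
        # Step 4: reproduction (roulette wheel selection)
        pool = [pop[spin_wheel(scores, total)] for _ in range(n)]
        # Step 5: one-point crossover on random pairs with probability pc
        nxt = []
        for a, b in zip(pool[0::2], pool[1::2]):
            if random.random() < pc:
                k = random.randint(1, l - 1)
                a, b = a[:k] + b[k:], b[:k] + a[k:]
            nxt += [a, b]
        # Step 6: bit-flip mutation on every string with per-bit probability pm
        pop = [[1 - g if random.random() < pm else g for g in s] for s in nxt]
    return max(pop, key=fitness)

def spin_wheel(scores, total):
    # return an index with probability proportional to fitness
    r, cum = random.random() * total, 0.0
    for i, f in enumerate(scores):
        cum += f
        if cum >= r:
            return i
    return len(scores) - 1

# Usage: maximize the number of ones in a 20-bit string
best = genetic_algorithm(fitness=sum)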

GENETIC ALGORITHM SUMMARY TABLE

The above procedure is explained in more detail below.

Initialization
Initially, many individual solutions (n l-bit chromosomes) are randomly generated to form an initial population. The population size n depends on the nature of the problem, but it typically contains several hundred to several thousand possible solutions. Traditionally, the population is generated randomly, covering the entire range of possible solutions (the search space). A minimal sketch of this step is given below (the values of n and l are arbitrary illustrative choices).
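import random

def initialize_population(n, l):
    # n random l-bit chromosomes; each bit is 0 or 1 with equal probability
    return [[random.randint(0, 1) for _ in range(l)] for _ in range(n)]

population = initialize_population(n=100, l=16)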

Selection
During each successive generation, a proportion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, in which fitter solutions (as measured by a fitness function) are typically more likely to be selected.


Not all strings in the population are equally good in terms of their fitness values.

Reproduction
In this step, an operator called reproduction is used to select the fitter strings from the population based on their fitness values. Several reproduction schemes have been developed by various investigators.

(i) Proportionate Selection / Roulette Wheel Selection
Imagine a roulette wheel whose circumference is divided into segments, one for each string, with each segment proportionate to that string's fitness. Since the circumference of the wheel is marked according to a string's fitness, the roulette wheel is expected to make fi / f copies of the i-th string in the mating pool, where f is the average fitness of the population. The roulette wheel is spun n times, each time selecting the string indicated by the roulette wheel pointer.

To simulate the roulette wheel mechanism, the selection probability pi = fi / Σfj (the string's fitness divided by the total fitness of the population) is calculated for each string, and the cumulative probability is then computed for each string. To choose n strings, n random numbers between zero and one are generated; for each random number, the string whose cumulative probability range contains that number is copied into the mating pool. In this way, a string with a higher fitness value occupies a larger range of the cumulative probability scale and therefore has a higher probability of being copied into the mating pool.
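A sketch of this cumulative-probability scheme in Python (assuming non-negative fitness values; the function and variable names are illustrative):

import random

def roulette_wheel_selection(population, fitnesses, n):
    total = sum(fitnesses)
    # cumulative probabilities mark each string's share of the wheel
    cumulative, running = [], 0.0
    for f in fitnesses:
        running += f / total            # p_i = f_i / sum of all fitnesses
        cumulative.append(running)
    mating_pool = []
    for _ in range(n):                  # spin the wheel n times
        r = random.random()
        for string, c in zip(population, cumulative):
            if r <= c:
                mating_pool.append(string)
                break
        else:
            mating_pool.append(population[-1])   # guard against rounding error
    return mating_pool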


(ii) Ranking Selection
This method is similar to roulette wheel selection, except that the wheel is marked according to ranks rather than raw fitness values. The strings are ranked according to their fitness: the string with the lowest fitness is ranked 1, the string with the next higher fitness is ranked 2, and so on. The rank of the i-th string is denoted by ri. The fraction of the roulette wheel circumference occupied by each string is ri / Σrj. To simulate the roulette wheel mechanism, ri / Σrj is calculated for each string and the cumulative value is computed. To choose n strings, n random numbers between zero and one are generated, and for each random number the string whose cumulative range contains that number is copied into the mating pool. In this way, a string with a higher rank occupies a larger range of the cumulative scale and therefore has a higher probability of being copied into the mating pool.
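The wheel shares can be built from ranks instead of raw fitness values; a short sketch (names are illustrative, with rank 1 assigned to the least fit string):

def rank_probabilities(fitnesses):
    # rank 1 for the least fit string, rank N for the fittest
    order = sorted(range(len(fitnesses)), key=lambda i: fitnesses[i])
    ranks = [0] * len(fitnesses)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    total = sum(ranks)                     # r = 1 + 2 + ... + N
    return [r / total for r in ranks]      # wheel share r_i / r for each string

These shares are then used in place of pi in the cumulative roulette wheel procedure sketched above.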

(iii) Tournament Selection
A small number of strings, k (the tournament size, often k = 2), is picked at random from the population and the best of them in terms of fitness value is determined. The best string is copied into the mating pool, and all k strings are then returned to the population. Thus, in this scheme only one string is selected per tournament, and n tournaments are played to make the size of the mating pool equal to n. A good string therefore has a chance of being copied into the mating pool more than once.
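A sketch of this scheme (the default tournament size k = 2 is an assumption of the sketch):

import random

def tournament_selection(population, fitnesses, n, k=2):
    # play n tournaments; the winner of each is copied into the mating pool,
    # and all contenders are returned, so a good string can win more than once
    mating_pool = []
    for _ in range(n):
        contenders = random.sample(range(len(population)), k)
        winner = max(contenders, key=lambda i: fitnesses[i])
        mating_pool.append(population[winner])
    return mating_pool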

(iv) Elitism
Elitism is a method in which the best chromosome (or a few of the best chromosomes) is first copied unchanged into the new population; the rest of the population is generated in the usual way. Elitism can rapidly increase the performance of a GA, because it prevents the loss of the best solution found so far.
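Elitism can be layered on top of any of the above schemes; a minimal sketch (the number of elites e is a user choice assumed here):

def apply_elitism(old_population, old_fitnesses, new_population, e=1):
    # carry the e best strings of the old generation unchanged into the new one
    ranked = sorted(zip(old_fitnesses, old_population),
                    key=lambda pair: pair[0], reverse=True)
    elite = [string for _, string in ranked[:e]]
    return elite + new_population[e:]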

The next step is to generate a second generation population of solutions from those selected through genetic operators: crossover and/or mutation. For each new solution to be produced, a pair of "parent" solutions is selected for breeding from the pool selected previously.


By producing a "child" solution using the methods of crossover and mutation described below, a new solution is created which typically shares many of the characteristics of its "parents". New parents are selected for each new child, and the process continues until a new population of solutions of appropriate size is generated. These processes ultimately result in a next-generation population of chromosomes that is different from the initial generation.

Crossover
Crossover is a genetic operator that combines (mates) two chromosomes (parents) to produce a new chromosome (offspring). The idea behind crossover is that the new chromosome may be better than both of the parents if it takes the best characteristics from each of them. Crossover occurs during evolution according to a user-definable crossover probability.

One-Point Crossover
A crossover operator that randomly selects a crossover point within a chromosome and then interchanges the two parent chromosomes at this point to produce two new offspring. Consider the following two parents which have been selected for crossover; the | symbol indicates the randomly chosen crossover point.
Parent 1: 11001|010
Parent 2: 00100|111
After interchanging the parent chromosomes at the crossover point, the following offspring are produced:
Offspring 1: 11001|111
Offspring 2: 00100|010

Two-Point Crossover
A crossover operator that randomly selects two crossover points within a chromosome and then interchanges the two parent chromosomes between these points to produce two new offspring. Consider the following two parents which have been selected for crossover; the | symbols indicate the randomly chosen crossover points.
Parent 1: 110|010|10
Parent 2: 001|001|11
After interchanging the parent chromosomes between the crossover points, the following offspring are produced:
Offspring 1: 110|001|10
Offspring 2: 001|010|11
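Sketches of these two operators on bit strings (the parent strings from the examples above can be passed in directly):

import random

def one_point_crossover(parent1, parent2):
    k = random.randint(1, len(parent1) - 1)        # random crossover point
    return parent1[:k] + parent2[k:], parent2[:k] + parent1[k:]

def two_point_crossover(parent1, parent2):
    i, j = sorted(random.sample(range(1, len(parent1)), 2))   # two random points
    return (parent1[:i] + parent2[i:j] + parent1[j:],
            parent2[:i] + parent1[i:j] + parent2[j:])

# e.g. one_point_crossover("11001010", "00100111") may return
# ("11001111", "00100010") when the chosen point falls after the fifth bit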


Uniform Crossover
A crossover operator that decides (with some probability known as the mixing ratio) which parent will contribute each of the gene values in the offspring chromosomes. This allows the parent chromosomes to be mixed at the gene level rather than at the segment level (as with one-point and two-point crossover). For some problems, this additional flexibility outweighs the disadvantage of destroying building blocks. Consider the following two parents which have been selected for crossover:
Parent 1: 11001010
Parent 2: 00100111
If the mixing ratio is 0.5, approximately half of the genes in each offspring will come from Parent 1 and the other half from Parent 2. One possible pair of offspring after uniform crossover is:
Offspring 1: 10000011
Offspring 2: 01101110
where, at each position, one offspring receives the gene from Parent 1 and the other receives the gene from Parent 2.

Mutation
Mutation is a genetic operator that alters one or more gene values in a chromosome from its initial state. This can result in entirely new gene values being added to the gene pool. With these new gene values, the genetic algorithm may be able to arrive at a better solution than was previously possible. Mutation is an important part of the genetic search, as it helps to prevent the population from stagnating at a local optimum. Mutation occurs during evolution according to a user-definable mutation probability. This probability should usually be set fairly low (0.01 is a good first choice); if it is set too high, the search turns into a primitive random search.

Flip Bit: A mutation operator that simply inverts the value of the chosen gene (0 becomes 1 and 1 becomes 0). This mutation operator can only be used for binary genes.

Non-Uniform: A mutation operator that increases the probability that the amount of the mutation will be close to 0 as the generation number increases. This operator keeps the population from stagnating in the early stages of the evolution and then allows the genetic algorithm to fine-tune the solution in the later stages. This mutation operator can only be used for integer and float genes.

Uniform: A mutation operator that replaces the value of the chosen gene with a uniform random value selected between the user-specified upper and lower bounds for that gene. This mutation operator can only be used for integer and float genes.
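Sketches of the three mutation operators (the shrinking schedule used for the non-uniform operator is one common choice and is an assumption of the sketch, as are the bounds and parameter names):

import random

def flip_bit_mutation(chromosome, pm=0.01):
    # invert each binary gene with probability pm
    return [1 - g if random.random() < pm else g for g in chromosome]

def uniform_mutation(chromosome, bounds, pm=0.01):
    # replace a mutated gene with a uniform random value within its bounds
    return [random.uniform(lo, hi) if random.random() < pm else g
            for g, (lo, hi) in zip(chromosome, bounds)]

def non_uniform_mutation(chromosome, bounds, t, t_max, pm=0.01, b=2.0):
    # the allowed perturbation shrinks towards 0 as generation t approaches t_max
    mutated = []
    for g, (lo, hi) in zip(chromosome, bounds):
        if random.random() < pm:
            span = (hi - lo) * (1 - t / t_max) ** b
            g = min(hi, max(lo, g + random.uniform(-span, span)))
        mutated.append(g)
    return mutated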


Termination
This generational process is repeated until a termination condition has been reached. Common terminating conditions are:
(i) A solution is found that satisfies minimum criteria.
(ii) A fixed number of generations is reached.
(iii) The allocated budget (computation time or money) is reached.
(iv) The fitness of the highest-ranking solution has reached a plateau, such that successive iterations no longer produce better results.
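These conditions can be combined into a single test, as in the following sketch; the parameter names and threshold values are illustrative, and history is assumed to be the list of best fitness values recorded so far, one per generation.

import time

def should_terminate(t, t_max, best_fitness, target_fitness,
                     start_time, budget_seconds, history,
                     patience=20, tol=1e-6):
    if best_fitness >= target_fitness:                 # (i) minimum criteria satisfied
        return True
    if t >= t_max:                                     # (ii) generation limit reached
        return True
    if time.time() - start_time > budget_seconds:      # (iii) computation budget spent
        return True
    # (iv) plateau: negligible improvement over the last `patience` generations
    if len(history) > patience and history[-1] - history[-1 - patience] < tol:
        return True
    return False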

AN EXAMPLE
(Worked tables illustrating selection, crossover and mutation are not reproduced here.)

Differences between Genetic Algorithms and Conventional Optimization Algorithms
(1) Genetic Algorithms work with a coding of the parameter set, not the parameters themselves.
(2) Genetic Algorithms search from a population of points, not from a single point.
(3) Genetic Algorithms use payoff (objective function) information, not derivatives or other auxiliary information.
(4) Genetic Algorithms use probabilistic transition rules, not deterministic rules.

Similarities between Genetic Algorithms and Conventional Optimization Algorithms
(1) In the crossover operator (which is mainly responsible for the Genetic Algorithm search), two points are used to create two new points. This is similar to a gradient-based method, in which the search direction requires derivative information that is usually calculated using function values at two neighbouring points.
(2) In practice, comparative studies have found no statistically significant differences between the quality of the solutions obtained by these methods.

Drawbacks of Genetic Algorithms
(1) A drawback of any evolutionary algorithm is that a solution is "better" only in comparison to other, presently known solutions; such an algorithm has no concept of an "optimal solution" and no way to test whether a solution is optimal. (For this reason, evolutionary algorithms are best employed on problems where it is difficult or impossible to test for optimality.) This also means that an evolutionary algorithm never knows for certain when to stop, aside from the length of time, or the number of iterations or candidate solutions, that one is willing to allow it to explore.
(2) While the great advantage of GAs is the fact that they find a solution through evolution, this is also their biggest disadvantage. Evolution is inductive; in nature, life does not evolve towards a good solution - it evolves away from bad circumstances. This can cause a species to evolve into an evolutionary dead end.
(3) GAs show very fast initial convergence, followed by progressively slower improvements. In the later stages, the fitness values of the population members may all be similar, so convergence is slow. (It is therefore sometimes useful to combine a GA with a local optimization method.)
(4) In the presence of a lot of noise, convergence is difficult and a local optimization technique may be useless.
(5) Models with many parameters are computationally expensive.
(6) Sometimes models that are not particularly good, but are better than the rest of the population, cause premature convergence to a local minimum.
(7) For specific optimization problems and problem instances, other optimization algorithms may find better solutions than genetic algorithms (given the same amount of computation time). Alternative and complementary algorithms include evolution strategies, evolutionary programming, simulated annealing and methods based on integer linear programming. The question of which problems, if any, are particularly suited to genetic algorithms (in the sense that such algorithms are better than others) is open and controversial.

