
Web Site: www.ijettcs.org Email: editor@ijettcs.org, editorijettcs@gmail.com
Volume 1, Issue 3, September-October 2012, ISSN 2278-6856

Multiobjective Optimization Using Genetic Algorithm


Md. Saddam Hossain Mukta(1), T.M. Rezwanul Islam(2) and Sadat Maruf Hasnayen(3)

(1,2,3) Department of Computer Science and Information Technology, Islamic University of Technology, Board Bazaar, Gazipur, Bangladesh

Abstract: In Multi-objective optimization problems (MOPs), the objective vector can be scalarized into a single objective, but the resulting objective is highly sensitive to the weight vector and requires the user to have knowledge of the underlying problem. Moreover, for Multi-objective optimization problems one may require a set of Pareto-optimal points in the search space rather than a single point. Since a Genetic Algorithm (GA) works with a set of candidate solutions called a population, it is natural to adapt GA schemes to Multi-objective optimization so that a number of solutions can be captured simultaneously. Although many techniques have been developed to solve these problems, namely VEGA, MOGA, NPGA, NSGA, etc., all of them have shortcomings. This proposal describes a new approach that subdivides the population with respect to each overlapping pair of objective functions and then merges the subpopulations through genetic operations.

Keywords: Genetic Algorithm, Evolutionary Computation.

1. INTRODUCTION
Most optimization problems naturally have several objectives to be achieved, normally conflicting with each other. Problems with several objectives are called Multi-objective or vector optimization problems, and were originally studied in the context of economics and operations research. Scientists and engineers soon realized, however, that such problems arise naturally in all areas of knowledge. Over the years, the work of a considerable number of operational researchers has produced an important set of techniques to deal with Multi-objective optimization problems (Miettinen, 1998). Yet it is only relatively recently that researchers have realized the potential of evolutionary algorithms (EAs) in this area. The most recent developments of such schemes are VEGA, MOGA, NPGA, NSGA and NSGA-II. Most of them are successful on many test suites for Evolutionary Multi-objective Optimization (EMOO), but they also encounter difficulties, and recent research trends are mainly directed at devising new approaches to handling Pareto-optimal solutions. This proposal concentrates on one such new approach.

2. OBJECTIVE
2.1 Multi-objective Optimization Problem
As noted above, such problems have several, normally conflicting, objectives. We are interested in solving multi-objective optimization problems (MOPs) of the form:

Opt [f1(x), f2(x), ..., fk(x)]^T    (1.1)

subject to the m inequality constraints

gi(x) <= 0,  i = 1, 2, ..., m    (1.2)

and the p equality constraints

hi(x) = 0,  i = 1, 2, ..., p    (1.3)

where k is the number of objective functions fi: R^n -> R. We call x = [x1, x2, ..., xn]^T the vector of decision variables. We wish to determine, from among the set F of all vectors that satisfy (1.2) and (1.3), the particular vector x* = [x1*, x2*, ..., xn*] that yields the optimum values of all the objective functions.

2.2 Genetic Algorithm
The past decade has witnessed a flurry of interest within the financial industry in artificial intelligence technologies, including neural networks, fuzzy systems, and genetic algorithms. In many ways genetic algorithms, and their extension genetic programming, offer an outstanding combination of flexibility, robustness, and simplicity. "Genetic algorithms are based on a biological metaphor: They view learning as a competition among a population of evolving candidate problem solutions. A 'fitness' function evaluates each solution to decide whether it will contribute to the next generation of solutions. Then, through operations analogous to gene transfer in sexual reproduction, the algorithm creates a new population of candidate solutions." Genetic algorithms arise when computers evaluate and improve a population of possible solutions to a problem in a stepwise fashion. The program evolves by letting good solutions produce offspring while bad solutions die out. Over time, the individual solutions in the population become better and better, producing a final, best solution. The method uses terms derived from biology, such as generation, inheritance and mutation, to describe the manipulations the computer performs at each step of improvement, hence the name genetic algorithm.

The genetic algorithm is a probabilistic search algorithm that iteratively transforms a set (called a population) of mathematical objects (typically fixed-length binary character strings), each with an associated fitness value, into a new population of offspring, using the Darwinian principle of natural selection and operations patterned after naturally occurring genetic operations such as crossover (sexual recombination) and mutation. Virtually every technical discipline, from science and engineering to finance and economics, frequently encounters problems of optimization. Although optimization techniques abound, they often involve identifying the values of a sequence of explanatory parameters associated with the best performance of an underlying dependent, scalar-valued objective function. For example, in simple 3-D space this amounts to finding the (x, y) point associated with the optimal z value above or below the x-y plane, where the scalar z is a surface defined by the objective function f(x, y). Or it may involve estimating a large number of parameters of a more elaborate econometric model; for example, we might wish to estimate the coefficients of a generalized autoregressive conditional heteroskedasticity (GARCH) model, in which the log-likelihood function is the objective to maximize. In each case, the unknowns may be thought of as a parameter vector V, and the objective function z = f(V) as a transformation of a vector-valued input into a scalar-valued performance metric z. Optimization may take the form of minimization or maximization. Throughout this article, optimization will refer to maximization without loss of generality, because maximizing f(V) is the same as minimizing -f(V). The preference for maximization is simply intuitive: genetic algorithms are based on evolutionary processes and Darwin's concept of natural selection, and in a GA context the objective function is usually referred to as a fitness function, so the phrase "survival of the fittest" implies a maximization procedure.

2.3 Sharing on MOO
Most experimental MOEAs incorporate phenotype-based sharing, using the distance between objective vectors for consistency. A sharing function [2] determines the degradation of an individual's fitness due to a neighbor at some distance dist. A sharing function sh is defined as a function of distance with the following properties:

0 <= sh(dist) <= 1 for all dist, sh(0) = 1, and sh(dist) -> 0 as dist -> infinity.

Many sharing functions satisfy these conditions. One common choice is

sh(dist) = 1 - (dist / sigma_share) if dist < sigma_share, and sh(dist) = 0 otherwise.

The shared fitness of an individual x is then given by

eval'(x) = eval(x) / m(x),

where m(x) is the niche count for the individual x:

m(x) = sum over all y in the population of sh(dist(x, y)).

The sum over all y in the population includes the string x itself; consequently, if string x is all by itself in its own niche, its fitness value does not decrease (m(x) = 1). Otherwise, the fitness is decreased proportionally to the number and closeness of neighboring points: when many individuals are in the same neighborhood they contribute to one another's niche counts, thus derating one another's fitness values. As a result, this technique limits the uncontrolled growth of particular species within a population. Sharing occurs only if both solutions are dominated or non-dominated with respect to the comparison set. When a sigma_share value is used in this way, the associated niche count is simply the number of vectors within sigma_share in phenotype space, rather than a degradation value applied against unshared fitness.
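The sharing scheme above can be rendered directly in code. This is an illustrative Python sketch of sh, m(x) and eval'(x); the toy population, distance measure and sigma_share value are invented for the example:

```python
# Triangular sharing function and niche-count fitness degradation, as
# described in the text. The population, raw fitness, and sigma_share
# below are made-up example values.

def sh(dist, sigma_share):
    """Sharing function: 1 at distance 0, decaying linearly to 0 at sigma_share."""
    if dist < sigma_share:
        return 1.0 - dist / sigma_share
    return 0.0

def niche_count(x, population, sigma_share, dist):
    # m(x) = sum over all y (including x itself) of sh(dist(x, y)).
    return sum(sh(dist(x, y), sigma_share) for y in population)

def shared_fitness(x, population, fitness, sigma_share, dist):
    # eval'(x) = eval(x) / m(x); an isolated individual keeps its raw fitness.
    return fitness(x) / niche_count(x, population, sigma_share, dist)

# Toy example in a one-dimensional objective space.
pop = [0.0, 0.05, 0.9]
dist = lambda a, b: abs(a - b)
fit = lambda x: 1.0

# The two crowded individuals derate each other; the isolated one does not.
print([round(shared_fitness(x, pop, fit, 0.1, dist), 3) for x in pop])
```

Note how the two neighbors at 0.0 and 0.05 each end up with m(x) = 1.5 and a reduced shared fitness, while the individual at 0.9 keeps m(x) = 1.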

The solution with the smaller niche count is selected for inclusion in the next generation. sigma_share represents how close two individuals must be in order to decrease each other's fitness. This value commonly depends on the number of optima in the search space. As this number is generally unknown, and because the shape of PF_true within objective space is also unknown, sigma_share's value is assigned using Fonseca's suggested method (Fonseca and Fleming, 1998a):

N = ( prod_{i=1}^{k} (Delta_i + sigma_share) - prod_{i=1}^{k} Delta_i ) / sigma_share^k,

where N is the number of individuals in the population, Delta_i is the difference between the maximum and the minimum objective values in dimension i, and k is the number of distinct MOP objectives. As all variables but one are known, sigma_share can easily be computed. For example, if k = 2, Delta_1 = Delta_2 = 1 and N = 50, the above equation simplifies to:

sigma_share = (Delta_1 + Delta_2) / (N - 1) = 2/49, approximately 0.041.

2.4 Pareto Optimality
We normally look for trade-offs, rather than single solutions, when dealing with multi-objective optimization problems; the notion of optimum is therefore different. In multi-objective optimization, optimality must interrelate the relative values of the different criteria (if we want to compare apples with oranges, we must come up with a different definition of optimality). The most commonly adopted notion is the one originally proposed by Vilfredo Pareto, and we will use the term Pareto optimum. Assuming minimization, a point x* is Pareto optimal if there exists no feasible vector x such that

fi(x) <= fi(x*) for all i = 1, ..., k, and fj(x) < fj(x*) for at least one j.

3. REVIEW OF MOO APPROACHES
VEGA: David Schaffer (1985) proposed an approach called the Vector Evaluated Genetic Algorithm (VEGA), which differed from the simple genetic algorithm (GA) only in the way selection was performed. The selection operator was modified so that at each generation a number of subpopulations was generated by performing proportional selection according to each objective function in turn. Thus, for a problem with k objectives and a population of size M, k subpopulations of size M/k each would be generated. These subpopulations would be shuffled together to obtain a new population of size M, to which the GA would apply the crossover and mutation operators in the usual way. This approach has proved well suited to some problems, but it still suffers from the middling effect.

MOGA: Fonseca and Fleming (1993) proposed the Multi-objective Genetic Algorithm (MOGA). In this scheme, the rank of an individual corresponds to the number of individuals in the current population by which it is dominated. All non-dominated individuals are assigned rank 1, while dominated ones are penalized according to the population density of the corresponding region of the trade-off surface. Its main weakness is its dependence on the sharing factor (maintaining diversity is the main issue when dealing with Evolutionary Multi-objective Optimization).

NPGA: Horn et al. (1994) proposed the Niched Pareto Genetic Algorithm, which uses a tournament selection scheme based on Pareto dominance. Two individuals are compared against a set of members of the population (typically 10% of the population size). When both competitors are either dominated or non-dominated (i.e., there is a tie), the result of the tournament is decided through fitness sharing in the objective domain (a technique called equivalence class sharing was used in this case) (Horn et al., 1994). Its main weakness is that, besides requiring a sharing factor, this approach also requires an additional parameter: the size of the tournament.

NSGA: The Non-dominated Sorting Genetic Algorithm (NSGA) was proposed by Srinivas and Deb (1994) and is based on several layers of classification of the individuals. Before selection is performed, the population is ranked on the basis of domination (Pareto ranking): all non-dominated individuals are classified into one category, with a dummy fitness value proportional to the population size. To maintain population diversity, these classified individuals share their dummy fitness values (in decision-variable space). Then this group of classified individuals is removed from the population and another layer of non-dominated individuals is considered (i.e., the remainder of the population is re-classified). The process continues until all individuals in the population are classified. Since individuals in the first front have the maximum fitness value, they always get more copies than the rest of the population. However, some researchers have reported that NSGA has lower overall performance than MOGA, and it seems to be more sensitive to the value of the sharing factor than MOGA (Coello, 1996; Veldhuizen, 1999). A later version, NSGA-II, was proposed by Deb et al. and is more efficient than NSGA.

Recent approaches: Recently, several new Evolutionary Multi-objective Optimization approaches have been developed, namely PAES and SPEA. The Pareto Archived Evolution Strategy (PAES) was introduced by Knowles and Corne (2000a). This approach is very simple: it uses a (1+1) evolution strategy (i.e., a single parent that generates a single offspring) together with a historical archive that records all the non-dominated solutions previously found (the archive is used as a comparison set in a way analogous to the tournament competition in NPGA). The Strength Pareto Evolutionary Algorithm (SPEA) was introduced by Zitzler and Thiele (1999) and was conceived as a way of integrating different Evolutionary Multi-objective Optimization techniques.
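Fonseca's sigma_share equation can be checked numerically: once N, k and the Delta_i are fixed, sigma_share is the only unknown. The bisection solver below is our own illustrative addition, not part of the original method:

```python
# Solve N = (prod(Delta_i + sigma) - prod(Delta_i)) / sigma**k for sigma.
# For this equation the left-hand side, viewed as a function of sigma,
# decreases as sigma grows, so simple bisection suffices.

from functools import reduce

def niche_equation(sigma, deltas):
    k = len(deltas)
    prod_plus = reduce(lambda a, d: a * (d + sigma), deltas, 1.0)
    prod_delta = reduce(lambda a, d: a * d, deltas, 1.0)
    return (prod_plus - prod_delta) / sigma ** k

def solve_sigma_share(N, deltas, lo=1e-6, hi=10.0, iters=100):
    # Bisect for the sigma at which the niche equation equals N.
    for _ in range(iters):
        mid = (lo + hi) / 2
        if niche_equation(mid, deltas) > N:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# The paper's example: k = 2, Delta_1 = Delta_2 = 1, N = 50 should give
# sigma_share = (Delta_1 + Delta_2) / (N - 1) = 2/49.
sigma = solve_sigma_share(50, [1.0, 1.0])
print(round(sigma, 4))  # → 0.0408
```

The solver reproduces the closed-form value 2/49 ≈ 0.041 quoted in the text, and generalizes to k > 2 where no simple closed form is given.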

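The Pareto dominance relation and the layered non-dominated classification used by MOGA and NSGA can be sketched as follows; minimization is assumed and the sample points are hypothetical:

```python
# Pareto dominance and layered non-dominated sorting (NSGA-style peeling).
# Objective vectors are tuples; smaller is better in every coordinate.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_layers(points):
    # Repeatedly peel off the current non-dominated front, as NSGA does
    # before assigning dummy fitness values layer by layer.
    remaining = list(points)
    layers = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        layers.append(front)
        remaining = [p for p in remaining if p not in front]
    return layers

pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(nondominated_layers(pts))
```

Here (1, 5), (2, 3) and (4, 1) form the first front, (3, 4) is dominated only by (2, 3) and falls into the second layer, and (5, 5) into the third.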
4. PROPOSED APPROACH
4.1 Proposed New Approach
The main idea behind the proposed approach is taken from VEGA and NSGA. In VEGA, the initial population of size M is divided into k subpopulations (each of size M/k), each selected on one of the k separate objectives. In our approach, the population is instead divided into k-1 subpopulations, each of size M/(k-1), where M and k mean the same as in VEGA. Suppose the objective functions are f1, f2, f3, ..., fk. The first subpopulation is created with respect to performance on f1 and f2, the second with respect to f2 and f3, and so on; the (k-1)-th subpopulation is created from fk-1 and fk. Every subpopulation is then ranked and its fitness shared (a notion analogous to NSGA) to maintain population diversity and non-dominated individuals. Let us enumerate these subpopulations as s11, s12, s13, ..., s1,k-1, each of size M/(k-1). In the next step, we create k-2 subpopulations from s11, s12, s13, ..., s1,k-1. The first of these (enumerated s21) is derived from elite members (non-dominated solutions with respect to the f1, f2 and f2, f3 pairs) of subpopulations s11 and s12: we take two elite individuals from s11 and s12 respectively and apply crossover, iterating until M/(k-2) individuals fill the s21 subpopulation. In the same way, the remaining subpopulations s22, s23, ..., s2,k-2 are created. In the next step, k-3 subpopulations are generated (each of size M/(k-3)). At every step, fitness is shared among the individuals of every subpopulation, and non-dominated individuals receive relatively high fitness. This iteration (ranking, fitness sharing, crossover and merging) stops when the subpopulations all merge into a single population of size M; it runs k-1 times when the total number of objectives is k. The overall process is illustrated in the figure below.

Figure 1: Overall process of the proposed approach (subdivision by objective pairs and stepwise merging).

4.2 Underlying Philosophy
In this new approach, at the first step we select a subpopulation that thrives with respect to f1 and f2; the next subpopulation performs best for f2 and f3. If we take the elite individuals from these two subpopulations and apply crossover, the offspring may naturally achieve good performance with respect to f1, f2 and f3. After the overall iteration, the newly generated population (of size M) may perform well on all k objectives. After ranking and fitness sharing (according to non-domination), the last generation may contain the Pareto-optimal points that are the goal of our search.

4.3 Disadvantages of Some Prior Approaches
1) In VEGA we have k objective functions and a population of size M, so each subpopulation has size M/k. In the next step the subpopulations are shuffled together and the genetic operators are applied to them. After shuffling we never obtain fitter values for each objective separately; VEGA effectively averages them. Our aim, by contrast, is to gradually obtain fitter values, which is strictly followed in our technique. This problem in VEGA is called middling performance.

Fig.: The VEGA scheme.

2) NSGA has lower overall performance and seems to be more sensitive to the value of the sharing factor.

4.4 Strengths
This approach includes some computational strengths:
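The overlapping-pair split at the heart of the proposal can be sketched structurally as follows. The rank-and-share step is replaced here by a crude sum-of-objectives ranking, so this is a sketch of the subdivision only, not the paper's exact operators:

```python
# Split a population of size M into k-1 subpopulations, the i-th tied to
# the objective pair (f_i, f_{i+1}), as in the proposed approach. The
# pair-wise "ranking" below (sum of the two objectives) is a simplified
# placeholder for the paper's rank-and-share step.

def pair_subpopulations(population, objectives):
    """Return k-1 groups of size M/(k-1), each selected on (f_i, f_{i+1})."""
    k = len(objectives)
    m = len(population) // (k - 1)          # size M/(k-1) per subpopulation
    subpops = []
    for i in range(k - 1):
        pair = (objectives[i], objectives[i + 1])
        # Keep the m individuals that do best on this objective pair.
        ranked = sorted(population, key=lambda x: pair[0](x) + pair[1](x))
        subpops.append(ranked[:m])
    return subpops

# Toy problem with k = 3 objectives and M = 8 individuals.
f = [lambda x: x ** 2, lambda x: (x - 1) ** 2, lambda x: (x + 1) ** 2]
pop = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0, 3.0]
subs = pair_subpopulations(pop, f)
print(len(subs), [len(s) for s in subs])  # k-1 = 2 subpopulations of size 4
```

Each later merging step would take elites from adjacent groups and cross them, shrinking the number of groups by one per iteration until a single population of size M remains.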


i. Fitness measurement is done in the first step using a Min-Max formulation or a distance function (some non-linear fitness measuring scheme should also be accounted for).
ii. After the first iteration, the NSGA procedure can be applied to achieve a more accurate result (i.e., the approach can be embedded into the classification phase of NSGA).
iii. Some parameters, such as the population size (M) and the generation count (k-1), can be predicted.
iv. For future implementations, niche methods and crowding can easily be applied.
v. Parallel implementation is also possible.
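Strength (i) mentions a Min-Max formulation or a distance function for fitness measurement. One common reading of these two schemes, sketched under our own assumptions (the ideal point and objectives below are invented for illustration):

```python
# Two classical scalar fitness measures for multi-objective problems:
# the Lp distance of the objective vector from an ideal point, and the
# min-max (worst single-objective deviation) formulation. Lower is better.

def distance_fitness(objectives, x, ideal, p=2):
    """Lp distance of f(x) from the ideal (demand-level) objective vector."""
    return sum(abs(f(x) - z) ** p for f, z in zip(objectives, ideal)) ** (1 / p)

def minmax_fitness(objectives, x, ideal):
    """Min-max formulation: the worst single-objective deviation from the ideal."""
    return max(abs(f(x) - z) for f, z in zip(objectives, ideal))

objs = [lambda x: x ** 2, lambda x: (x - 2) ** 2]
ideal = (0.0, 0.0)

# x = 1 balances both objectives; x = 0 is perfect on f1 but poor on f2.
print(round(distance_fitness(objs, 1, ideal), 3), minmax_fitness(objs, 0, ideal))
```

Both measures collapse the objective vector to one number without user-chosen weights, which is presumably why they are suggested for the first-step fitness measurement.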

5. IMPLEMENTATION
5.1 Algorithm
Initialize the population with random values
For i = 1 to MAXGENS
    Evaluate each subpopulation based on its pair of objective functions
    Assign shared fitness within each subpopulation
    Rank each subpopulation based on shared fitness value (best fit = highest rank)
    Apply 2-point crossover between each two consecutive subpopulations
    Merge step by step until one final population is obtained
End Loop
End

5.2 Test Functions
Among the many known MOEA test functions, we applied our approach to the following problem:
F = (f1(x, y), f2(x, y)), where -5 <= x, y <= 10,
f1(x, y) = x^2 + y^2,
f2(x, y) = (x - 5)^2 + (y - 5)^2.
From the obtained results it is evident that this method allows the population to converge very quickly.
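The test problem can be cross-checked with a crude random-sampling baseline: draw points in the feasible box, evaluate (f1, f2), and keep the mutually non-dominated set. This is a verification sketch of the problem itself, not the paper's GA:

```python
# Random-sampling baseline for the two-objective test problem above.
# For this problem the true Pareto-optimal decision vectors lie on the
# segment between (0, 0) and (5, 5), where the two squared distances
# trade off against each other.

import random

def f1(x, y):
    return x ** 2 + y ** 2

def f2(x, y):
    return (x - 5) ** 2 + (y - 5) ** 2

def dominates(a, b):
    return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))

random.seed(0)
points = [(random.uniform(-5, 10), random.uniform(-5, 10)) for _ in range(500)]
evals = [(f1(x, y), f2(x, y)) for x, y in points]

# Keep only mutually non-dominated objective vectors.
front = [e for e in evals if not any(dominates(o, e) for o in evals if o != e)]
print(len(front))
```

Plotting `front` gives the characteristic convex trade-off curve between the two squared-distance objectives; a working GA should approach the same curve with far fewer evaluations.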

5.3 Results
Snapshots of different generations are given below; Function 1 and Function 2 represent f1 and f2, respectively.

6. CONCLUSION
Even though a number of classical Multi-objective optimization techniques exist, they require some a priori problem information. Since genetic algorithms use a population of points, they may be able to find multiple Pareto-optimal solutions simultaneously. Schaffer's Vector Evaluated Genetic Algorithm (VEGA) and Deb's Non-dominated Sorting Genetic Algorithm (NSGA) show

excellent results in many test cases, but they are still not free from shortcomings. This paper has presented a new approach to solving Multi-objective optimization problems, and we hope this research will prove a success when carried out in full.
7. FUTURE PLAN
Our future plan is to measure performance on the basis of tests such as different statistical tests, Error Ratio (ER), Two Set Coverage (CS), Generational Distance (GD), Maximum Pareto Front Error (ME), Average Pareto Front Error (AE), and Hyperarea and Ratio (H, HR).

REFERENCES
[1] David E. Goldberg, "Genetic Algorithms in Search, Optimization and Machine Learning", Pearson Education Asia Ltd., New Delhi, 2000.
[2] Z. Michalewicz, "Genetic Algorithms + Data Structures = Evolution Programs", 3rd edn., Springer-Verlag, Berlin Heidelberg New York, 1996.
[3] Carlos A. Coello Coello, David A. Van Veldhuizen, Gary B. Lamont, "Evolutionary Algorithms for Solving Multi-Objective Problems", Kluwer Academic Publishers, ISBN 0306467623, May 2002.
[4] R. Sarker, M. Mohammadian and X. Yao (eds.), "Evolutionary Optimization", Operations Research and Management Science Series, Kluwer Academic Publishers.
[5] N. Srinivas and K. Deb, "Multiobjective Optimization Using Non-Dominated Sorting Genetic Algorithms", Kanpur Genetic Algorithms Laboratory (KanGAL), Indian Institute of Technology (IIT), Kanpur, India.
[6] K. Deb, "Genetic Algorithms for Optimization", KanGAL Report No. 2001002, 2001.
[7] K. Deb, "Single and Multi-Objective Optimization Using Evolutionary Computation", Department of Mechanical Engineering, Kanpur Genetic Algorithms Laboratory (KanGAL), KanGAL Report No. 2004002, Indian Institute of Technology (IIT), Kanpur, India.
[8] P. K. Shukla and K. Deb, "On Finding Multiple Pareto-Optimal Solutions Using Classical and Evolutionary Generating Methods", KanGAL Report No. 2005006, August 2005.

AUTHORS
Md. Saddam Hossain Mukta obtained his M.Sc. degree in Computer Science from the University of Trento, Italy, where he received the Opera Universita scholarship, and earned a B.Sc. degree in Computer Science and Information Technology from Islamic University of Technology (IUT), Gazipur, Bangladesh in 2006. His research interests are mainly focused on the Semantic Web, social computing, software engineering, HCI, image processing, web mining, and data and knowledge management. Currently he is a Lecturer in the Dept. of Computer Science and Engineering (CSE), Bangladesh University of Business and Technology (BUBT).

T.M. Rezwanul Islam obtained his B.Sc. degree in Computer Science and Information Technology from Islamic University of Technology (IUT), Gazipur, Bangladesh in 2011. He received the OIC (Organization of the Islamic Conference) scholarship for three years during his B.Sc. studies. His research interests are mainly focused on AI, evolutionary computation, software engineering, HCI, image processing, web mining, ubiquitous computing, and cognitive and computational neuroscience. Currently he is a Lecturer in the Dept. of Computer Science and Engineering (CSE), Bangladesh University of Business and Technology (BUBT).

Sadat Maruf Hasnayen obtained his B.Sc. degree in Computer Science and Information Technology from Islamic University of Technology (IUT), Gazipur, Bangladesh in 2011. He received the OIC (Organization of the Islamic Conference) scholarship for three years during his B.Sc. studies. His research interests are mainly focused on AI, evolutionary computation, and software engineering.
