
Ali Haydar Kayhan ¹, Huseyin Ceylan *, M. Tamer Ayvaz ², Gurhan Gurarslan ²

Department of Civil Engineering, Pamukkale University, TR-20070 Denizli, Turkey

ARTICLE INFO

Keywords: Particle swarm optimization; Hybridization; Spreadsheets; Solver; Optimization

ABSTRACT

This study deals with a new hybrid global-local optimization algorithm named PSOLVER that combines particle swarm optimization (PSO) and a spreadsheet Solver to solve continuous optimization problems. In the hybrid PSOLVER algorithm, PSO and Solver are used as the global and local optimizers, respectively. Thus, PSO and Solver work mutually, feeding each other with initial and sub-initial solution points, to produce fine initial solutions and to avoid local optima. A comparative study has been carried out to show the effectiveness of PSOLVER over the standard PSO algorithm. Then, six constrained and three engineering design problems have been solved, and the obtained results are compared with other heuristic and non-heuristic solution algorithms. The identified results demonstrate that the hybrid PSOLVER algorithm requires fewer iterations and gives more effective results than other heuristic and non-heuristic solution algorithms.

© 2010 Elsevier Ltd. All rights reserved.

1. Introduction

Optimization is the process of finding the best set of solutions to achieve an objective subject to given constraints. It is a challenging part of operations research and has a wide variety of applications in economics, engineering and management sciences (Zahara & Kao, 2009). During the last decades, a huge number of solution algorithms have been proposed for solving optimization problems. These algorithms may be mainly classified under two categories: non-heuristic and heuristic algorithms. Non-heuristic algorithms are mostly gradient-based search methods and are very efficient at finding local optimum solutions in a reasonable time. However, they usually require gradient information to find the search directions (Lee & Geem, 2005). Thus, they may be inefficient for solving problems where the objective function and the constraints are not differentiable. Therefore, there has been an increasing interest in using heuristic algorithms to solve optimization problems.

Heuristic optimization algorithms get their mathematical basis from natural phenomena. The most widely used heuristic optimization algorithms are genetic algorithms (GA) (Goldberg, 1989; Holland, 1975), tabu search (TS) (Glover, 1977), simulated annealing (SA) (Kirkpatrick, Gelatt, & Vecchi, 1983), ant colony optimization (ACO) (Dorigo & Di Caro, 1999), particle swarm optimization (PSO) (Kennedy & Eberhart, 1995), and harmony search (HS) (Geem, Kim, & Loganathan, 2001). Although these algorithms are very effective at exploring the search space, they require a relatively long time to precisely find the local optimum (Ayvaz, Kayhan, Ceylan, & Gurarslan, 2009; Fesanghary, Mahdavi, Minary-Jolandan, & Alizadeh, 2008; Houck, Joines, & Kay, 1996; Houck, Joines, & Wilson, 1997; Michalewicz, 1992).

Recently, hybrid global-local optimization algorithms have become popular solution approaches for solving optimization problems. These algorithms integrate the global exploring feature of heuristic algorithms and the local fine-tuning feature of non-heuristic algorithms. Through this integration, optimization problems can be solved more effectively than by either global or local optimization algorithms alone (Shannon, 1998). In these algorithms, the global optimization process searches for the optimum solution with multiple solution vectors, and then the local optimization process adjusts the results of the global optimization by taking them as its initial solution (Ayvaz et al., 2009). However, their main drawback is that programming the non-heuristic optimization algorithms may be difficult, since they require some mathematical calculations such as taking partial derivatives, calculating Jacobian and/or Hessian matrices, taking matrix inversions, etc. Besides, they may require an extra effort to handle the given constraint set through non-heuristic algorithms.

Recently, the popularity of spreadsheets in solving optimization problems has been increasing through their mathematical add-ins. Most available spreadsheet packages are coupled with a Solver

0957-4174/$ - see front matter © 2010 Elsevier Ltd. All rights reserved.
doi:10.1016/j.eswa.2010.03.046

* Corresponding author. Tel.: +90 258 296 3386; fax: +90 258 296 3382.
E-mail addresses: hkayhan@pamukkale.edu.tr (A.H. Kayhan), hceylan@pamukkale.edu.tr (H. Ceylan), tayvaz@pamukkale.edu.tr (M.T. Ayvaz), gurarslan@pamukkale.edu.tr (G. Gurarslan).
¹ Tel.: +90 258 296 3393; fax: +90 258 296 3382.
² Tel.: +90 258 296 3384; fax: +90 258 296 3382.

Expert Systems with Applications xxx (2010) xxx-xxx

Contents lists available at ScienceDirect

Expert Systems with Applications

journal homepage: www.elsevier.com/locate/eswa

ARTICLE IN PRESS

Please cite this article in press as: Kayhan, A. H., et al. PSOLVER: A new hybrid particle swarm optimization algorithm for solving continuous optimization

problems. Expert Systems with Applications (2010), doi:10.1016/j.eswa.2010.03.046

add-in (Frontline System Inc., 1999), which can solve many nonlinear optimization problems without requiring much knowledge about non-heuristic algorithms, and so is extremely easy to use (Stokes & Plummer, 2004). Solver solves optimization problems through the generalized reduced gradient (GRG) algorithm (Lasdon, Waren, Jain, & Ratner, 1978) and can handle many linear and nonlinear optimization problems (Ayvaz et al., 2009).

The main objective of this study is to develop a new hybrid global-local optimization algorithm for solving constrained optimization problems. With this purpose, a new hybrid solution algorithm, PSOLVER, is proposed. In the PSOLVER algorithm, PSO is used as the global optimizer and is integrated with a spreadsheet Solver to improve the PSO results. The performance of the PSOLVER algorithm is tested on several constrained optimization problems, and the results are compared with other solution methods in terms of solution accuracy and the number of function evaluations. The identified results showed that the PSOLVER algorithm requires fewer function evaluations and gives more effective results than other solution algorithms.

The remainder of this study is organized as follows: First, the main structure of the PSO algorithm is described; second, the necessary steps of building the PSOLVER algorithm are presented; and finally, the performance of the proposed model is tested on different constrained optimization problems.

2. The particle swarm optimization algorithm

The PSO algorithm, first proposed by Kennedy and Eberhart (1995), was developed based on observations of the social behavior of animals, such as bird flocking or fish schooling. Like other evolutionary algorithms, PSO is a population-based optimization algorithm. In PSO, the members of the population are called the swarm, and each individual within the swarm is called a particle. During the solution process, each particle in the swarm explores the search space through its current position and velocity. In order to solve an optimization problem using PSO, initially, all the positions and velocities are randomly generated from the feasible search space. Then, the velocity of each particle is updated based on its individual experience and the experiences of the other particles. This task is performed by updating the velocity of each particle using the best position of the related particle and the overall best position visited by the other particles. Finally, the positions of the particles are updated through their new velocities, and this process is iterated until the given termination criterion is satisfied. This solution sequence ensures that each particle in the swarm can learn from its own experience (local search) and from the experience of the group (global search). The mathematical

statement of the PSO algorithm can be given as follows:

Let $f$ be the fitness function governing the problem, $n$ be the number of particles in the swarm, $m$ be the dimension of the problem (e.g., the number of decision variables), $\mathbf{x}_i = (x_{i1}, x_{i2}, \ldots, x_{im})^T$ and $\mathbf{v}_i = (v_{i1}, v_{i2}, \ldots, v_{im})^T$ be the vectors that contain the current positions and velocities of the particles in each dimension, $\hat{\mathbf{x}}_i = (\hat{x}_{i1}, \hat{x}_{i2}, \ldots, \hat{x}_{im})^T$ be the vector that contains the current best position of each particle in each dimension, $\hat{\mathbf{g}} = (g_1, g_2, \ldots, g_m)^T$ be the vector that contains the global best position in each dimension ($\forall i = 1, 2, \ldots, n$ and $\forall j = 1, 2, \ldots, m$), and $T$ be the transpose operator. The new velocities of the particles are calculated as follows:

$$\mathbf{v}_i^{k+1} = \omega \mathbf{v}_i^k + c_1 r_1 \left( \hat{\mathbf{x}}_i - \mathbf{x}_i^k \right) + c_2 r_2 \left( \hat{\mathbf{g}} - \mathbf{x}_i^k \right), \quad \forall i = 1, 2, \ldots, n \qquad (1)$$

where $k$ is the iteration index, $\omega$ is the inertia constant, $c_1$ and $c_2$ are the acceleration coefficients, which are used to determine how much the particle's personal best and the global best influence its movement, and $r_1$ and $r_2$ are uniform random numbers between 0 and 1. Note that the values of $\omega$, $c_1$ and $c_2$ control the impact of the previous historical values of the particle velocities on the current one. A larger value of $\omega$ leads to global exploration, whereas smaller values result in a finer search within the solution space. Therefore, a suitable selection of $\omega$, $c_1$ and $c_2$ provides a balance between the global and local search processes (Salman, Ahmad, & Al-Madani, 2002). Note that the terms $c_1 r_1 (\hat{\mathbf{x}}_i - \mathbf{x}_i^k)$ and $c_2 r_2 (\hat{\mathbf{g}} - \mathbf{x}_i^k)$ in Eq. (1)

are called the cognition and social terms, respectively. The cognition term takes into account only the particle's own experience, whereas the social term signifies the interaction between the particles. The particles' velocities in a swarm are usually bounded by a maximum velocity $\mathbf{v}^{\max} = (v_1^{\max}, v_2^{\max}, \ldots, v_m^{\max})^T$, which is calculated as a fraction of the entire search space as follows (Shi & Eberhart, 1998):

$$\mathbf{v}^{\max} = c \left( \mathbf{x}^{\max} - \mathbf{x}^{\min} \right) \qquad (2)$$

where $c$ is a fraction ($0 \le c < 1$), and $\mathbf{x}^{\max} = (x_1^{\max}, x_2^{\max}, \ldots, x_m^{\max})^T$ and $\mathbf{x}^{\min} = (x_1^{\min}, x_2^{\min}, \ldots, x_m^{\min})^T$ are the vectors that contain the upper and lower bounds of the search space for each dimension, respectively. After the velocity updating process is performed through Eqs. (1) and (2), the new positions of the particles are calculated as follows:

$$\mathbf{x}_i^{k+1} = \mathbf{x}_i^k + \mathbf{v}_i^{k+1}, \quad \forall i = 1, 2, \ldots, n \qquad (3)$$

After the calculation of Eq. (3), the corresponding fitness values are calculated based on the new positions of the particles. Then, the values of $\hat{\mathbf{x}}_i$ and $\hat{\mathbf{g}}$ ($\forall i = 1, 2, \ldots, n$) are updated. This solution procedure is repeated until the given termination criterion has been satisfied. Fig. 1 shows the step-by-step solution procedure of the PSO algorithm (Wikipedia, 2009).
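The update rules of Eqs. (1)-(3) translate almost directly into code. The following is a minimal, illustrative Python sketch of the standard PSO loop with velocity clamping; the test function, swarm size, iteration count and parameter values here are our own assumptions for demonstration, not the settings used later in the paper.

```python
import random

def pso(f, bounds, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, frac=0.1, seed=1):
    """Minimize f over the box `bounds` with a basic PSO (Eqs. (1)-(3))."""
    random.seed(seed)
    m = len(bounds)
    vmax = [frac * (hi - lo) for lo, hi in bounds]              # Eq. (2)
    x = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    v = [[random.uniform(-vm, vm) for vm in vmax] for _ in range(n)]
    pbest = [xi[:] for xi in x]                                 # personal bests
    pval = [f(xi) for xi in x]
    b = min(range(n), key=lambda i: pval[i])
    g, gval = pbest[b][:], pval[b]                              # global best
    for _ in range(iters):
        for i in range(n):
            for j in range(m):
                r1, r2 = random.random(), random.random()
                v[i][j] = (w * v[i][j]                          # inertia
                           + c1 * r1 * (pbest[i][j] - x[i][j])  # cognition term
                           + c2 * r2 * (g[j] - x[i][j]))        # social term
                v[i][j] = max(-vmax[j], min(vmax[j], v[i][j]))  # velocity clamp
                lo, hi = bounds[j]
                x[i][j] = max(lo, min(hi, x[i][j] + v[i][j]))   # Eq. (3)
            fx = f(x[i])
            if fx < pval[i]:
                pval[i], pbest[i] = fx, x[i][:]
                if fx < gval:
                    gval, g = fx, x[i][:]
    return g, gval

# Example: minimize a sphere function; the optimum is 0 at the origin.
best, val = pso(lambda z: sum(t * t for t in z), [(-5.0, 5.0)] * 3)
```

Bound handling by simple clipping is one of several common choices; the paper does not specify this detail.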

PSO has been applied to a wide variety of disciplines, including neural network training (Eberhart & Hu, 1999; Eberhart & Kennedy, 1995; Kennedy & Eberhart, 1995, 1997; Salerno, 1997; Van Den Bergh & Engelbrecht, 2000), biochemistry (Cockshott & Hartman, 2001), manufacturing (Tandon, El-Mounayri, & Kishawy, 2002), electromagnetism (Baumgartner, Magele, & Renhart, 2004; Brandstätter & Baumgartner, 2002; Ciuprina, Loan, & Munteanu, 2002), electrical power (Abido, 2002; Yoshida, Fukuyama, Takayama, & Nakanishi, 1999), optics (Slade, Ressom, Musavi, & Miller, 2004), structural optimization (Fourie & Groenwold, 2002; Perez & Behdinan, 2007; Venter & Sobieszczanski-Sobieski, 2004), end milling (Tandon, 2000) and structural reliability (Elegbede, 2005). Generally, it can be said that PSO is applicable to most optimization problems.

3. Development of hybrid PSOLVER algorithm

As indicated above, PSO is an efficient optimization algorithm and has been successfully applied to the solution of optimization problems. However, like other heuristic optimization algorithms, PSO is an evolutionary computation technique and may require long computational times to precisely find an exact optimum. Therefore, hybridizing PSO with a local search method becomes a good idea, such that PSO finds the possible solutions where the global optimum exists, and the local search method employs a fine search to precisely find the global optimum. This kind of solution approach makes the convergence rate faster than a pure global search and prevents the problem of trapping in local optima faced by a pure local search (Fan & Zahara, 2007).

The current literature includes several studies in which the PSO algorithm is integrated with local search methods. Fan, Liang, and Zahara (2004) developed a hybrid optimization algorithm which integrates PSO and the Nelder-Mead (NM) simplex search method for the optimization of multimodal test functions. Their results showed that the NM-PSO algorithm is superior to other search methods. Victoire and Jeyakumar (2004) integrated the


PSO algorithm with the sequential quadratic programming (SQP) technique for solving economic dispatch problems. In their PSO-SQP algorithm, PSO is used as the global optimizer, and SQP is used as the local optimizer which fine-tunes each solution of PSO. They tested their model's performance on three different economic dispatch problems. The results showed that their PSO-SQP algorithm provides better solutions than those of other solution methods. Kazuhiro, Shinji, and Masataka (2006) combined PSO and sequential linear programming (SLP) to solve structural optimization problems. Their results showed that the hybrid PSO-SLP finds very efficient results. Ghaffari-Miab, Farmahini-Farahani, Faraji-Dana, and Lucas (2007) developed a hybrid solution algorithm which integrates PSO and the gradient-based quasi-Newton method. They applied their hybrid model to the solution of complex time Green's functions of multilayer media. Their results indicated that the hybrid PSO algorithm is superior to other optimization techniques. Zahara and Hu (2008) developed a hybrid NM-PSO algorithm for solving constrained optimization problems. Their NM-PSO algorithm handles constraint sets by using both gradient repair and constraint fitness priority-based ranking operators. According to their results, NM-PSO with embedded constraint-handling operators is extremely effective and efficient at locating optimal solutions. As a later study, Zahara and Kao (2009) applied the NM-PSO algorithm of Zahara and Hu (2008) to the solution of engineering design problems with great success.

As summarized above, hybridizing the PSO algorithm with local search methods is an effective and efficient way to solve optimization problems. However, programming these hybrid algorithms may be a difficult task for non-specialists, since most of the local search methods require some complex mathematical calculations. Therefore, in this study, PSO is hybridized with a spreadsheet Solver, since this requires little knowledge about the programming of local search methods.

Solver is a powerful gradient-based optimization add-in, and most commercial spreadsheet products (Lotus 1-2-3, Quattro Pro, Microsoft Excel) are distributed with it. Solver solves linear and nonlinear optimization problems through the GRG algorithm (Lasdon et al., 1978). It works by first evaluating the functions and derivatives at a starting value of the decision vector, and then iteratively searches for a better solution using a search direction suggested by the derivatives (Stokes & Plummer, 2004). To determine a search direction, Solver uses the quasi-Newton and conjugate gradient methods. Note that the user is not required to provide the partial derivatives with respect to the decision variables in Solver. Instead, forward or central difference approximations are used in the search process (OTC, 2009). This may be the main advantage of using Solver as a local optimizer in this study.

It should be noted that the global optimizer PSO and the local optimizer Solver have been integrated by developing a running Visual Basic for Applications (VBA) code on the background of a spreadsheet platform (Excel for this study). In this integration, two separate running VBA codes have been developed. The first code includes the standard PSO algorithm and is used as the global optimizer. The second code is used for calling the Solver add-in and is developed by creating a VBA macro instead of manually calling the Solver add-in. Note that a macro is a series of commands grouped together as a single command to accomplish a task automatically, and it can be created through the macro recorder that saves the series of commands in VBA (Ferreira & Salcedo, 2001). The source code of the recorded macro can be easily modified in the Visual Basic Editor of the spreadsheet (Ferreira & Salcedo, 2001; Microsoft, 1995; Rosen, 1997). By using this feature of the spreadsheets, the recorded Solver macro is integrated with the developed PSO code on the VBA platform.

Note that, dealing with the use of a spreadsheet Solver as a local optimizer, Ayvaz et al. (2009) first proposed a hybrid optimization algorithm in which HS and the Solver are integrated to solve engineering optimization problems. With this purpose, they developed a hybrid HS-Solver algorithm. They tested the performance of the HS-Solver algorithm on 4 unconstrained, 4 constrained and 4 structural engineering problems. Their results indicated that the hybrid HS-Solver algorithm requires fewer function evaluations and finds better or identical objective function values than many non-heuristic and heuristic optimization algorithms.

It should be noted that Fesanghary et al. (2008) mention two approaches for integrating global and local search processes. In the first approach, the global search process explores the entire search space until the objective function improvement is negligible, and then the local search method performs a fine search by taking the best solution of the global search as a starting point. On the other hand, in the second approach, both global and local search processes work simultaneously, such that all the solutions of the global search are fine-tuned by the local search. When the optimized solution of the local search has a better objective function value than that of the global search, this solution is transferred to the global search, and the solution proceeds until

Fig. 1. Step by step solution procedure of PSO algorithm.


the given termination criterion is satisfied (Fesanghary et al., 2008; Ayvaz et al., 2009). Comparing the first and second approaches, it is obvious that the second approach provides better results than the first. However, the computational cost of the second approach is usually higher than that of the first one, since all the solutions of the global search will be subjected to local search. Note that the second approach is taken into account in this study, and the PSO and Solver optimizers are integrated based on a probability $P_c$, such that a globally generated solution vector is subjected to local search with probability $P_c$. Our trials and the recommendations of Fesanghary et al. (2008) and Ayvaz et al. (2009) indicate that use of a fairly small $P_c$ value is sufficient for solving many optimization problems. Therefore, we have used the probability $P_c = 0.01$ throughout the paper. After the given convergence criteria of the Solver are satisfied, the locally improved solution is included in the PSO swarm, and the global search proceeds until termination. Fig. 2 shows the step-by-step procedure of the PSOLVER algorithm.
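As a sketch of this second, simultaneous scheme: in each sweep, a globally generated solution is handed to the local optimizer with probability $P_c$, and the refined point re-enters the swarm if it is better. Since the spreadsheet Solver's GRG routine is not available outside Excel, the local step below is a simple derivative-free compass search used purely as a stand-in; `hybrid_step`, `local_refine` and their parameters are our own illustration, not the authors' code.

```python
import random

def local_refine(f, x, bounds, step=0.5, tol=1e-6):
    """Stand-in for the Solver call: a simple pattern (compass) search.

    The paper uses the spreadsheet Solver's GRG method here; this
    derivative-free loop only illustrates the local fine-tuning role.
    """
    x, fx = x[:], f(x)
    while step > tol:
        improved = False
        for j in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[j] = min(max(y[j] + d, bounds[j][0]), bounds[j][1])
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step /= 2.0                    # shrink the step and retry
    return x, fx

def hybrid_step(f, swarm, bounds, pc=0.01):
    """One PSOLVER-style sweep: refine each particle with probability pc
    and inject the improved point back into the swarm."""
    for i, xi in enumerate(swarm):
        if random.random() < pc:
            y, fy = local_refine(f, xi, bounds)
            if fy < f(xi):
                swarm[i] = y               # improved solution re-enters PSO
    return swarm
```

In the actual algorithm, the refined point also updates the personal and global bests before the next PSO iteration.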

4. Numerical applications

In this section, the performance of the PSOLVER algorithm is tested by solving several constrained optimization problems. However, before solving these examples, it may be essential to show the efficiency of PSOLVER over the standard PSO algorithm. With this purpose, a performance evaluation study has been performed by solving a common unconstrained optimization problem using both the PSOLVER and standard PSO algorithms. Then, six constrained benchmark problems and three well-known engineering design problems have been solved, and the results have been compared with other non-heuristic and heuristic optimization algorithms.

The related solution parameters of the PSOLVER algorithm were set as follows: the number of particles is set to $n = 21m + 1$ (Zahara & Kao, 2009), the acceleration coefficients are set to $c_1 = c_2 = 2$, the inertia factor is $\omega = 0.5 + \mathrm{rand}(0, 1)/2.0$ (Eberhart & Shi, 2001; Hu & Eberhart, 2001), the maximum velocity of the particles is $\mathbf{v}^{\max} = 0.1(\mathbf{x}^{\max} - \mathbf{x}^{\min})$, and the Solver run probability is set as $P_c = 0.01$. All the examples have been solved 30 times with different random number seeds to show the robustness of the algorithm. Note that two stopping criteria have been considered, such that the optimization process ends when the number of generations equals 1000 or when the reference or a better solution has been obtained.

4.1. Performance evaluation study: Michalewiczs test function

Michalewicz's function is a typical example of nonlinear multimodal functions, including $n!$ local optima (Michalewicz, 1992). The function can be given as follows:

$$\min f(\mathbf{x}) = -\sum_{k=1}^{n} \sin(x_k) \left[ \sin\!\left( \frac{k \, x_k^2}{\pi} \right) \right]^{2s} \qquad (4)$$

$$\text{s.t.} \quad 0 \le x_k \le \pi, \quad k = 1, 2, \ldots, n \qquad (4a)$$

where the parameter $s$ defines the steepness of the valleys or edges and is assumed to be 10 for this solution. This function has a known global optimum of $f(\mathbf{x}^*) = -4.687658$ for $n = 5$. Fig. 3 shows the solution space of the function when $n = 2$.
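Eq. (4) is straightforward to evaluate; a small Python version is given below for concreteness. The check against the commonly reported two-dimensional minimum of about $-1.8013$ near $(2.20, 1.57)$ is our own addition, not a figure from the paper.

```python
import math

def michalewicz(x, s=10):
    """Michalewicz test function of Eq. (4); search space is 0 <= x_k <= pi."""
    # enumerate() counts from 0, so k + 1 gives the 1-based index of Eq. (4).
    return -sum(math.sin(xk) * math.sin((k + 1) * xk * xk / math.pi) ** (2 * s)
                for k, xk in enumerate(x))
```

With $s = 10$ the even exponent $2s$ keeps every summand non-negative, which is what produces the flat plateaus and steep valleys visible in Fig. 3.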

Fig. 2. Step by step solution procedure of PSOLVER algorithm.

Fig. 3. Michalewicz's test function ($n = 2$).


It can be clearly seen from Fig. 3 that the solution of this function using a gradient-based optimization algorithm is a quite difficult task, since there are many locations where the gradient of the function equals zero. Therefore, solving this problem through gradient-based algorithms depends on the quality of the initial solutions. In order to test the performance of the PSOLVER algorithm, this problem has been solved using both the PSOLVER and standard PSO algorithms. Note that the same random number seeds, and thus the same initial solutions, have been used in both algorithms. Fig. 4 compares the convergence histories of both algorithms.

As can be seen from Fig. 4, both algorithms started from the same initial solution. Although both the PSO and hybrid PSOLVER algorithms find the optimum solution of $f(\mathbf{x}^*) = -4.687658$, PSOLVER requires far fewer function evaluations than PSO: PSOLVER requires only 456 function evaluations, whereas PSO requires 67,600 function evaluations to solve the same problem.

4.2. Example 1

The first minimization problem, which includes 13 decision variables and nine inequality constraints, is given in Eq. (5):

$$\min f(\mathbf{x}) = 5\sum_{i=1}^{4} x_i - 5\sum_{i=1}^{4} x_i^2 - \sum_{i=5}^{13} x_i \qquad (5)$$

$$\text{s.t.} \quad g_1(\mathbf{x}) = 2x_1 + 2x_2 + x_{10} + x_{11} - 10 \le 0 \qquad (5a)$$
$$g_2(\mathbf{x}) = 2x_1 + 2x_3 + x_{10} + x_{12} - 10 \le 0 \qquad (5b)$$
$$g_3(\mathbf{x}) = 2x_2 + 2x_3 + x_{11} + x_{12} - 10 \le 0 \qquad (5c)$$
$$g_4(\mathbf{x}) = -8x_1 + x_{10} \le 0 \qquad (5d)$$
$$g_5(\mathbf{x}) = -8x_2 + x_{11} \le 0 \qquad (5e)$$
$$g_6(\mathbf{x}) = -8x_3 + x_{12} \le 0 \qquad (5f)$$
$$g_7(\mathbf{x}) = -2x_4 - x_5 + x_{10} \le 0 \qquad (5g)$$
$$g_8(\mathbf{x}) = -2x_6 - x_7 + x_{11} \le 0 \qquad (5h)$$
$$g_9(\mathbf{x}) = -2x_8 - x_9 + x_{12} \le 0 \qquad (5i)$$
$$0 \le x_i \le 1, \quad i = 1, 2, \ldots, 9 \qquad (5j)$$
$$0 \le x_i \le 100, \quad i = 10, 11, 12 \qquad (5k)$$
$$0 \le x_{13} \le 1 \qquad (5l)$$
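As a quick check on Eq. (5), the sketch below encodes the objective and the nine inequality constraints in Python and verifies that the known optimum $\mathbf{x}^* = (1, \ldots, 1, 3, 3, 3, 1)$ of this standard benchmark is feasible with $f(\mathbf{x}^*) = -15$. The verification and the function names are our own additions.

```python
def f(x):
    # Objective of Eq. (5).
    return 5 * sum(x[:4]) - 5 * sum(t * t for t in x[:4]) - sum(x[4:13])

def g(x):
    # Left-hand sides of constraints (5a)-(5i); each must be <= 0.
    x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12, _ = x
    return [2*x1 + 2*x2 + x10 + x11 - 10,
            2*x1 + 2*x3 + x10 + x12 - 10,
            2*x2 + 2*x3 + x11 + x12 - 10,
            -8*x1 + x10, -8*x2 + x11, -8*x3 + x12,
            -2*x4 - x5 + x10, -2*x6 - x7 + x11, -2*x8 - x9 + x12]

xstar = [1.0] * 9 + [3.0, 3.0, 3.0, 1.0]
```

At this point the first three constraints and (5g)-(5i) are active (equal to zero), which is typical of the reported optimum.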

The optimal solution of this problem is at $\mathbf{x}^* = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1)$ with a corresponding function value of $f(\mathbf{x}^*) = -15$. This function was previously solved using an Evolutionary Algorithm (EA) (Runarsson & Yao, 2005), Cultural Differential Evolution (CDE) (Becerra & Coello, 2006), Filter Simulated Annealing (FSA) (Hedar & Fukushima, 2006), GA (Chootinan & Chen, 2006), and NM-PSO (Zahara & Hu, 2008). After applying the PSOLVER algorithm to this problem, we obtained the best solution at $\mathbf{x}^* = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1)$ with the corresponding objective value of $f(\mathbf{x}^*) = -15.000000$. Table 1 compares the identified results for the different solution algorithms.

As can be seen from Table 1, while the optimum solution was obtained using the GA, EA and NM-PSO algorithms after 95,512, 122,000 and 41,959 function evaluations, respectively, the PSOLVER algorithm requires only 679 function evaluations. Therefore, the PSOLVER algorithm is the most effective solution method among the compared methods in terms of the number of function evaluations.

4.3. Example 2

This minimization problem has two decision variables and two inequality constraints, as given in Eq. (6):

$$\min f(\mathbf{x}) = (x_1 - 10)^3 + (x_2 - 20)^3 \qquad (6)$$

$$\text{s.t.} \quad g_1(\mathbf{x}) = -(x_1 - 5)^2 - (x_2 - 5)^2 + 100 \le 0 \qquad (6a)$$
$$g_2(\mathbf{x}) = (x_1 - 6)^2 + (x_2 - 5)^2 - 82.81 \le 0 \qquad (6b)$$
$$13 \le x_1 \le 100 \qquad (6c)$$
$$0 \le x_2 \le 100 \qquad (6d)$$
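The objective and the two constraints of Eq. (6) are easy to encode; the snippet below is our own check confirming that the commonly reported optimum $\mathbf{x}^* = (14.095, 0.84296)$ lies essentially on the boundary of both constraints and yields $f(\mathbf{x}^*) \approx -6961.814$.

```python
def f(x1, x2):
    # Objective of Eq. (6).
    return (x1 - 10) ** 3 + (x2 - 20) ** 3

def g1(x1, x2):
    # Constraint (6a); must be <= 0.
    return -(x1 - 5) ** 2 - (x2 - 5) ** 2 + 100

def g2(x1, x2):
    # Constraint (6b); must be <= 0.
    return (x1 - 6) ** 2 + (x2 - 5) ** 2 - 82.81
```

Both constraints are active (near zero) at the optimum, so only a sliver of the box is feasible there, which is what makes this problem hard for purely global methods.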

This function has an optimal solution at $\mathbf{x}^* = (14.095, 0.84296)$ with a corresponding function value of $f(\mathbf{x}^*) = -6961.81388$. This problem was previously solved using EA (Runarsson & Yao, 2005), CDE (Becerra & Coello, 2006), FSA (Hedar & Fukushima, 2006), GA (Chootinan & Chen, 2006), and NM-PSO (Zahara & Hu, 2008). Among those studies, the best solution was reported by Zahara and Hu (2008), with an objective function value of $f(\mathbf{x}^*) = -6961.8240$, using the NM-PSO algorithm after 9856 iterations. We applied the PSOLVER algorithm to this problem and obtained the optimum solution with an objective function value of $f(\mathbf{x}^*) = -6961.8244$. This solution was obtained after 179 iterations. The comparison of the identified results for the different solution algorithms is given in Table 2. It can be clearly seen from Table 2 that the PSOLVER algorithm provides a better solution than the other solution algorithms, with a far smaller number of function evaluations.

4.4. Example 3

The third example has 3 constraints and 5 decision variables.

These are:

Fig. 4. Convergence history of PSO and PSOLVER.


$$\min f(\mathbf{x}) = e^{x_1 x_2 x_3 x_4 x_5} \qquad (7)$$

$$\text{s.t.} \quad g_1(\mathbf{x}) = x_1^2 + x_2^2 + x_3^2 + x_4^2 + x_5^2 - 10 = 0 \qquad (7a)$$
$$g_2(\mathbf{x}) = x_2 x_3 - 5 x_4 x_5 = 0 \qquad (7b)$$
$$g_3(\mathbf{x}) = x_1^3 + x_2^3 + 1 = 0 \qquad (7c)$$
$$-2.3 \le x_i \le 2.3, \quad i = 1, 2 \qquad (7d)$$
$$-3.2 \le x_i \le 3.2, \quad i = 3, 4, 5 \qquad (7e)$$
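The exponential objective of Eq. (7) and its three equality constraints can be verified numerically at the optimum quoted below; this small check is our own and confirms that the constraint residuals are near zero and $f(\mathbf{x}^*) \approx 0.05395$.

```python
import math

def f(x):
    # Objective of Eq. (7): exp of the product of all five variables.
    p = 1.0
    for t in x:
        p *= t
    return math.exp(p)

def h(x):
    # Residuals of the equality constraints (7a)-(7c); each must be 0.
    x1, x2, x3, x4, x5 = x
    return [x1*x1 + x2*x2 + x3*x3 + x4*x4 + x5*x5 - 10,
            x2*x3 - 5*x4*x5,
            x1**3 + x2**3 + 1]

xstar = [-1.717143, 1.595709, 1.827247, -0.7636413, -0.763645]
```

Equality constraints like these are a natural fit for the GRG-based local step, since GRG maintains feasibility along the reduced gradient.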

For this problem, the optimum solution is $\mathbf{x}^* = (-1.717143, 1.595709, 1.827247, -0.7636413, -0.763645)$, where $f(\mathbf{x}^*) = 0.0539498$. This problem was previously solved using EA (Runarsson & Yao, 2005), CDE (Becerra & Coello, 2006), FSA (Hedar & Fukushima, 2006), and NM-PSO (Zahara & Hu, 2008). Table 3 shows the optimal solutions of PSOLVER and the previous solution algorithms.

As can be seen from Table 3, the NM-PSO and PSOLVER algorithms give the best result, with an objective function value of $f(\mathbf{x}^*) = 0.053949$. It should be noted that the lowest standard deviation, which is observed with the PSOLVER algorithm, demonstrates its higher robustness in comparison with the other algorithms. The best solution vector, $\mathbf{x}^* = (\ldots, -0.763605, -0.763594)$, has been obtained after 779 function evaluations with PSOLVER, while the NM-PSO algorithm requires 265,548.

4.5. Example 4

This example has 5 decision variables and 6 inequality constraints, as given in Eq. (8):

$$\min f(\mathbf{x}) = 5.3578547 x_3^2 + 0.8356891 x_1 x_5 + 37.293239 x_1 - 40792.141 \qquad (8)$$

$$\text{s.t.} \quad g_1(\mathbf{x}) = 85.334407 + 0.0056858 x_2 x_5 + 0.0006262 x_1 x_4 - 0.0022053 x_3 x_5 - 92 \le 0 \qquad (8a)$$
$$g_2(\mathbf{x}) = -85.334407 - 0.0056858 x_2 x_5 - 0.0006262 x_1 x_4 + 0.0022053 x_3 x_5 \le 0 \qquad (8b)$$
$$g_3(\mathbf{x}) = 80.51249 + 0.0071317 x_2 x_5 + 0.0029955 x_1 x_2 + 0.0021813 x_3^2 - 110 \le 0 \qquad (8c)$$
$$g_4(\mathbf{x}) = -80.51249 - 0.0071317 x_2 x_5 - 0.0029955 x_1 x_2 - 0.0021813 x_3^2 + 90 \le 0 \qquad (8d)$$
$$g_5(\mathbf{x}) = 9.300961 + 0.0047026 x_3 x_5 + 0.0012547 x_1 x_3 + 0.0019085 x_3 x_4 - 25 \le 0 \qquad (8e)$$
$$g_6(\mathbf{x}) = -9.300961 - 0.0047026 x_3 x_5 - 0.0012547 x_1 x_3 - 0.0019085 x_3 x_4 + 20 \le 0 \qquad (8f)$$
$$78 \le x_1 \le 102 \qquad (8g)$$
$$33 \le x_2 \le 45 \qquad (8h)$$
$$27 \le x_i \le 45, \quad i = 3, 4, 5 \qquad (8i)$$
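As with the previous examples, Eq. (8) can be checked numerically. The sketch below is our own verification, assuming the standard form of this benchmark: it evaluates the objective and the first constraint at the reported optimum, where $g_1$ is active.

```python
def f(x1, x2, x3, x4, x5):
    # Objective of Eq. (8).
    return (5.3578547 * x3 ** 2 + 0.8356891 * x1 * x5
            + 37.293239 * x1 - 40792.141)

def g1(x1, x2, x3, x4, x5):
    # Constraint (8a); must be <= 0, and is active at the optimum.
    return (85.334407 + 0.0056858 * x2 * x5
            + 0.0006262 * x1 * x4 - 0.0022053 * x3 * x5 - 92)

xstar = (78.0, 33.0, 29.995256025682, 45.0, 36.775812905788)
```

Note that three of the five variables sit on their bounds at the optimum, so the box constraints (8g)-(8i) do much of the work here.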

The optimal solution of the problem is at $\mathbf{x}^* = (78, 33, 29.995256025682, 45, 36.775812905788)$ with a corresponding function value of $f(\mathbf{x}^*) = -30665.539$. This function was previously solved by using a homomorphous mapping (HM) (Koziel & Michalewicz, 1999), Stochastic Ranking (SR) (Runarsson & Yao, 2000), evolutionary programming (EP) (Coello & Becerra, 2004), hybrid particle swarm optimization (HPSO) (He & Wang, 2007), and NM-PSO (Zahara & Kao, 2009). Among those studies, the best solution was ob-

Table 1
Comparison of the identified results for Example 1.

Method                          Best obj.     Mean obj.     Worst obj.    Std. dev.   Function evaluations
EA (Runarsson & Yao, 2005)      -15.000000    -15.000000    -15.000000    0           122,000
CDE (Becerra & Coello, 2006)    -15.000000    -14.999996    -14.999993    0.000002    100,100
FSA (Hedar & Fukushima, 2006)   -14.999105    -14.993316    -14.979977    0.004813    205,748
GA (Chootinan & Chen, 2006)     -15.000000    -15.000000    -15.000000    0           95,512
NM-PSO (Zahara & Hu, 2008)      -15.000000    -15.000000    -15.000000    0           41,959
PSOLVER                         -15.000000    -15.000000    -15.000000    0           679

Table 2
Comparison of the identified results for Example 2.

Method                          Best obj.      Mean obj.      Worst obj.     Std. dev.   Function evaluations
EA (Runarsson & Yao, 2005)      -6961.8139     -6961.8139     -6961.8139     0           56,000
CDE (Becerra & Coello, 2006)    -6961.8139     -6961.8139     -6961.8139     0           100,100
FSA (Hedar & Fukushima, 2006)   -6961.8139     -6961.8139     -6961.8139     0           44,538
GA (Chootinan & Chen, 2006)     -6961.8139     -6961.8139     -6961.8139     0           13,577
NM-PSO (Zahara & Hu, 2008)      -6961.8240     -6961.8240     -6961.8240     0           9856
PSOLVER                         -6961.8244     -6961.8244     -6961.8244     0           179

Table 3
Comparison of the identified results for Example 3.

Method                          Best obj.    Mean obj.    Worst obj.   Std. dev.   Function evaluations
EA (Runarsson & Yao, 2005)      0.053942     0.111671     0.438804     1.40E-01    109,200
CDE (Becerra & Coello, 2006)    0.056180     0.288324     0.392100     1.67E-01    100,100
FSA (Hedar & Fukushima, 2006)   0.053950     0.297720     0.438851     1.89E-01    120,268
NM-PSO (Zahara & Hu, 2008)      0.053949     0.054854     0.058301     1.26E-03    265,548
PSOLVER                         0.053949     0.053950     0.053950     1.14E-07    779

6 A.H. Kayhan et al. / Expert Systems with Applications xxx (2010) xxxxxx

ARTICLE IN PRESS

Please cite this article in press as: Kayhan, A. H., et al. PSOLVER: A new hybrid particle swarm optimization algorithm for solving continuous optimization

problems. Expert Systems with Applications (2010), doi:10.1016/j.eswa.2010.03.046

tained by He and Wang (2007) using HPSO algorithm with an objec-

tive function value of f x

We obtained the best solution using PSOLVER algorithm at

x

corresponding objective value of f x

pares the identied results of different solution algorithms. It can be

seen in Table 4 that PSOLVER gives the same result with SR, HPSO,

NMPSO and better than HM and EP. It should be note that the hy-

brid PSOLVER requires only 328 function evaluations which is much

less in comparison with the other methods.

4.6. Example 5

The fifth example has two decision variables and two inequality constraints, as given in Eq. (9):

Max f(x) = [sin^3(2*pi*x1) * sin(2*pi*x2)] / [x1^3 * (x1 + x2)]   (9)
s.t. g1(x) = x1^2 - x2 + 1 <= 0   (9a)
g2(x) = 1 - x1 + (x2 - 4)^2 <= 0   (9b)
0 <= x1 <= 10   (9c)
0 <= x2 <= 10   (9d)

This function has the global optimum at x* = (1.2279713, 4.2453733) with a corresponding function value of f(x*) = 0.095825. This function was previously solved by using HM (Koziel & Michalewicz, 1999), SR (Runarsson & Yao, 2000), EP (Coello & Becerra, 2004), HPSO (He & Wang, 2007), and NM-PSO (Zahara & Kao, 2009). We applied the PSOLVER algorithm to the solution of this problem and obtained the optimum solution at x = (1.2279713, 4.2453733) with a corresponding objective function value of f(x) = 0.095825 after only 308 function evaluations. The comparison of the identified results for the different solution algorithms is given in Table 5. It can be clearly seen from Table 5 that the PSOLVER algorithm finds the optimal solution with the lowest number of function evaluations among all of the other algorithms.
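As a quick numerical check of Eq. (9), the sketch below re-implements the objective and both constraints and evaluates them at the reported optimum. It is an illustrative re-creation of the benchmark function, not the authors' spreadsheet model; the function names are ours.

```python
import math

def g08_objective(x1, x2):
    # Eq. (9): the maximized objective of Example 5
    return (math.sin(2 * math.pi * x1) ** 3 * math.sin(2 * math.pi * x2)) / (
        x1 ** 3 * (x1 + x2))

def g08_constraints(x1, x2):
    # Eqs. (9a)-(9b): the point is feasible iff both values are <= 0
    g1 = x1 ** 2 - x2 + 1
    g2 = 1 - x1 + (x2 - 4) ** 2
    return g1, g2

x = (1.2279713, 4.2453733)      # reported global optimum
f = g08_objective(*x)           # approx. 0.095825
g1, g2 = g08_constraints(*x)    # both strictly negative (interior point)
```

Both constraints are inactive at the optimum, which is why purely local refinement from a good swarm-supplied start works well on this problem.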

4.7. Example 6

This maximization problem has three decision variables and one inequality constraint, as given in Eq. (10):

Max f(x) = [100 - (x1 - 5)^2 - (x2 - 5)^2 - (x3 - 5)^2] / 100   (10)
s.t. g(x) = (x1 - p)^2 + (x2 - q)^2 + (x3 - r)^2 - 0.0625 <= 0   (10a)
0 <= xi <= 10 (i = 1, 2, 3) and p, q, r = 1, 2, ..., 9   (10b)

For this example, the feasible region of the search space consists of 9^3 disjoint spheres. A point (x1, x2, x3) is feasible if and only if there exist p, q, r such that the above inequality holds (Zahara & Kao, 2009). For this problem, the optimum solution is x* = (5, 5, 5) with f(x*) = 1. This function was previously solved by using HM (Koziel & Michalewicz, 1999), SR (Runarsson & Yao, 2000), EP (Coello & Becerra, 2004), HPSO (He & Wang, 2007), and NM-PSO (Zahara & Kao, 2009). Table 6 shows the identified results of PSOLVER and the previous studies given above.

As can be seen from Table 6, the PSOLVER algorithm also finds the optimum at x = (5, 5, 5) with f(x) = 1, and it requires only 584 function evaluations for obtaining the optimal solution.
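The disjoint-sphere feasibility test of Eq. (10a) can be sketched directly: a candidate point is feasible if it lies inside at least one of the 9^3 spheres of radius 0.25 centred at the integer lattice points (p, q, r). The code below is an illustrative check, not part of the original implementation.

```python
from itertools import product

def g12_objective(x1, x2, x3):
    # Eq. (10): the maximized objective of Example 6
    return (100 - (x1 - 5) ** 2 - (x2 - 5) ** 2 - (x3 - 5) ** 2) / 100

def g12_feasible(x1, x2, x3):
    # Eq. (10a): feasible iff the point lies in at least one of the
    # 9^3 disjoint spheres centred at (p, q, r), p, q, r = 1..9
    return any((x1 - p) ** 2 + (x2 - q) ** 2 + (x3 - r) ** 2 <= 0.0625
               for p, q, r in product(range(1, 10), repeat=3))

f = g12_objective(5, 5, 5)    # 1.0 at the optimum
ok = g12_feasible(5, 5, 5)    # True: centre of the sphere with p = q = r = 5
```

A point such as (0.2, 0.2, 0.2) lies outside every sphere and is rejected, which illustrates why a penalty-free local search must be started inside a feasible sphere for this problem.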

Table 4
Comparison of the identified results for Example 4.

Methods | Best objective function value | Mean objective function value | Worst objective function value | Standard deviation | Number of function evaluations
HM (Koziel & Michalewicz, 1999) | -30,664.500 | -30,665.300 | -30,645.900 | N/A | 1,400,000
SR (Runarsson & Yao, 2000) | -30,665.539 | -30,665.539 | -30,665.539 | 0.0000200 | 350,000
EP (Coello & Becerra, 2004) | -30,665.500 | -30,662.500 | -30,662.200 | 9.3000000 | 50,020
HPSO (He & Wang, 2007) | -30,665.539 | -30,665.539 | -30,665.539 | 0.0000017 | 81,000
NM-PSO (Zahara & Kao, 2009) | -30,665.539 | -30,665.539 | -30,665.539 | 0.0000140 | 19,568
PSOLVER | -30,665.539 | -30,665.539 | -30,665.539 | 0.0000024 | 328

Table 5
Comparison of the identified results for Example 5.

Methods | Best objective function value | Mean objective function value | Worst objective function value | Standard deviation | Number of function evaluations
HM (Koziel & Michalewicz, 1999) | 0.095825 | 0.089157 | 0.029144 | N/A | 1,400,000
SR (Runarsson & Yao, 2000) | 0.095825 | 0.095825 | 0.095825 | 2.6E-17 | 350,000
EP (Coello & Becerra, 2004) | 0.095825 | 0.095825 | 0.095825 | 0 | 50,020
HPSO (He & Wang, 2007) | 0.095825 | 0.095825 | 0.095825 | 1.2E-10 | 81,000
NM-PSO (Zahara & Kao, 2009) | 0.095825 | 0.095825 | 0.095825 | 3.5E-08 | 2103
PSOLVER | 0.095825 | 0.095825 | 0.095825 | 2.7E-12 | 308

Table 6
Comparison of the identified results for Example 6.

Methods | Best objective function value | Mean objective function value | Worst objective function value | Standard deviation | Number of function evaluations
HM (Koziel & Michalewicz, 1999) | 0.999999 | 0.999135 | 0.991950 | N/A | 1,400,000
SR (Runarsson & Yao, 2000) | 1.000000 | 1.000000 | 1.000000 | 0 | 350,000
EP (Coello & Becerra, 2004) | 1.000000 | 0.996375 | 0.996375 | 9.7E-03 | 50,020
HPSO (He & Wang, 2007) | 1.000000 | 1.000000 | 1.000000 | 1.6E-15 | 81,000
NM-PSO (Zahara & Kao, 2009) | 1.000000 | 1.000000 | 1.000000 | 0 | 923
PSOLVER | 1.000000 | 1.000000 | 1.000000 | 2.6E-14 | 584


4.8. Example 7: The tension/compression spring design problem

The tension/compression spring design problem is described in Arora (1989); the aim is to minimize the weight f(x) of a tension/compression spring (as shown in Fig. 5) subject to constraints on minimum deflection, shear stress, surge frequency, and limits on the outside diameter and on the design variables. The design variables are the wire diameter d (x1), the mean coil diameter D (x2) and the number of active coils P (x3).

The mathematical formulation of this problem can be described as follows:

Min f(x) = (x3 + 2) * x2 * x1^2   (11)
s.t. g1(x) = 1 - (x2^3 * x3) / (71,785 * x1^4) <= 0   (11a)
g2(x) = (4*x2^2 - x1*x2) / [12,566 * (x2 * x1^3 - x1^4)] + 1 / (5108 * x1^2) - 1 <= 0   (11b)
g3(x) = 1 - (140.45 * x1) / (x2^2 * x3) <= 0   (11c)
g4(x) = (x2 + x1) / 1.5 - 1 <= 0   (11d)
0.05 <= x1 <= 2.00   (11e)
0.25 <= x2 <= 1.30   (11f)
2.00 <= x3 <= 15.00   (11g)

This problem has been used as a benchmark for testing different optimization methods, such as the GA-based co-evolution model (GA1) (Coello, 2000), the GA with dominance-based tournament selection (GA2) (Coello & Montes, 2002), EP (Coello & Becerra, 2004), the co-evolutionary particle swarm optimization approach (CPSO) (He & Wang, 2006), HPSO (He & Wang, 2007), and NM-PSO (Zahara & Kao, 2009). After applying the PSOLVER algorithm to this problem, the best solution is obtained at x = (0.051686, 0.356650, 11.292950) with a corresponding value of f(x) = 0.0126652. The best solutions obtained by the above-mentioned methods and by the PSOLVER algorithm are given in Tables 7 and 8.

It can be seen from Table 8 that the standard deviation of the PSOLVER solutions is the smallest. In addition, the PSOLVER requires only 253 function evaluations for solving this problem, while GA2, HPSO, CPSO, and NM-PSO require 80,000, 81,000, 200,000, and 80,000 function evaluations, respectively. Therefore, the PSOLVER is an efficient approach for locating the global optimum of this problem.
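Eq. (11) can be verified at the tabulated PSOLVER design with a few lines of code. This is an illustrative evaluation only; note that g1 and g2 are active at the optimum, so with the four-decimal rounded inputs from Table 7 they vanish only to within rounding, which is why the feasibility check below uses a small tolerance.

```python
def spring_weight(x1, x2, x3):
    # Eq. (11): spring weight for wire diameter x1, coil diameter x2, coils x3
    return (x3 + 2) * x2 * x1 ** 2

def spring_constraints(x1, x2, x3):
    # Eqs. (11a)-(11d); each value must be <= 0 for a feasible design
    g1 = 1 - (x2 ** 3 * x3) / (71785 * x1 ** 4)
    g2 = ((4 * x2 ** 2 - x1 * x2) / (12566 * (x2 * x1 ** 3 - x1 ** 4))
          + 1 / (5108 * x1 ** 2) - 1)
    g3 = 1 - (140.45 * x1) / (x2 ** 2 * x3)
    g4 = (x2 + x1) / 1.5 - 1
    return g1, g2, g3, g4

x = (0.051686, 0.356650, 11.292950)   # PSOLVER's best design (Table 7)
f = spring_weight(*x)                  # approx. 0.0126652
gs = spring_constraints(*x)            # g1, g2 active (about 0 within rounding)
```

Checks of this kind are a cheap way to confirm that a reported "best" design actually reproduces the reported objective value and satisfies the constraints.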

4.9. Example 8: The welded beam design problem

This design problem, which has often been used as a benchmark problem, was first proposed by Coello (2000). In this problem, a welded beam is designed for minimum cost subject to constraints on the shear stress tau(x), the bending stress sigma(x) in the beam, the buckling load on the bar Pc(x), the end deflection of the beam delta(x), and side constraints. There are four design variables, as shown in Fig. 6: h (x1), l (x2), t (x3) and b (x4).

The mathematical formulation of the problem is as follows:

The mathematical formulation of the problem is as follows:

Min f x 1:10471x

2

1

x

2

0:04811x

3

x

4

14 x

2

12

s:t: g

1

x sx s

max

6 0 12a

g

2

x rx r

max

6 0 12b

g

3

x x

1

x

4

6 0 12c

g

4

x 0:10471x

2

1

0:04811x

3

x

4

14 x

2

5 6 0 12d

g

5

x 0:125 x

1

6 0 12e

g

6

x dx d

max

6 0 12f

g

7

x P P

c

x 6 0 12g

0:1 6 x

1

; x

4

6 2:0 12h

0:1 6 x

2

; x

3

6 10:0 12i

Fig. 5. The tension/compression spring design problem.

Table 7
Comparison of the best solutions for the tension/compression spring design problem.

Methods | x1 (d) | x2 (D) | x3 (P) | f(x)
GA1 (Coello, 2000) | 0.051480 | 0.351661 | 11.632201 | 0.0127048
GA2 (Coello & Montes, 2002) | 0.051989 | 0.363965 | 10.890522 | 0.0126810
EP (Coello & Becerra, 2004) | 0.050000 | 0.317395 | 14.031795 | 0.0127210
CPSO (He & Wang, 2006) | 0.051728 | 0.357644 | 11.244543 | 0.0126747
HPSO (He & Wang, 2007) | 0.051706 | 0.357126 | 11.265083 | 0.0126652
NM-PSO (Zahara & Kao, 2009) | 0.051620 | 0.355498 | 11.333272 | 0.0126302
PSOLVER | 0.051686 | 0.356650 | 11.292950 | 0.0126652

Table 8
Statistical results for the tension/compression spring design problem.

Methods | Best objective function value | Mean objective function value | Worst objective function value | Standard deviation | Number of function evaluations
GA1 (Coello, 2000) | 0.0127048 | 0.0127690 | 0.0128220 | 3.94E-05 | N/A
GA2 (Coello & Montes, 2002) | 0.0126810 | 0.0127420 | 0.0129730 | 5.90E-05 | 80,000
EP (Coello & Becerra, 2004) | 0.0127210 | 0.0135681 | 0.0151160 | 8.42E-04 | N/A
CPSO (He & Wang, 2006) | 0.0126747 | 0.0127300 | 0.0129240 | 5.20E-04 | 200,000
HPSO (He & Wang, 2007) | 0.0126652 | 0.0127072 | 0.0127190 | 1.58E-05 | 81,000
NM-PSO (Zahara & Kao, 2009) | 0.0126302 | 0.0126314 | 0.0126330 | 8.74E-07 | 80,000
PSOLVER | 0.0126652 | 0.0126652 | 0.0126652 | 2.46E-09 | 253


where

tau(x) = sqrt(tau'^2 + 2 * tau' * tau'' * x2 / (2R) + tau''^2)   (12j)
tau' = P / (sqrt(2) * x1 * x2),  tau'' = M * R / J,  M = P * (L + x2 / 2)   (12k)
R = sqrt(x2^2 / 4 + ((x1 + x3) / 2)^2)   (12l)
J = 2 * {sqrt(2) * x1 * x2 * [x2^2 / 12 + ((x1 + x3) / 2)^2]}   (12m)
sigma(x) = 6 * P * L / (x4 * x3^2),  delta(x) = 4 * P * L^3 / (E * x3^3 * x4)   (12n)
Pc(x) = [4.013 * E * sqrt(x3^2 * x4^6 / 36) / L^2] * [1 - (x3 / (2L)) * sqrt(E / (4G))]   (12o)
P = 6000 lb,  L = 14 in.,  E = 30 x 10^6 psi,  G = 12 x 10^6 psi   (12p)
tau_max = 13,600 psi,  sigma_max = 30,000 psi,  delta_max = 0.25 in.   (12q)

The methods previously applied to this problem include GA1 (Coello, 2000), GA2 (Coello & Montes, 2002), EP (Coello & Becerra, 2004), CPSO (He & Wang, 2006), HPSO (He & Wang, 2007), and NM-PSO (Zahara & Kao, 2009). Among those studies, the best solution was obtained by using NM-PSO (Zahara & Hu, 2008) with an objective function value of f(x) = 1.724717 after 80,000 function evaluations. We applied the PSOLVER algorithm to this problem and obtained the same best solution of f(x) = 1.724717. The comparisons of the identified results are given in Tables 9 and 10, respectively.

From Table 9, it can be seen that the best solution found by the PSOLVER is the same as that of NM-PSO and better than those obtained by the other methods. The standard deviation of the PSOLVER results is the smallest. Note that the average number of function evaluations of the PSOLVER is only 297. Therefore, it can be said that the PSOLVER is the most efficient among the compared methods.
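The cost of Eq. (12) and the response quantities of Eqs. (12j)-(12o) can be evaluated at the tabulated PSOLVER design. This is an illustrative re-implementation, not the authors' spreadsheet; since the shear-stress and bending-stress constraints are active at the optimum, with the six-decimal rounded inputs of Table 9 they sit at their limits only to within rounding.

```python
import math

# constants of Eqs. (12p)-(12q)
P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def beam_cost(x1, x2, x3, x4):
    # Eq. (12): fabrication cost of the welded beam
    return 1.10471 * x1 ** 2 * x2 + 0.04811 * x3 * x4 * (14 + x2)

def beam_responses(x1, x2, x3, x4):
    # Eqs. (12j)-(12o): shear stress, bending stress, deflection, buckling load
    tau_p = P / (math.sqrt(2) * x1 * x2)
    M = P * (L + x2 / 2)
    R = math.sqrt(x2 ** 2 / 4 + ((x1 + x3) / 2) ** 2)
    J = 2 * (math.sqrt(2) * x1 * x2 * (x2 ** 2 / 12 + ((x1 + x3) / 2) ** 2))
    tau_pp = M * R / J
    tau = math.sqrt(tau_p ** 2 + 2 * tau_p * tau_pp * x2 / (2 * R) + tau_pp ** 2)
    sigma = 6 * P * L / (x4 * x3 ** 2)
    delta = 4 * P * L ** 3 / (E * x3 ** 3 * x4)
    pc = (4.013 * E * math.sqrt(x3 ** 2 * x4 ** 6 / 36) / L ** 2) * (
        1 - x3 / (2 * L) * math.sqrt(E / (4 * G)))
    return tau, sigma, delta, pc

x = (0.205830, 3.468338, 9.036624, 0.205730)   # PSOLVER's best design (Table 9)
f = beam_cost(*x)                               # approx. 1.724717
tau, sigma, delta, pc = beam_responses(*x)      # tau, sigma, Pc near their limits
```

The deflection constraint is comfortably satisfied (delta is far below 0.25 in.), while tau, sigma and Pc are all near their bounds, which is typical of tightly optimized designs for this benchmark.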

4.10. Example 9: The pressure vessel design problem

In the pressure vessel design problem, proposed by Kannan and Kramer (1994), the aim is to minimize the total cost, including the costs of material, forming and welding. A cylindrical vessel is capped at both ends by hemispherical heads, as shown in Fig. 7. There are four design variables in this problem: Ts (x1, thickness of the shell), Th (x2, thickness of the head), R (x3, inner radius) and L (x4, length of the cylindrical section of the vessel). Among the four design variables, Ts and Th are expected to be integer multiples of 0.0625 in., while R and L are continuous variables.

The problem can be formulated as follows (Kannan & Kramer, 1994):

Min f(x) = 0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3^2 + 3.1661 * x1^2 * x4 + 19.84 * x1^2 * x3   (13)
s.t. g1(x) = -x1 + 0.0193 * x3 <= 0   (13a)
g2(x) = -x2 + 0.00954 * x3 <= 0   (13b)

Fig. 6. The welded beam design problem.

Table 9
Comparison of the best solutions for the welded beam design problem.

Methods | x1 (h) | x2 (l) | x3 (t) | x4 (b) | f(x)
GA1 (Coello, 2000) | 0.208800 | 3.420500 | 8.997500 | 0.210000 | 1.748309
GA2 (Coello & Montes, 2002) | 0.205986 | 3.471328 | 9.020224 | 0.206480 | 1.728226
EP (Coello & Becerra, 2004) | 0.205700 | 3.470500 | 9.036600 | 0.205700 | 1.724852
CPSO (He & Wang, 2006) | 0.202369 | 3.544214 | 9.048210 | 0.205723 | 1.728024
HPSO (He & Wang, 2007) | 0.205730 | 3.470489 | 9.033624 | 0.205730 | 1.724852
NM-PSO (Zahara & Kao, 2009) | 0.205830 | 3.468338 | 9.033624 | 0.205730 | 1.724717
PSOLVER | 0.205830 | 3.468338 | 9.036624 | 0.205730 | 1.724717

Table 10
Statistical results for the welded beam design problem.

Methods | Best objective function value | Mean objective function value | Worst objective function value | Standard deviation | Number of function evaluations
GA1 (Coello, 2000) | 1.748309 | 1.771973 | 1.785835 | 1.12E-02 | N/A
GA2 (Coello & Montes, 2002) | 1.728226 | 1.792654 | 1.993408 | 7.47E-02 | 80,000
EP (Coello & Becerra, 2004) | 1.724852 | 1.971809 | 3.179709 | 4.43E-01 | N/A
CPSO (He & Wang, 2006) | 1.728024 | 1.748831 | 1.782143 | 1.29E-02 | 200,000
HPSO (He & Wang, 2007) | 1.724852 | 1.749040 | 1.814295 | 4.01E-02 | 81,000
NM-PSO (Zahara & Kao, 2009) | 1.724717 | 1.726373 | 1.733393 | 3.50E-03 | 80,000
PSOLVER | 1.724717 | 1.724717 | 1.724717 | 1.62E-11 | 297

Fig. 7. Pressure vessel design problem.


g3(x) = -pi * x3^2 * x4 - (4/3) * pi * x3^3 + 1,296,000 <= 0   (13c)
g4(x) = x4 - 240 <= 0   (13d)
0 <= x1, x2 <= 100   (13e)
10 <= x3, x4 <= 200   (13f)

This problem has been solved before by using the previously mentioned GA1 (Coello, 2000), GA2 (Coello & Montes, 2002), CPSO (He & Wang, 2006), HPSO (He & Wang, 2007), and NM-PSO (Zahara & Kao, 2009). Their best solutions are compared against those produced by the PSOLVER in Tables 11 and 12, respectively.

In this problem, the decision variables x1 and x2 are expected to be integer multiples of 0.0625 in. The previous best solutions obtained by the other methods, except NM-PSO, satisfy this requirement. As can be seen from Table 11, the values of x1 and x2 given for NM-PSO are not integer multiples of 0.0625 in. Therefore, considering that the NM-PSO solution is not feasible for this problem, the HPSO and the PSOLVER methods give the best results. It should be noted that the PSOLVER requires only about 310 function evaluations for obtaining a feasible solution, while GA2, HPSO, CPSO, and NM-PSO require 80,000, 81,000, 200,000, and 80,000 fitness function evaluations, respectively. In addition, the standard deviation of the PSOLVER results is the smallest. Considering these statistical and comparative results, it can be concluded that the PSOLVER is more efficient than the other methods for the pressure vessel design problem.
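Eq. (13) can likewise be checked at the best design shared by HPSO and PSOLVER. The code is an illustrative evaluation only; g1 and g3 are active at this optimum, so with the four-decimal inputs of Table 11 they vanish only to within rounding, and the volume constraint g3 in particular carries a residual of a few units out of 1,296,000.

```python
import math

def vessel_cost(x1, x2, x3, x4):
    # Eq. (13): material, forming and welding cost of the pressure vessel
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4 + 19.84 * x1 ** 2 * x3)

def vessel_constraints(x1, x2, x3, x4):
    # Eqs. (13a)-(13d); each value must be <= 0 for a feasible design
    g1 = -x1 + 0.0193 * x3
    g2 = -x2 + 0.00954 * x3
    g3 = -math.pi * x3 ** 2 * x4 - (4 / 3) * math.pi * x3 ** 3 + 1296000
    g4 = x4 - 240
    return g1, g2, g3, g4

x = (0.8125, 0.4375, 42.0984, 176.6366)   # best design of HPSO/PSOLVER (Table 11)
f = vessel_cost(*x)                        # approx. 6059.7143
gs = vessel_constraints(*x)                # g1, g3 active (about 0 within rounding)
```

Note that x1 = 0.8125 = 13 * 0.0625 and x2 = 0.4375 = 7 * 0.0625, so this design also satisfies the integer-multiple requirement on the shell and head thicknesses.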

5. Conclusion

In this study, a new hybrid global-local optimization algorithm named PSOLVER is proposed, which combines PSO with a spreadsheet Solver for solving continuous optimization problems. In the proposed PSOLVER algorithm, PSO is used as the global optimizer and Solver as the local optimizer. During the optimization process, PSO and Solver work mutually by feeding each other in terms of initial and sub-initial solution points. For this purpose, a VBA code has been developed on the background of an Excel spreadsheet to integrate the PSO and Solver processes. The main advantages of the PSOLVER over the standard PSO algorithm are demonstrated in a comparative study, and six constrained and three engineering design problems are then solved by using the PSOLVER algorithm. The results showed that the proposed algorithm provides better solutions than the other heuristic and non-heuristic optimization techniques in terms of both objective function values and number of function evaluations. The most important contribution of the proposed hybrid PSOLVER algorithm is that it requires far fewer iterations than the other solution approaches. It should be noted that spreadsheet applications may require long run times, since the data processing is executed on the PC screen; this deficiency can be overcome by deactivating the screen-updating property during the optimization process. Finally, the PSOLVER algorithm may be usefully applied to real-world optimization problems that require significant computational effort.
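The hybrid global-local strategy summarized above can be outlined in a short sketch. This is a generic, illustrative reconstruction rather than the authors' VBA implementation: a minimal textbook PSO plays the global role, and a simple coordinate pattern search stands in for the Solver's GRG2 local optimizer; all function and parameter names are ours.

```python
import random

def sphere(x):
    # smooth test function with its minimum of 0 at the origin
    return sum(xi * xi for xi in x)

def pso(f, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    # global phase: a minimal inertia-weight PSO within box bounds
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    Pb = [x[:] for x in X]
    Pf = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: Pf[i])
    gbest, gf = Pb[g][:], Pf[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (Pb[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], bounds[d][0]), bounds[d][1])
            fx = f(X[i])
            if fx < Pf[i]:
                Pb[i], Pf[i] = X[i][:], fx
                if fx < gf:
                    gbest, gf = X[i][:], fx
    return gbest, gf

def local_refine(f, x0, step=0.1, tol=1e-12):
    # local phase: coordinate pattern search standing in for Solver's GRG2
    x, fx = x0[:], f(x0)
    while step > tol:
        improved = False
        for d in range(len(x)):
            for s in (step, -step):
                y = x[:]
                y[d] += s
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5   # no improving move: shrink the pattern
    return x, fx

bounds = [(-5.0, 5.0)] * 2
x0, f0 = pso(sphere, bounds)          # swarm supplies a coarse solution
xb, fb = local_refine(sphere, x0)     # local stage polishes it to high precision
```

On this convex test function the local stage alone would suffice; the point of the hybrid, as in PSOLVER, is that on multimodal constrained problems the global stage supplies good starting points while the local stage cheaply drives them to a nearby optimum.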

References

Abido, M. A. (2002). Optimal power flow using particle swarm optimization. International Journal of Electrical Power and Energy Systems, 24(7), 563-571.
Arora, J. S. (1989). Introduction to optimum design. New York: McGraw-Hill.
Ayvaz, M. T., Kayhan, A. H., Ceylan, H., & Gurarslan, G. (2009). Hybridizing harmony search algorithm with a spreadsheet solver for solving continuous engineering optimization problems. Engineering Optimization, 41(12), 1119-1144.
Baumgartner, U., Magele, Ch., & Renhart, W. (2004). Pareto optimality and particle swarm optimization. IEEE Transactions on Magnetics, 40(2), 1172-1175.
Becerra, R. L., & Coello, C. A. C. (2006). Cultured differential evolution for constrained optimization. Computer Methods in Applied Mechanics and Engineering, 195(33-36), 4303-4322.
Brandstätter, B., & Baumgartner, U. (2002). Particle swarm optimization-mass spring system analogon. IEEE Transactions on Magnetics, 38(2), 997-1000.
Chootinan, P., & Chen, A. (2006). Constraint handling in genetic algorithms using a gradient-based repair method. Computers and Operations Research, 33(8), 2263-2281.
Ciuprina, G., Loan, D., & Munteanu, I. (2002). Use of intelligent-particle swarm optimization in electromagnetics. IEEE Transactions on Magnetics, 38(2), 1037-1040.
Cockshott, A. R., & Hartman, B. E. (2001). Improving the fermentation medium for Echinocandin B production part II: Particle swarm optimization. Process Biochemistry, 36, 661-669.
Coello, C. A. C. (2000). Use of a self-adaptive penalty approach for engineering optimization problems. Computers in Industry, 41, 113-127.
Coello, C. A. C., & Becerra, R. L. (2004). Efficient evolutionary optimization through the use of a cultural algorithm. Engineering Optimization, 36(2), 219-236.
Coello, C. A. C., & Montes, E. M. (2002). Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Advanced Engineering Informatics, 16, 193-203.
Dorigo, M., & Di Caro, G. (1999). Ant colony optimisation: A new meta-heuristic. In Proceedings of the congress on evolutionary computation (Vol. 2, pp. 1470-1477).
Eberhart, R. C., & Hu, X. (1999). Human tremor analysis using particle swarm optimization. In Proceedings of the congress on evolutionary computation, Washington, DC, USA (pp. 1927-1930).
Eberhart, R. C., & Kennedy, J. (1995). A new optimizer using particle swarm theory. In Proc. of the sixth int. symp. micro machine and human science, Nagoya, Japan (pp. 39-43).

Table 11
Comparison of the best solutions for the pressure vessel design problem.

Methods | x1 (Ts) | x2 (Th) | x3 (R) | x4 (L) | f(x)
GA1 (Coello, 2000) | 0.8125 | 0.4375 | 40.3239 | 200.0000 | 6288.7445
GA2 (Coello & Montes, 2002) | 0.8125 | 0.4375 | 42.0974 | 176.6540 | 6059.9463
CPSO (He & Wang, 2006) | 0.8125 | 0.4375 | 42.0913 | 176.7465 | 6061.0777
HPSO (He & Wang, 2007) | 0.8125 | 0.4375 | 42.0984 | 176.6366 | 6059.7143
NM-PSO (Zahara & Kao, 2009) | 0.8036 | 0.3972 | 41.6392 | 182.4120 | 5930.3137
PSOLVER | 0.8125 | 0.4375 | 42.0984 | 176.6366 | 6059.7143

Table 12
Statistical results for the pressure vessel design problem.

Methods | Best objective function value | Mean objective function value | Worst objective function value | Standard deviation | Number of function evaluations
GA1 (Coello, 2000) | 6288.7445 | 6293.8432 | 6308.1497 | 7.413E+00 | N/A
GA2 (Coello & Montes, 2002) | 6059.9463 | 6177.2533 | 6469.3220 | 1.309E+02 | 80,000
CPSO (He & Wang, 2006) | 6061.0777 | 6147.1332 | 6363.8041 | 8.645E+01 | 200,000
HPSO (He & Wang, 2007) | 6059.7143 | 6099.9323 | 6288.6770 | 8.620E+01 | 81,000
NM-PSO (Zahara & Kao, 2009) | 5930.3137 | 5946.7901 | 5960.0557 | 9.161E+00 | 80,000
PSOLVER | 6059.7143 | 6059.7143 | 6059.7143 | 4.625E-12 | 310


Eberhart, R. C., & Shi, Y. (2001). Tracking and optimizing dynamic systems with particle swarms. In Proceedings of congress on evolutionary computation, Seoul, Korea (pp. 27-30).
Elegbede, C. (2005). Structural reliability assessment based on particle swarm optimization. Structural Safety, 27(2), 171-186.
Fan, S. S., Liang, Y. C., & Zahara, E. (2004). Hybrid simplex search and particle swarm optimization for the global optimization of multimodal functions. Engineering Optimization, 36(4), 401-418.
Fan, S. S., & Zahara, E. (2007). A hybrid simplex search and particle swarm optimization for unconstrained optimization. European Journal of Operational Research, 181, 527-548.
Ferreira, E. N. C., & Salcedo, R. (2001). Can spreadsheet solvers solve demanding optimization problems? Computer Applications in Engineering Education, 9(1), 49-56.
Fesanghary, M., Mahdavi, M., Minary-Jolandan, M., & Alizadeh, Y. (2008). Hybridizing harmony search algorithm with sequential quadratic programming for engineering optimization problems. Computer Methods in Applied Mechanics and Engineering, 197, 3080-3091.
Fourie, P. C., & Groenwold, A. A. (2002). The particle swarm optimization algorithm in size and shape optimization. Structural and Multidisciplinary Optimization, 23(4), 259-267.
Frontline System Inc. (1999). A tutorial on spreadsheet optimization.
Geem, Z. W., Kim, J. H., & Loganathan, G. V. (2001). A new heuristic optimization algorithm: Harmony search. Simulation, 76(2), 60-68.
Ghaffari-Miab, M., Farmahini-Farahani, A., Faraji-Dana, R., & Lucas, C. (2007). An efficient hybrid swarm intelligence-gradient optimization method for complex time Green's functions of multilayer media. Progress in Electromagnetics Research, 77, 181-192.
Glover, F. (1977). Heuristics for integer programming using surrogate constraints. Decision Sciences, 8(1), 156-166.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization, and machine learning. Addison-Wesley.
Hedar, A. D., & Fukushima, M. (2006). Derivative-free filter simulated annealing method for constrained continuous global optimization. Journal of Global Optimization, 35(4), 521-549.
He, Q., & Wang, L. (2006). An effective co-evolutionary particle swarm optimization for engineering optimization problems. Engineering Applications of Artificial Intelligence, 20, 89-99.
He, Q., & Wang, L. (2007). A hybrid particle swarm optimization with a feasibility-based rule for constrained optimization. Applied Mathematics and Computation, 186, 1407-1422.
Holland, J. H. (1975). Adaptation in natural and artificial systems: An introductory analysis with applications to biology, control, and artificial intelligence. Ann Arbor: University of Michigan Press.
Houck, C. R., Joines, J. A., & Kay, M. G. (1996). Comparison of genetic algorithms, random start, and two-opt switching for solving large location-allocation problems. Computers and Operations Research, 23(6), 587-596.
Houck, C. R., Joines, J. A., & Wilson, J. R. (1997). Empirical investigation of the benefits of partial Lamarckianism. Evolutionary Computation, 5(1), 31-60.
Hu, X., & Eberhart, R. C. (2001). Tracking dynamic systems with PSO: Where's the cheese? In Proceedings of workshop on particle swarm optimization, Indianapolis, USA.
Kannan, B. K., & Kramer, S. N. (1994). An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. Journal of Mechanical Design, 116, 318-320.
Kazuhiro, I., Shinji, N., & Masataka, Y. (2006). Hybrid swarm optimization techniques incorporating design sensitivities. Transactions of the Japan Society of Mechanical Engineers, 72(719), 2264-2271.
Kennedy, J., & Eberhart, R. C. (1995). Particle swarm optimization. In Proceedings of the IEEE international conference on neural networks, Piscataway, USA (pp. 1942-1948).
Kennedy, J., & Eberhart, R. C. (1997). A discrete binary version of the particle swarm algorithm. In Proc. IEEE int. conf. systems, man, and cybernetics (Vol. 5, pp. 4104-4108).
Kirkpatrick, S., Gelatt, C., & Vecchi, M. (1983). Optimization by simulated annealing. Science, 220(4598), 671-680.
Koziel, S., & Michalewicz, Z. (1999). Evolutionary algorithms, homomorphous mappings, and constrained parameter optimization. Evolutionary Computation, 7, 19-44.
Lasdon, L. S., Waren, A. D., Jain, A., & Ratner, M. (1978). Design and testing of a generalized reduced gradient code for nonlinear programming. ACM Transactions on Mathematical Software, 4(1), 34-49.
Lee, K. S., & Geem, Z. W. (2005). A new meta-heuristic algorithm for continuous engineering optimization: Harmony search theory and practice. Computer Methods in Applied Mechanics and Engineering, 194, 3902-3933.
Michalewicz, Z. (1992). Genetic algorithms + data structures = evolution programs. New York: Springer-Verlag.
Microsoft. (1995). Microsoft Excel Visual Basic for applications. Washington: Microsoft Press.
OTC. (2009). Optimization Technology Center's Web site (online). <http://www-fp.mcs.anl.gov/OTC/Guide/SoftwareGuide/Blurbs/grg2.html> (accessed 31.03.2009).
Perez, R. E., & Behdinan, K. (2007). Particle swarm approach for structural design optimization. Computers and Structures, 85, 1579-1588.
Rosen, E. M. (1997). Visual Basic for applications, Add-Ins and Excel 7.0. CACHE News (Vol. 45, pp. 1-3).
Runarsson, T. P., & Yao, X. (2000). Stochastic ranking for constrained evolutionary optimization. IEEE Transactions on Evolutionary Computation, 4(3), 284-292.
Runarsson, T. P., & Yao, X. (2005). Search biases in constrained evolutionary optimization. IEEE Transactions on Systems, Man and Cybernetics, 35(2), 233-243.
Salerno, J. (1997). Using particle swarm optimization technique to train a recurrent neural model. In Proc. of the ninth IEEE int. conf. tools and artificial intelligence, USA (pp. 45-49).
Salman, A., Ahmad, I., & Al-Madani, S. (2002). Particle swarm optimization for task assignment problem. Microprocessors and Microsystems, 26, 363-371.
Shannon, M. W. (1998). Evolutionary algorithms with local search for combinatorial optimization. PhD thesis, University of California, San Diego.
Shi, Y., & Eberhart, R. C. (1998). Parameter selection in particle swarm optimization. Evolutionary Programming VII. Lecture Notes in Computer Science (Vol. 1447). Berlin: Springer.
Slade, W. H., Ressom, H. W., Musavi, M. T., & Miller, R. L. (2004). Inversion of ocean color observations using particle swarm optimization. IEEE Transactions on Geoscience and Remote Sensing, 42(9), 1915-1923.
Stokes, L., & Plummer, J. (2004). Using spreadsheet solvers in sample design. Computational Statistics and Data Analysis, 44(3), 527-546.
Tandon, V. (2000). Closing the gap between CAD/CAM and optimized CNC end milling. MSc thesis, Purdue School of Engineering and Technology, Indianapolis, USA.
Tandon, V., El-Mounayri, H., & Kishawy, H. (2002). NC end milling optimization using evolutionary computation. International Journal of Machine Tools and Manufacture, 42(5), 595-605.
Van Den Bergh, F., & Engelbrecht, A. P. (2000). Cooperative learning in neural networks using particle swarm optimizers. South African Computer Journal, 26, 84-90.
Venter, G., & Sobieszczanski-Sobieski, J. (2004). Multidisciplinary optimization of a transport aircraft wing using particle swarm optimization. Structural and Multidisciplinary Optimization, 26(1-2), 121-131.
Victoire, T. A., & Jeyakumar, A. E. (2004). Hybrid PSO-SQP for economic dispatch with valve-point effect. Electric Power Systems Research, 71(1), 51-59.
Wikipedia. (2009). The free encyclopedia web site (online). <http://www.en.wikipedia.org/wiki/Particle_swarm_optimization> (accessed 31.03.2009).
Yoshida, H., Fukuyama, Y., Takayama, S., & Nakanishi, Y. (1999). A particle swarm optimization for reactive power and voltage control in electric power systems considering voltage security assessment. In Proc. IEEE int. conf. systems, man, and cybernetics, Tokyo, Japan (pp. 497-502).
Zahara, E., & Hu, C. H. (2008). Solving constrained optimization problems with hybrid particle swarm optimization. Engineering Optimization, 40(11), 1031-1049.
Zahara, E., & Kao, Y. T. (2009). Hybrid Nelder-Mead simplex search and particle swarm optimization for constrained engineering design problems. Expert Systems with Applications, 36, 3880-3886.
