
Almir Becirspahic

s186795

almir.becirspahic@polito.studenti.it

Politecnico di Torino

1. Introduction

Home Automation Systems (HAS) consist of a number of appliances (washing machine, dishwasher, oven, refrigerator, boiler, etc.)

which are connected by a communication network of some kind. The various elements of the system share common

resources (electricity, water and gas) to accomplish their specific tasks, according to user requirements. Since resources are

limited, in the sense that their consumption cannot exceed fixed thresholds, competition between elements of the system

generates internal conflicts. Using the information dispatched over the communication network and centralized or

decentralized computational resources, control strategies have to be implemented within the system, so as to allocate limited

resources while maximizing performances. The definition of performance indices and that of the individual control strategies

and their tuning as well as the optimization procedure are the key issues in the HAS design.

Resources are allocated by implementing control strategies defined by means of suitable parameters. Performances should

be evaluated with respect to given scenarios, which specify, over a period of time, number, kind and scheduling of the tasks

the system is asked to perform and the users' preferences. Clearly, avoiding overload situations may increase the delay and

this gives rise to the optimization problem that the HAS has to solve. This optimization problem is not convex (see figure 1), so classical optimization methods cannot be used: they may get trapped in a local minimum or maximum. To solve this kind of problem, suitable meta-heuristic optimization techniques have to be used.

We have to find parameter values that are the best among those in some neighbourhood of the search space, although they may not be the absolute best. The key point is to define a meta-heuristic optimizer that provides a way to choose new parameters starting from the old ones. That means we start from some parameter values, compute the performances of the system for those values, then modify the parameters and evaluate the performances again; if there is an improvement, we keep the new parameter values and continue modifying them in search of better ones.

Consider a very simple scenario with only two appliances, with their parameters to1, ts1, to2 and ts2, and assume the same scenario for both. The procedure works as follows: we give random initial values to the parameters, run a simulation and evaluate the performance; the optimizer then modifies the parameters and the simulation is run again (see figure 2).

[Figure 2: loop between the meta-heuristic optimizer, which produces the input parameters, and the simulation model, which returns the responses.]
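The loop of figure 2 can be sketched in a few lines of Python. The `simulate` function below is a stand-in toy objective, not the real HAS simulation model, and the four appliance parameters (to1, ts1, to2, ts2) are represented simply as a list of numbers; everything in this sketch is an illustrative assumption.

```python
import random

def simulate(params):
    """Stand-in for the HAS simulation model: returns a performance
    index to be minimized (a toy quadratic, purely illustrative)."""
    return sum((p - 3.0) ** 2 for p in params)

def tune(n_iter=500, step=0.5, seed=42):
    """Optimize-simulate loop: propose new parameters from the old ones,
    run the simulation, and keep the new values only on improvement."""
    rng = random.Random(seed)
    # Random initial values for the four parameters
    params = [rng.uniform(0.0, 10.0) for _ in range(4)]
    best = simulate(params)
    for _ in range(n_iter):
        # Modify the parameters slightly and re-run the simulation
        candidate = [p + rng.uniform(-step, step) for p in params]
        perf = simulate(candidate)
        if perf < best:          # keep only improvements
            params, best = candidate, perf
    return params, best
```

Calling `tune()` returns the tuned parameter vector and its performance index; swapping `simulate` for a real simulation model would reuse the loop unchanged.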

2. Heuristic methods

The term heuristic is linked to algorithms mimicking some behavior found in nature, e.g., the principle of evolution through

selection and mutation (genetic algorithms), the annealing process of melted iron (simulated annealing) or the self-organization of ant colonies (ant colony optimization). In fact, a large number of heuristics results from such analogies,

extending to the great deluge algorithm introduced by Dueck (1993). Heuristic optimization methods are essentially

computational and therefore they have been naturally introduced following the development of electronic computing devices.

First contributions go back to Bock (1958) and Croes (1958) who developed procedures to solve the traveling salesman

problem, but the most significant advances in the domain have been made in the late 1980s and 1990s when the main

techniques have been introduced. However, only very recently, desktop computers have reached the necessary performance

to make the use of these methods really appealing.

Today a large number of heuristic methods and techniques exist, and their number keeps growing. They are used for optimization problems which cannot be tackled by classical methods. To solve an optimization problem accurately, it is necessary

to find a good heuristic technique which must satisfy certain conditions. First, a heuristic should be able to provide high

quality (stochastic) approximations to the global optimum. Second, a well behaved heuristic should be robust to changes in

problem characteristics, i.e. should not fit only a single problem instance, but the whole class. Also, it should not be too

sensitive to the tuning of the algorithm's parameters or to changes in the constraints on the search space. Third, a heuristic should be easy to apply to many problem instances, including new ones. Finally, despite its name, a heuristic might be stochastic, but should not contain subjective elements.

Heuristic methods have a wide range of applications in economics. Optimization is at the core of econometric applications

to real data sets, e.g., in model selection and parameter estimation. Consequently, econometrics is a field with a long

tradition in applying the most up-to-date optimization techniques. These optimization methods are also widely used in various fields of engineering to solve non-convex problems that cannot be handled by classical methods.

Optimization heuristics, which are sometimes also labeled approximation methods, are generally divided into two broad

classes: constructive methods, also called greedy algorithms, and local search methods. Greedy algorithms construct the

solution in a sequence of locally optimum choices. For a long time local search was not considered a mature technique, and it is

only recently that the method enjoys increasing interest. Some reasons for the resurgence of these methods are in particular

the dramatic increase in computational resources, the ease of implementation and their flexibility as well as their successful

application to many complex real-world problems. Also, recently more emphasis has been put on considering the solutions obtained with these tools as point estimates and, consequently, on providing information about their distribution. This

contributes to a better understanding of the quality of a solution produced by heuristic methods. Local search uses only

information about the solutions in the neighborhood of a current solution and is thus very similar to hill climbing where the

choice of a neighbor solution locally maximizes a criterion. The classical local search method for minimizing a given objective

function f(x) can be formalized as presented in Algorithm 1.

Algorithm 1 Pseudo-code for the classical local search procedure.

1: Generate initial solution xc
2: while stopping criteria not met do
3:   Select xn ∈ N(xc) (neighbor to the current solution xc)
4:   if f(xn) < f(xc) then xc = xn
5: end while

Hill-climbing uses information about the gradient for the selection of a neighbor xn in line 3 whereas local search algorithms

choose the neighbors according to some random mechanism. This mechanism as well as the criteria for acceptance in line

4, which are specific to a particular heuristic, define the way the algorithm walks through the solution space. The stopping criterion often consists of a given number of iterations.
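As a minimal concrete instance of Algorithm 1, the following Python sketch assumes a toy objective (the number of mismatches against a fixed bit-string) and one-bit flips as the neighborhood function; both are illustrative assumptions, not part of the text above.

```python
import random

def f(x):
    # Objective: mismatches against a fixed target bit-string (toy example)
    target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    return sum(a != b for a, b in zip(x, target))

def neighbor(x, rng):
    # Random neighbor: flip one bit (the random mechanism of line 3)
    y = list(x)
    i = rng.randrange(len(y))
    y[i] = 1 - y[i]
    return y

def local_search(n_iter=200, seed=1):
    rng = random.Random(seed)
    xc = [rng.randint(0, 1) for _ in range(10)]   # initial solution
    for _ in range(n_iter):                       # stopping criterion: iterations
        xn = neighbor(xc, rng)
        if f(xn) < f(xc):                         # acceptance criterion of line 4
            xc = xn
    return xc
```

On this particular objective there are no local minima, so the pure descent of Algorithm 1 already reaches the optimum; on a non-convex objective it would stall in the first local minimum it meets, which is exactly what the methods below address.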

Local search methods are generally divided into trajectory methods which work on a single solution and population based

methods, where a whole set of solutions is updated simultaneously. In the first class we find threshold methods and tabu

search whereas the second class consists of genetic algorithms, differential evolution methods and ant colonies. All these

local search methods have particular rules for the choice of a neighbor, for the acceptance of a solution, or both. All methods, except tabu search, allow uphill moves, i.e. they accept solutions which are worse than the previous one, in

order to escape local minima.

3. Trajectory methods

The definition of neighborhood is central to these methods and generally depends on the problem under consideration.

Finding efficient neighborhood functions that lead to high quality local optima can be challenging. We consider three

different trajectory methods.

Threshold methods define the maximum slope for up-hill moves in the succeeding iterations. We consider two different kinds: simulated annealing (SA) and threshold accepting (TA). The first method uses a probabilistic

acceptance criterion, while the maximum threshold is deterministic for the second.

Simulated annealing (SA) This refinement of the local search is based on an analogy between combinatorial optimization

and the annealing process of solids. Similar to the classical local search an improvement of the solution for a move from xc to

xn is always accepted. Moreover, the algorithm accepts also a move uphill, but only with a given probability. This probability

depends on the difference between the corresponding function values and on a global parameter T (called temperature), that

is gradually decreased during the process. In Algorithm 2, the rounds are implemented in line 2 and the reduction of the

probability by means of the parameter T is implemented in line 8. The stopping criterion is defined by a given number of

iterations or a number of consecutive iterations without change/improvement for the current solution xc.

Algorithm 2 Pseudo code for simulated annealing.

1: Generate initial solution xc, initialize Rmax and T
2: for r = 1 to Rmax do
3:   while stopping criteria not met do
4:     Compute xn ∈ N(xc) (neighbor to the current solution)
5:     Compute Δ = f(xn) − f(xc) and generate u (uniform random variable)
6:     if (Δ < 0) or (e^(−Δ/T) > u) then xc = xn
7:   end while
8:   Reduce T
9: end for
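A sketch of Algorithm 2 in Python, on an assumed one-dimensional test function with several local minima (the function, the ±1 neighborhood and all numeric settings are illustrative assumptions):

```python
import math
import random

def f(x):
    # Toy non-convex objective with several local minima
    return x * x + 10 * math.sin(3 * x)

def simulated_annealing(T=10.0, alpha=0.9, rounds=60, moves=50, seed=7):
    rng = random.Random(seed)
    xc = rng.uniform(-5.0, 5.0)                  # initial solution
    for _ in range(rounds):                      # line 2: the rounds
        for _ in range(moves):
            xn = xc + rng.uniform(-1.0, 1.0)     # neighbor of xc
            delta = f(xn) - f(xc)
            # downhill always accepted; uphill with probability exp(-delta/T)
            if delta < 0 or rng.random() < math.exp(-delta / T):
                xc = xn
        T *= alpha                               # line 8: reduce T
    return xc, f(xc)
```

Because uphill moves are accepted while T is large, the search can leave shallow local minima early on and settle into a deep one as T decreases; running a handful of independent seeds and keeping the best result is a common practical precaution.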

Threshold accepting (TA) This method, suggested by Dueck and Scheuer (1990), is a deterministic analog of simulated

annealing where the sequence of temperatures T is replaced by a sequence of thresholds τr. As for SA, these threshold values typically decrease to zero as r increases to Rmax. Algorithm 2 then becomes a variant in which the probabilistic acceptance rule of line 6 is replaced by the deterministic rule: if Δ < τr then xc = xn.

Tabu search is particularly designed for the exploration of discrete search spaces where the set of neighbor solutions is finite.

The method implements the selection of the neighborhood solution in a way that avoids cycling, i.e., visiting the same solution

more than once. This is achieved by employing a short term memory, known as the tabu list and which contains the

solutions which were most recently visited. In line 3 of Algorithm 3 the construction of set V may or may not imply the

examination of all neighbors of xc.

The simplest way to update the memory in line 6 is to remove older entries from the list. The stopping criterion is defined by

a given number of iterations or a number of consecutive iterations without improvement for the current solution xc.

Algorithm 3 Pseudo code for tabu search.

1: Generate current solution xc and initialize tabu list T = ∅
2: while stopping criteria not met do
3:   Compute V = {x | x ∈ N(xc)} \ T
4:   Select xn = min(V)
5:   xc = xn and T = T ∪ xn
6:   Update memory
7: end while
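Algorithm 3 can be sketched in Python as follows; the bit-string objective and the one-bit-flip neighborhood are assumptions for illustration, and the tabu list is modeled as a fixed-length deque of recently visited solutions (the simplest memory update described above):

```python
import random
from collections import deque

def f(x):
    # Toy objective on a discrete search space: mismatches against a target
    target = (1, 0, 1, 1, 0, 1, 0, 0)
    return sum(a != b for a, b in zip(x, target))

def tabu_search(n_iter=50, tabu_len=5, seed=3):
    rng = random.Random(seed)
    xc = tuple(rng.randint(0, 1) for _ in range(8))
    best = xc
    tabu = deque([xc], maxlen=tabu_len)   # short-term memory: the tabu list
    for _ in range(n_iter):
        # line 3: V = all one-bit-flip neighbors of xc that are not tabu
        V = []
        for i in range(8):
            y = list(xc)
            y[i] = 1 - y[i]
            y = tuple(y)
            if y not in tabu:
                V.append(y)
        if not V:
            break
        xc = min(V, key=f)                # line 4: best admissible neighbor,
        tabu.append(xc)                   #         accepted even if uphill
        if f(xc) < f(best):               # line 6: deque drops oldest entry
            best = xc
    return best
```

Note that, unlike SA, the move to the best admissible neighbor is made even when it is worse than the current solution; the tabu list is what prevents the search from immediately cycling back.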

4. Population based methods

In contrast to the trajectory methods, these approaches work simultaneously on a whole set of solutions called a population.

Therefore, population based methods might be more efficient with regard to exploring the whole search space at the cost of

a higher computational load and more complex structures.

This technique was initially developed by Holland (1975). Genetic algorithms imitate the evolutionary process of

species that sexually reproduce. Thus, genetic algorithms might be considered the prototype of a population based method.

New candidates for the solution are generated with a mechanism called crossover which combines part of the genetic

patrimony of each parent and then applies a random mutation. If the new individual, called child, inherits good characteristics

from its parents, it will have a higher probability to survive. The fitness of the child and parent populations is evaluated in the function survive (statement 10), and the survivors can be formed either by the last generated individuals P″, by P″ ∪ {fittest from P}, by only the fittest from P″, or by the fittest from P″ ∪ P.

The genetic algorithm first selects a set of solutions (statement 3) and then constructs a set of neighbor solutions (statements 4–10, Algorithm 4). In general, a predefined number of generations provides the stopping criterion.

Algorithm 4 Pseudo code for genetic algorithms.

1: Generate initial population P of solutions
2: while stopping criteria not met do
3:   Select P′ ⊂ P (mating pool), initialize P″ = ∅ (set of children)
4:   for i = 1 to n do
5:     Select individuals xa and xb at random from P′
6:     Apply crossover to xa and xb to produce xchild
7:     Randomly mutate produced child xchild
8:     P″ = P″ ∪ xchild
9:   end for
10:  P = survive(P′, P″)
11: end while
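A sketch of Algorithm 4 in Python, assuming a bit-string encoding, selection from the fitter half of the population, one-point crossover, bitwise mutation, and a survive() rule that keeps the fittest of parents and children; all of these concrete choices are assumptions for illustration.

```python
import random

def fitness_cost(x):
    # Cost to minimise: mismatches against a target chromosome (toy example)
    target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
    return sum(a != b for a, b in zip(x, target))

def genetic_algorithm(pop_size=30, n_gen=60, p_mut=0.05, seed=11):
    rng = random.Random(seed)
    d = 12
    P = [[rng.randint(0, 1) for _ in range(d)] for _ in range(pop_size)]
    for _ in range(n_gen):
        ranked = sorted(P, key=fitness_cost)     # mating pool: fitter half
        children = []
        for _ in range(pop_size):
            pa, pb = rng.sample(ranked[: pop_size // 2], 2)
            cut = rng.randrange(1, d)            # one-point crossover
            child = pa[:cut] + pb[cut:]
            for j in range(d):                   # random mutation
                if rng.random() < p_mut:
                    child[j] = 1 - child[j]
            children.append(child)
        # survive(): keep the fittest of the combined parent and child pool
        P = sorted(P + children, key=fitness_cost)[:pop_size]
    return min(P, key=fitness_cost)
```

Because the survive() rule here is elitist (the best individual can never be lost), the best cost is non-increasing across generations.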

Ant colony optimization, first introduced by Colorni and co-workers, imitates the way ants search for food and find their way back to their nest. First an

ant explores its neighborhood randomly. As soon as a source of food is found it starts to transport food to the nest leaving

traces of pheromone on the ground which will guide other ants to the source. The intensity of the pheromone traces

depends on the quantity and quality of the food available at the source as well as on the distance between source and

nest, as for a short distance more ants will travel on the same trail in a given time interval. As the ants preferably travel along

important trails their behavior is able to optimize their work. Pheromone trails evaporate and once a source of food is

exhausted the trails will disappear and the ants will start to search for other sources. For the heuristic, the search area of the

ant corresponds to a discrete set from which the elements forming the solutions are selected, the amount of food is

associated with an objective function and the pheromone trail is modeled with an adaptive memory.

Algorithm 5 Pseudo code for ant colonies.

1: Initialize pheromone trail
2: while stopping criteria not met do
3:   for all ants do
4:     Deposit ant randomly
5:     while solution incomplete do
6:       Select next element in solution randomly according to pheromone trail
7:     end while
8:     Evaluate objective function and update best solution
9:   end for
10:  for all ants do Update pheromone trail (more for better solutions)
11: end while
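A sketch of Algorithm 5 on an assumed five-city travelling salesman instance: the pheromone matrix `tau` plays the role of the adaptive memory, and weighting the next-city choice by pheromone divided by distance is a common convention that is an assumption here, not something stated in the text.

```python
import math
import random

# Toy symmetric TSP instance (coordinates are illustrative assumptions)
CITIES = [(0, 0), (1, 0), (2, 0), (2, 1), (0, 1)]

def dist(a, b):
    return math.hypot(CITIES[a][0] - CITIES[b][0], CITIES[a][1] - CITIES[b][1])

def tour_length(tour):
    return sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

def ant_colony(n_ants=10, n_iter=30, rho=0.5, seed=5):
    rng = random.Random(seed)
    n = len(CITIES)
    tau = [[1.0] * n for _ in range(n)]        # pheromone trail (adaptive memory)
    best, best_len = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [rng.randrange(n)]          # deposit ant randomly
            while len(tour) < n:               # build solution element by element
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                # next city chosen randomly, weighted by pheromone / distance
                w = [tau[i][j] / dist(i, j) for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            tours.append(tour)
            L = tour_length(tour)
            if L < best_len:                   # update best solution
                best, best_len = tour, L
        for i in range(n):                     # evaporation of pheromone trails
            for j in range(n):
                tau[i][j] *= 1 - rho
        for tour in tours:                     # reinforcement: more for better tours
            L = tour_length(tour)
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += 1.0 / L
                tau[b][a] += 1.0 / L
    return best, best_len
```

Evaporation (the `1 - rho` factor) is what lets trails to exhausted or poor solutions disappear, exactly as described for real ant trails above.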

Differential evolution is a population based heuristic optimization technique for continuous objective functions. The algorithm

updates a population of solution vectors by addition, subtraction and crossover and then selects the fittest solutions among

the original and updated population. The initial population of nP randomly chosen solutions can be represented by a matrix P(0) of dimension d × nP, where d is the dimension of the domain of the function (see figure 3).

The algorithm constructs nG generations of the population. A new generation is obtained in three steps. For each solution Pi(0), i = 1, . . . , nP, represented by a column of matrix P(0), the algorithm constructs two intermediate solutions Pi(v) and Pi(u) from three randomly selected columns Pr1(0), Pr2(0) and Pr3(0). The ith solution Pi(1) of the new generation is assembled from elements of Pi(0), Pi(v) and Pi(u). These particular steps are summarized in Algorithm 6.

Algorithm 6 Differential evolution.

1: Initialize parameters nP, nG, F and CR
2: Initialize population Pj,i(1), j = 1, . . . , d, i = 1, . . . , nP
3: for k = 1 to nG do
4:   P(0) = P(1)
5:   for i = 1 to nP do
6:     Generate r1, r2, r3 ∈ {1, . . . , nP}, r1 ≠ r2 ≠ r3 ≠ i
7:     Compute Pi(v) = Pr1(0) + F × (Pr2(0) − Pr3(0))
8:     for j = 1 to d do
9:       Generate u ∼ U(0, 1)
10:      if u < CR then Pj,i(u) = Pj,i(v) else Pj,i(u) = Pj,i(0)
11:    end for
12:    if f(Pi(u)) < f(Pi(0)) then Pi(1) = Pi(u) else Pi(1) = Pi(0)
13:  end for
14: end for
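Algorithm 6 translates almost line by line into Python; the sphere function used as the objective and the parameter values are assumptions for illustration.

```python
import random

def f(x):
    # Sphere test function: global minimum 0 at the origin (an assumption)
    return sum(v * v for v in x)

def differential_evolution(d=5, nP=20, nG=100, F=0.8, CR=0.9, seed=2):
    rng = random.Random(seed)
    # Initial population: nP solutions of dimension d
    P1 = [[rng.uniform(-5.0, 5.0) for _ in range(d)] for _ in range(nP)]
    for _ in range(nG):
        P0 = [row[:] for row in P1]                  # P(0) = P(1)
        for i in range(nP):
            r1, r2, r3 = rng.sample([r for r in range(nP) if r != i], 3)
            # Differential mutation: Pv = Pr1 + F * (Pr2 - Pr3)
            Pv = [P0[r1][j] + F * (P0[r2][j] - P0[r3][j]) for j in range(d)]
            # Crossover: take each component from Pv with probability CR
            Pu = [Pv[j] if rng.random() < CR else P0[i][j] for j in range(d)]
            # Selection: keep the fitter of trial and original solution
            P1[i] = Pu if f(Pu) < f(P0[i]) else P0[i]
    return min(P1, key=f)
```

The snapshot `P0` of the previous generation matches statement 4 of Algorithm 6: all trial vectors within one generation are built from the same frozen population.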

Particle swarm (PS) is a population based heuristic optimization technique for continuous functions which has been

introduced by Eberhart and Kennedy. The algorithm updates a population of solution vectors, called particles, by an

increment called velocity. Figure 1 illustrates the updating of the position of a particle Pi(k) from generation k to generation k+1. Two directions are considered: the direction from the current position of the particle to the best position found so far for that particle (Pbesti − Pi(k)), and the direction from the current position to the best position found so far over all particles (Pbestgbest − Pi(k)). Both directions are then randomly perturbed by multiplying them with a parameter c and a uniform random variable u ∼ U(0, 1) (cf. line 7 in Algorithm 7). A suggested value for the parameter c is 2. The sum Δvi of these two randomized directions falls into a region characterized in the figure by a circle. The current velocity vi(k+1) is then obtained by updating vi(k) with the increment Δvi.

Algorithm 7 summarizes the particular steps for the updating of the population of nP particles in nG succeeding generations.

Algorithm 7 Particle swarm.

1: Initialize parameters nP, nG and c
2: Initialize particles Pi(0) and velocities vi(0), i = 1, . . . , nP
3: Evaluate objective function Fi = f(Pi(0)), i = 1, . . . , nP
4: Pbest = P(0), Fbest = F, Gbest = mini(Fi), gbest = argmini(Fi)
5: for k = 1 to nG do
6:   for i = 1 to nP do
7:     Δvi = c u1 (Pbesti − Pi(k−1)) + c u2 (Pbestgbest − Pi(k−1))
8:     vi(k) = vi(k−1) + Δvi
9:     Pi(k) = Pi(k−1) + vi(k)
10:  end for
11:  Evaluate objective function Fi = f(Pi(k)), i = 1, . . . , nP
12:  for i = 1 to nP do
13:    if Fi < Fbesti then Pbesti = Pi(k) and Fbesti = Fi
14:    if Fi < Gbest then Gbest = Fi and gbest = i
15:  end for
16: end for
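A sketch of Algorithm 7 in Python. Note that the bare update with c = 2 and no velocity limit can diverge in practice, so this sketch adds an inertia weight w and a smaller c; both values are a common extension and an assumption beyond the algorithm above, as is the sphere test function.

```python
import random

def f(x):
    # Sphere test function: minimum 0 at the origin (an assumption)
    return sum(v * v for v in x)

def particle_swarm(d=4, nP=15, nG=80, c=1.49, w=0.72, seed=9):
    rng = random.Random(seed)
    P = [[rng.uniform(-5.0, 5.0) for _ in range(d)] for _ in range(nP)]
    V = [[0.0] * d for _ in range(nP)]
    Pbest = [p[:] for p in P]               # best position per particle
    Fbest = [f(p) for p in P]
    g = min(range(nP), key=lambda i: Fbest[i])
    Gpos, Gval = Pbest[g][:], Fbest[g]      # best position over all particles
    for _ in range(nG):
        for i in range(nP):
            for j in range(d):
                # two randomly perturbed directions (personal and global best)
                dv = (c * rng.random() * (Pbest[i][j] - P[i][j])
                      + c * rng.random() * (Gpos[j] - P[i][j]))
                V[i][j] = w * V[i][j] + dv  # inertia weight w is an assumption
                P[i][j] += V[i][j]          # position update (line 9)
            Fi = f(P[i])
            if Fi < Fbest[i]:               # lines 13-14: update the records
                Pbest[i], Fbest[i] = P[i][:], Fi
                if Fi < Gval:
                    Gpos, Gval = P[i][:], Fi
    return Gpos, Gval
```

This variant updates the global best as soon as a particle improves on it, so later particles in the same generation already exploit the new information.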

Bibliography

[1] M. Gilli, P. Winker. Heuristic Optimization Methods in Econometrics. University of Geneva, October 2007.

[2] Manfred Gilli. An Introduction to Optimization Heuristics. University of Cyprus, September 2004.

[3] Pasi Airikka. Heuristic Evolutionary Random Optimizer for Non-linear Optimization Including Control and

Identification Problems. Finland.

[4] Premalatha K, Natarajan A M. Combined Heuristic Optimization Techniques for Global Minimization. Tamil

Nadu, India 2010.

[5] Michael Steinbrunn, Guido Moerkotte, Alfons Kemper. Heuristic and randomized optimization for the join

ordering problem. The VLDB Journal, 1997.

[6] G. Conte,

[7] Todd W. Neller. Heuristic optimization and dynamical system safety verification. Knowledge Systems
