
Hindawi

Mathematical Problems in Engineering


Volume 2018, Article ID 2517460, 12 pages
https://doi.org/10.1155/2018/2517460

Research Article
Dynamic Reduction-Expansion Operator to Improve Performance of Genetic Algorithms for the Traveling Salesman Problem

Santiago-Omar Caballero-Morales, Jose-Luis Martinez-Flores, and Diana Sanchez-Partida
Universidad Popular Autonoma del Estado de Puebla, A.C., Postgraduate Department of Logistics and Supply Chain Management,
17 Sur 711, Barrio de Santiago, Puebla, PUE 72410, Mexico

Correspondence should be addressed to Santiago-Omar Caballero-Morales; santiagoomar.caballero@upaep.mx

Received 10 January 2018; Revised 7 July 2018; Accepted 24 July 2018; Published 2 September 2018

Academic Editor: Erik Cuevas

Copyright © 2018 Santiago-Omar Caballero-Morales et al. This is an open access article distributed under the Creative Commons
Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is
properly cited.

The Traveling Salesman Problem (TSP) is an important routing problem within the transportation industry. However, finding
optimal solutions for this problem is not easy due to its computational complexity. In this work, a novel operator based on dynamic
reduction-expansion of minimum distance is presented as an initial population strategy to improve the search mechanisms of
Genetic Algorithms (GA) for the TSP. This operator, termed 𝑅𝑒𝑑𝐸𝑥𝑝, consists of four stages: (a) clustering to identify candidate
supply/demand locations to be reduced, (b) coding of clustered and nonclustered locations to obtain the set of reduced locations, (c)
sequencing of minimum distances for the set of reduced locations (nearest neighbor strategy), and (d) decoding (expansion) of the
reduced set of locations. Experiments performed on TSP instances with more than 150 nodes provided evidence that 𝑅𝑒𝑑𝐸𝑥𝑝 can
improve the convergence of the GA and provide more suitable solutions than other approaches focused on the GA's initial population.

1. Introduction

As defined by [1], routing is the process of selecting "best" routes in a graph 𝐺 = (𝑉, 𝐴), where 𝑉 is a node set and 𝐴 is an arc set. Within this context, route planning is the calculation of the most effective route (the route of minimum distance, cost, or travel time) from an origin node to a destination node on a network, and the Traveling Salesman Problem (TSP) is one of the most studied and applied routing models in the transportation, manufacturing, and logistics industries [2]. As presented by [3], the TSP "is the fundamental problem in the fields of computer science, engineering, operations research, discrete mathematics, graph theory, and so forth". This is the reason why the TSP has frequently been considered a touchstone for new strategies and algorithms to solve combinatorial optimization problems, as commented by [2].

The TSP can be modeled as an undirected weighted graph where locations (i.e., nodes) are the graph's vertexes, paths are the graph's edges (i.e., arcs), and the path's distance, cost, or time is the edge's length [4]. The objective of solving the TSP then consists of minimizing the total distance of a complete sequence of paths (the total route) which starts and finishes at a specific vertex (i.e., the depot node) after having visited all vertexes once and only once. Figure 1 presents an example solution for the TSP, which is also known as a Hamiltonian Circuit of minimum cost.

Finding optimal solutions for the TSP is a challenging task due to its computational complexity, which is defined as NP-hard (nondeterministic polynomial-time hard) [5]. For example, if 15 cities are considered, there are 1.31e+12 ways of performing a Hamiltonian Circuit to visit them. In such a case, finding the optimal solution (i.e., finding the Hamiltonian Circuit of minimum cost) can be a time-consuming task which becomes infeasible when a larger number of cities is considered. As reported in [2], only small TSP instances (up to approximately 100 nodes) can be solved to optimality.

Due to this situation, metaheuristics have been developed to provide high-quality solutions in a reasonable time for different combinatorial optimization problems such as the TSP [6]. Among the most efficient metaheuristics for the TSP, the following can be mentioned: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Tabu Search (TS), Simulated Annealing (SA), Ant Colony Optimization (ACO), and Artificial Neural Networks (ANNs) [3, 6].

Although the GA is one of the most important metaheuristics applied to the TSP, its performance depends on its parameter settings, such as the initial population, the selection and reproduction operators, and the stop condition. As presented in [2, 7, 8], the quality of the initial population plays an important role in the solving mechanism of the GA. In the present work, a dynamic reduction-expansion operator, termed 𝑅𝑒𝑑𝐸𝑥𝑝, is presented as a strategy to improve the quality of the initial population and the convergence of a GA. When compared to other initial population approaches for the GA, the 𝑅𝑒𝑑𝐸𝑥𝑝 operator can provide more suitable solutions for the TSP.

The advances of the present work are presented as follows: in Section 2 the technical details of the stages of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator are presented; then, in Section 3 the results obtained on TSP instances are presented and discussed; finally, in Section 4 the conclusions and future work are discussed.

Figure 1: Distribution network modeled by the TSP (the legend distinguishes demand/supply nodes, the depot node, and the route).

2. Structure of the Reduction-Expansion Operator (RedExp)

The 𝑅𝑒𝑑𝐸𝑥𝑝 operator is similar to the clustering strategy presented in [2], where the k-Means Clustering algorithm was considered to generate the initial population for a GA. In [2], the 𝑁 nodes were clustered into 𝐾 groups based on 𝐾 = √𝑁 + 0.5 in order to solve a TSP with a smaller number of nodes. Thus, a route of minimum distance was found considering the cluster centers, and once the route of minimum distance was obtained, the clusters were "disconnected" and "rewired" to assemble a route considering the original 𝑁 nodes. On a selection of 14 symmetric TSP instances (with 𝑁 = [52 − 442], mean = 204 nodes) and 10 trials, this strategy led to a mean average error of 9.22% (mean best error of 6.97%).

As presented in [2], clustering can improve the performance of the GA for the TSP. However, the distribution patterns of the nodes may affect the performance of the clustering and declustering processes by increasing variability in the initial population. This is because nodes that represent key features of the complete set of nodes can be missed by the clustering process, leading to their removal from the reduced (i.e., clustered) set of nodes. An example of key nodes is presented in Figure 2(a), where data from the TSP instance a280 of the TSPLIB 95 database [9] was considered. As presented, the distribution of these nodes is an important feature of its optimal solution, which is presented in Figure 2(b).

By performing the clustering presented in Figure 2(c), the distribution of the key nodes of instance a280 is simplified. As a consequence, the pattern of the optimal solution for the reduced set of nodes (Figure 2(d)) is significantly different from the pattern observed for the complete set (Figure 2(b)). Note that, as presented in Figure 2(e), the pattern observed in Figure 2(d) is preserved even after declustering.

In order to address this issue, the proposed 𝑅𝑒𝑑𝐸𝑥𝑝 operator considers clustering of only two nodes at a time, and only nodes which are very close to each other are candidates for clustering. This leads to a TSP that is not heavily relaxed, which reduces the loss of key features. The number of pairs of nodes which are candidates for clustering is defined by a dynamic acceptance threshold metric. This process is defined as "reduction", and a route of minimum distance over the reduced set is estimated by a greedy heuristic. Then, "expansion" of the clustered nodes is performed to represent the route considering the original 𝑁 nodes. This strategy was evaluated with a selection of 41 symmetric TSP instances (with 𝑁 = [51 − 1432], mean = 474 nodes), considering six scenarios where 𝑅𝑒𝑑𝐸𝑥𝑝 could be used alone or in conjunction with other standard processes to generate an initial population. Initial assessment of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator was performed with a single execution (trial) of the GA for each scenario, leading to results supporting the positive effect of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator with a combined mean best error of 4.9%. Then, an extended assessment with 10 executions (trials) of the GA was performed to evaluate its statistical significance.

Based on these results, the proposed operator represents a suitable alternative to improve the performance of GAs or similar metaheuristics that depend on initial solutions. It can also be a suitable alternative when compared to approaches focused on modifying reproduction operators [3]. The details of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator are described in the present section.

2.1. Clustering Stage. The first stage in the reduction-expansion process consists of determining the set of locations to be reduced. This is accomplished by the clustering process described in Pseudocode 1. For this process, an acceptance distance threshold 𝑑𝑐 is defined, which is computed as

𝑑𝑐 = 𝑑𝑚𝑖𝑛 + 𝐾𝜎, (1)


Figure 2: TSP instance a280.tsp [9]: (a) complete set of nodes, (b) optimal total route for the complete set, (c) clustering of the complete set,
(d) optimal total route for the reduced set, and (e) declustered total route for the complete set of nodes.

where

(a) 𝑑𝑚𝑖𝑛 is the minimum distance between all locations, which is computed as

𝑑𝑚𝑖𝑛 = min 𝑑(𝑖, 𝑗), (2)

where 𝑑(𝑖, 𝑗) is the distance between locations 𝑖 and 𝑗, with 𝑖, 𝑗 = 1, . . . , 𝑁 and 𝑖 ≠ 𝑗;

(b) 𝜎 is the standard deviation of the distances between all locations, which is computed as

𝜎 = √𝑉(𝑑(𝑖, 𝑗)), (3)

where 𝑉(𝑑(𝑖, 𝑗)) is the variance of the distances 𝑑(𝑖, 𝑗) between locations, with 𝑖, 𝑗 = 1, . . . , 𝑁 and 𝑖 ≠ 𝑗;

(c) 𝐾 is the reduction factor, which is computed as

𝐾 ∈ rand(0.0, 1.0). (4)
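For illustration, the dynamic threshold in (1)-(4) can be computed for one individual with the following minimal Octave sketch (an illustrative fragment, not part of the original implementation; it assumes Euclidean distances and that x and y are the 𝑁-by-1 coordinate vectors of the instance):

  N = numel(x);
  D = sqrt((x - x').^2 + (y - y').^2);  % distance matrix d(i,j)
  dvals = D(~eye(N));                   % excludes d(i,i) = 0, as required for (1)
  dmin = min(dvals);                    % equation (2)
  sigma = std(dvals);                   % equation (3)
  K = rand();                           % equation (4): K is redrawn for every individual
  dc = dmin + K*sigma;                  % equation (1)

Because 𝐾 is redrawn for every individual, each evaluation of this sketch yields a different degree of reduction.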

𝑈 = {𝑖 = 1, . . . , 𝑁}  % set of all (original) nodes
𝑅 = ⌀  % set of coded nodes
𝑥𝑖 = 𝑥-coordinate of node 𝑖
𝑦𝑖 = 𝑦-coordinate of node 𝑖
𝑑𝑚𝑖𝑛 = min 𝑑(𝑖, 𝑗) for all 𝑖, 𝑗 with 𝑖 ≠ 𝑗
𝜎 = √𝑉(𝑑(𝑖, 𝑗)) for all 𝑖, 𝑗 with 𝑖 ≠ 𝑗
𝑑𝑐 = 𝑑𝑚𝑖𝑛 + 𝐾𝜎, where 𝐾 = rand(0.0, 1.0)
𝑟 = 1  % index for coded nodes
for 𝑖 = 1:𝑁
  for 𝑗 = 𝑖 + 1:𝑁
    if 𝑑(𝑖, 𝑗) ≤ 𝑑𝑐
      coded_nodes(r,1) = 𝑖   % node i is stored for clustering
      coded_nodes(r,2) = 𝑗   % node j is stored for clustering
      coded_nodes(r,3) = 𝑐𝑥𝑟  % x-coordinate of the equivalent coded node for (i,j)
      coded_nodes(r,4) = 𝑐𝑦𝑟  % y-coordinate of the equivalent coded node for (i,j)
      𝑈 = 𝑈 \ {𝑖, 𝑗}  % U is updated (i and j are removed from U)
      𝑅 = 𝑅 ∪ {𝑟}  % R is updated (the equivalent coded node is added to R)
      r = r + 1
    end
  end
end
% U now contains the remaining nodes that were neither clustered nor coded.
% These are added to R as follows:
for each node 𝑖 in 𝑈
  coded_nodes(r,1) = 𝑖   % node i is stored and assigned the new index r
  coded_nodes(r,2) = -   % there is no node j for nonclustered nodes
  coded_nodes(r,3) = 𝑥𝑖  % x-coordinate of node i
  coded_nodes(r,4) = 𝑦𝑖  % y-coordinate of node i
  𝑅 = 𝑅 ∪ {𝑟}
  r = r + 1
end

Pseudocode 1: Pseudo-code of the clustering and coding processes.
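A runnable Octave rendering of Pseudocode 1 could look as follows (an illustrative sketch, not the authors' implementation; it reuses x, y, D, and dc from the previous sketch and makes explicit that a node is clustered at most once, which Pseudocode 1 implies by removing 𝑖 and 𝑗 from 𝑈):

  N = numel(x);
  in_U = true(N, 1);     % membership flags for the original set U
  coded_nodes = [];      % rows: [i, j, cx_r, cy_r]; j = 0 marks a nonclustered node
  for i = 1:N
    for j = i+1:N
      if in_U(i) && in_U(j) && D(i,j) <= dc
        coded_nodes(end+1,:) = [i, j, (x(i)+x(j))/2, (y(i)+y(j))/2];  % mean coordinates of the equivalent node
        in_U([i j]) = false;   % i and j leave U
      end
    end
  end
  for i = find(in_U)'          % nonclustered nodes keep their own coordinates
    coded_nodes(end+1,:) = [i, 0, x(i), y(i)];
  end
  % row r of coded_nodes corresponds to the coded node r of the reduced set R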

It is important to mention that (1) 𝑑𝑚𝑖𝑛 and 𝜎 exclude the distance between a location and itself (which is always equal to zero) and (2) equation (4) is computed each time an individual is generated (hence, a different acceptance distance threshold 𝑑𝑐 is computed to generate each individual in the initial population).

In this way, the acceptance threshold metric 𝑑𝑐 ensures that only locations or nodes that are closer than 𝑑𝑐 are considered as candidates for clustering. Also, in order to avoid significant variability between the original and reduced sets, a criterion of minimum distance was defined for the clustering candidates. This can be explained with the following example: consider that the pairs (4,6), (6,20), and (12,6) comply with the restriction of 𝑑𝑐 and that the distances between the nodes of each pair are 100, 150, and 120, respectively. In this case there are three clustering options for node "6"; however, the most suitable option is (4,6) because node "6" is closer to node "4" than to nodes "20" or "12".

2.2. Coding Stage. This stage consists of coding a clustered pair of nodes (𝑖, 𝑗) as a single equivalent node 𝑟 with mean coordinates (𝑐𝑥𝑟, 𝑐𝑦𝑟) estimated as

𝑐𝑥𝑟 = (𝑥𝑖 + 𝑥𝑗)/2,
𝑐𝑦𝑟 = (𝑦𝑖 + 𝑦𝑗)/2, (5)

where 𝑟 is the index of the (new) reduced node. It is important to mention that, under this process, pairs of nodes separated by distances larger than 𝑑𝑐 are not clustered and remain unchanged. The same happens with candidate nodes that were released from clustering due to not meeting the criterion of minimum distance. In such cases, the indexes of these nonclustered nodes are reassigned in terms of the new index 𝑟. Figure 3 presents an example of the clustering and coding processes for a problem with 𝑁 = 7 locations.

As presented, the array coded_nodes contains the registry of "equivalencies" for 𝑈 → 𝑅. Thus, the coded nodes in coded_nodes represent the reduced nodes from 𝑈 (the original nodes). This registry is important for the decoding and declustering processes for 𝑅 → 𝑈. Note that this process ensures that close pairs of nodes (within a distance 𝑑𝑐) at minimum distance are kept together.
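Continuing the example at the end of Section 2.1 (an illustrative Octave fragment that reuses x, y, D, and coded_nodes from the previous sketches; it is not the authors' code), node "6" would be paired with its closest feasible partner and the pair recorded in the registry through (5):

  cand = [4 20 12];                      % partners of node 6 that comply with dc
  [~, k] = min(D(6, cand));              % following the example, d(6,4) = 100 is the smallest, so node 4 is selected
  i = cand(k); j = 6;                    % the pair (4,6) of the example
  coded_nodes(end+1,:) = [i, j, (x(i)+x(j))/2, (y(i)+y(j))/2];  % coded through (5)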

[Figure 3 illustrates the reduction for 𝑁 = 7 original nodes, 𝑈 = {1, 2, 3, 4, 5, 6, 7}, coded into 𝑅 = {1, 2, 3, 4, 5}: the close pairs (2,3) and (4,5) are within 𝑑𝑐 and are coded as the equivalent nodes 𝑟 = 1 and 𝑟 = 2 with mean coordinates (𝑐𝑥𝑟, 𝑐𝑦𝑟), while the nonclustered nodes 1, 6, and 7 keep their own coordinates and are reassigned the indexes 𝑟 = 3, 4, and 5 in the coded_nodes array.]

Figure 3: Example of the clustering and coding processes with 𝑁 = 7 nodes.

Initialization:
𝑝 = 1  % current node; the TSP route of minimum cost starts at node 1
𝑅 = 𝑅 \ {𝑝}
route_min_cost = [𝑝]
Sequencing:
while 𝑅 ≠ ⌀
  closest_node = the node in 𝑅 with the minimum distance to 𝑝. If more than one node complies with
  this requirement, one of them is randomly selected from the complying set of nodes.
  𝑅 = 𝑅 \ {closest_node}
  route_min_cost = [route_min_cost closest_node]  % closest_node is appended at the right end of route_min_cost
  𝑝 = closest_node  % the current node is updated with the closest node
end
route_min_cost = [route_min_cost 1]  % the TSP route ends at node 1

Pseudocode 2: Pseudo-code of the sequencing process (nearest neighbor strategy).

2.3. Sequencing Stage: Nearest Neighbor Strategy. Most routing problems consider an initial and a final node to define a particular route, which consists of a sequence of nodes. This sequence can lead to a route, defined as route_min_cost, of minimum traveling cost (i.e., distance) throughout all nodes. For the sequencing process of all nodes in 𝑅 it is important to identify the initial and/or final node of the route. This node depends on the routing problem itself, and it is commonly identified as node 0 or node 1. Then, sequencing is performed as described in Pseudocode 2. As presented, sequencing is performed with a simple heuristic based on the nearest neighbor strategy, which is expected to make the operator time-efficient and also to add random flexibility to achieve a feasible (not necessarily optimal) route of minimum cost.
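A runnable Octave rendering of this heuristic (an illustrative sketch, not the authors' code; DR is a distance matrix computed over the coded nodes of coded_nodes from the previous sketches) could be:

  cxy = coded_nodes(:, 3:4);
  DR = sqrt((cxy(:,1) - cxy(:,1)').^2 + (cxy(:,2) - cxy(:,2)').^2);
  M = size(DR, 1);                   % number of coded nodes in R
  remaining = 2:M;                   % the route starts (and ends) at coded node 1
  route_min_cost = 1;
  p = 1;
  while ~isempty(remaining)
    d = DR(p, remaining);
    ties = find(d == min(d));        % all remaining nodes at minimum distance from p
    pick = ties(randi(numel(ties))); % ties are broken at random, as in Pseudocode 2
    p = remaining(pick);
    route_min_cost(end+1) = p;       % the closest node is appended to the route
    remaining(pick) = [];
  end
  route_min_cost(end+1) = 1;         % the route returns to node 1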
2.4. Decoding Stage. Because the route generated by the heuristic described in Pseudocode 2 consists of elements from the reduced set of nodes in 𝑅, it is required to represent this route in terms of the original set of nodes in 𝑈. This expansion from 𝑅 to 𝑈 is performed by representing each unique node 𝑟 as the equivalent nodes (𝑖, 𝑗) from coded_nodes.

It is important to mention that, for clustered nodes, this process implies two decoding alternatives because an equivalent node 𝑟 can be decoded as (𝑖, 𝑗) or as (𝑗, 𝑖). Because decoding is performed sequentially, left to right, over route_min_cost, the decoding decision for clustered nodes is made by computing the effect of both alternatives on the cumulative cost of the partially decoded (expanded) route.
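A sketch of this decision for one clustered node (an illustrative Octave fragment under the previous assumptions: expanded is the partially decoded route over the original nodes, D is the distance matrix of the original nodes, and i, j are the pair stored for the equivalent node 𝑟):

  last = expanded(end);               % last original node of the partially expanded route
  cost_ij = D(last, i) + D(i, j);     % cumulative-cost increase if r is decoded as (i, j)
  cost_ji = D(last, j) + D(j, i);     % cumulative-cost increase if r is decoded as (j, i)
  if cost_ij <= cost_ji
    expanded = [expanded i j];
  else
    expanded = [expanded j i];
  end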
3. Assessment

3.1. Integration with Genetic Algorithm. For assessment of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator on the performance of the GA, the following scenarios were considered for the generation of the initial population (in all cases the initial population consisted of 500 individuals):

(a) 𝑅𝑝: all individuals are generated by random permutations (𝑅𝑝 operator), as considered by the GA presented in [7, 11].

(b) 𝑆ℎ: all individuals are generated by a sequencing heuristic of random permutations based on the nearest neighbor strategy (𝑆ℎ operator), as described in Section 2.3.

(c) 𝑅𝑒𝑑𝐸𝑥𝑝: all individuals are generated by the 𝑅𝑒𝑑𝐸𝑥𝑝 operator.

(d) 𝑅𝑝_𝑆ℎ: 50% of all individuals are generated by the 𝑅𝑝 operator and the other 50% are generated with the 𝑆ℎ operator.

(e) 𝑅𝑝_𝑅𝑒𝑑𝐸𝑥𝑝: 50% of all individuals are generated by the 𝑅𝑝 operator and the other 50% are generated with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator.

(f) 𝑆ℎ_𝑅𝑒𝑑𝐸𝑥𝑝: 50% of all individuals are generated by the 𝑆ℎ operator and the other 50% are generated with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator.

[Figure 4 depicts the general structure of the GA: an initial population of X individuals is generated by one of the operators 𝑅𝑝, 𝑆ℎ, 𝑅𝑒𝑑𝐸𝑥𝑝, 𝑅𝑝_𝑆ℎ, 𝑅𝑝_𝑅𝑒𝑑𝐸𝑥𝑝, or 𝑆ℎ_𝑅𝑒𝑑𝐸𝑥𝑝 and evaluated for fitness (ascending sort gives main_best_cost); parents are selected for reproduction by roulette wheel; offspring are produced by crossover (position-based, order-based) and mutation (inversion, exchange); after fitness evaluation of parents and offspring, the population is updated with the best X individuals. If the best cost of the updated population (best_cost_updated_population) improves main_best_cost, then no_best_cost is reset to 0 and main_best_cost is updated; otherwise no_best_cost is increased by 1. The GA ends when the stop condition is met. Parameters: Crossover Probability = 0.50; Mutation Probability = 0.20; Crossover Offspring = Crossover Probability × X; Mutation Offspring = Mutation Probability × X; X = 500.]

Figure 4: General structure of the GA.

As mentioned in Section 2.1, the acceptance threshold metric 𝑑𝑐 is reestimated each time a solution is generated. Thus, due to (1), different degrees of "reduction" can be performed during the process of generating an initial population with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator.
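For illustration, a mixed scenario such as 𝑅𝑝_𝑅𝑒𝑑𝐸𝑥𝑝 can be assembled as in the following Octave sketch (an assumption-based fragment, not the authors' code; redexp_individual is a hypothetical helper that wraps the reduction, sequencing, and expansion stages of Section 2):

  X = 500;                                    % population size
  population = cell(X, 1);
  for k = 1:X
    if k <= X/2
      population{k} = randperm(N);            % Rp: a random permutation of the N nodes
    else
      population{k} = redexp_individual(x, y);  % RedExp individual (hypothetical helper)
    end
  end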

Then, the initial population was integrated into the standard GA, which is presented in Figure 4. The selection of the crossover and mutation operators, which are also presented in Figure 4, was based on the findings reported in [7, 12–15].

Finally, a comparison was performed with other works that have applied initial population strategies. Hence, the following works were considered for comparison purposes:

(a) KMC [2]: in this work, the initial population of the GA was generated by using the k-Means Clustering (KMC) algorithm. The algorithm was tested with 14 TSP instances with 𝑁 = [52 − 442] nodes (mean = 204 nodes).

(b) HNN [10]: in this work, the initial population of the GA was generated by a Hopfield Neural Network (HNN), and the hybrid algorithm was tested with two small TSP instances with 51 and 76 nodes.

Implementation of the GA code was performed with Octave [16] and MATLAB on an HP Z230 Workstation with an Intel Xeon CPU at 3.40 GHz and 8 GB of RAM. All executions of the GA started with the same random generator with its seed set at Infinite (Inf).

3.2. Results on Main Set of 41 TSP Instances. The main test was performed with 41 TSP instances which were selected from the TSPLIB95 [9], National TSP, and VLSI TSP [17] libraries to evaluate the statistical significance of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator on the GA's convergence. The error from the optimal solutions was computed by using the following equation [12]:

𝐸 = (average − optimal)/optimal. (6)
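As a worked example of (6), for the berlin52 instance under the 𝑅𝑝 scenario in Table 1, 𝐸 = (8105.2 − 7542)/7542 ≈ 0.075, that is, an error of 7.5%, which is the value reported in the corresponding error column of that table.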
Initial assessment with these instances was performed with a single execution of the GA and a dynamic stop condition. This was done to establish an intensive search process. The dynamic stop condition was applied on the no_best_cost variable of the main GA (see Figure 4). This variable increases while no new best solution is found within the search process, and it is set to zero when a new best solution is found. In this case, the GA iterates until no_best_cost > 1000.
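This stop condition can be sketched as follows (an illustrative Octave fragment based on the flow of Figure 4, not the authors' code):

  no_best_cost = 0;
  while no_best_cost <= 1000
    % ... one GA generation: selection, crossover, mutation, and population update ...
    if best_cost_updated_population < main_best_cost
      main_best_cost = best_cost_updated_population;
      no_best_cost = 0;                  % reset when a new best solution is found
    else
      no_best_cost = no_best_cost + 1;   % count generations without improvement
    end
  end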

Because only one execution of the GA was considered, the average result in (6) is the best solution obtained with a single execution of the GA. Table 1 presents the results of the GA and the estimated error when compared with the optimal results for each assessment scenario.

As presented in Table 1, the minimum mean best errors (5.5%, 6.3%, and 5.7%) were obtained with initial populations generated with 𝑅𝑒𝑑𝐸𝑥𝑝, 𝑅𝑝_𝑅𝑒𝑑𝐸𝑥𝑝, and 𝑆ℎ_𝑅𝑒𝑑𝐸𝑥𝑝, respectively. Hence, 𝑅𝑒𝑑𝐸𝑥𝑝, as a single operator or as a complement to the 𝑅𝑝 and 𝑆ℎ operators, has a positive effect on the final solution obtained by the GA. By selecting the minimum error achieved for each instance (throughout all scenarios), a total mean best error of 4.9% is computed.

It is important to mention that these results consider the same size of the population through all generations of the GA, which was set at 𝑁𝐺𝐴 = 500 individuals or solutions. Hence, particularly for instances with more than 500 supply/demand nodes, achieving solutions with significant reductions in TSP distances with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator supports its feasibility to improve the convergence of the GA. As presented in Figure 5, if the GA is adapted to run only for 1000 generations (fixed stop condition), faster convergence to minimum distance values is achieved if the 𝑅𝑒𝑑𝐸𝑥𝑝 operator is used for the initial population.

An extended assessment of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator on the largest TSP instances (those with more than 250 nodes) was performed with 10 executions (trials) of the GA (as performed in [2]) and a fixed stop condition (run for 500 generations). This was done to assess the statistical significance of the results obtained with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator. Table 2 presents the average, best, and worst errors obtained for each of the considered instances.

The results presented in Table 2 corroborate those presented in Table 1 and Figure 5. The worst average and best error rates are observed if the initial population of the GA is generated with the 𝑅𝑝 operator. These are significantly improved if the initial population incorporates better solutions obtained by the 𝑅𝑒𝑑𝐸𝑥𝑝 or 𝑆ℎ operators. When comparing the error rates of 𝑆ℎ with those of 𝑅𝑝_𝑅𝑒𝑑𝐸𝑥𝑝, 𝑆ℎ_𝑅𝑒𝑑𝐸𝑥𝑝, and 𝑅𝑒𝑑𝐸𝑥𝑝, it is observed that the minimum error rates are obtained with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator. To quantitatively assess this difference, a statistical significance test was performed on the errors reported in Table 2.

For this purpose, a paired t-test was performed with the following null hypothesis:

𝐻0 : 𝜇𝐴 − 𝜇𝐵 < 0, (7)

where 𝐴 and 𝐵 are the two scenarios to be compared, and the hypothesis is focused on rejecting or validating that the mean error of 𝐴 is smaller than the mean error of 𝐵. In contrast, the alternative hypothesis is defined as

𝐻1 : 𝜇𝐴 − 𝜇𝐵 ≥ 0. (8)
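For one pair of scenarios, the test can be sketched as follows (an illustrative Octave fragment, not the authors' code; it assumes that errA and errB hold the 20 per-instance average errors of Table 2 for scenarios 𝐴 and 𝐵, and that tcdf from the statistics package is available):

  d = errA - errB;                    % paired differences over the 20 instances
  n = numel(d);
  t = mean(d) / (std(d) / sqrt(n));   % paired t statistic
  p = tcdf(t, n - 1);                 % one-sided tail probability P(T <= t)
  % A small p (below the 0.10 level used here) supports that the mean error of A is
  % smaller than that of B, which corresponds to an "Accept" entry in Table 3.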
achieved very small errors (i.e., for berlin52 and kroA100),
Table 3 presents the results of the significance test with a 𝑝-value of 0.10 for all scenarios. As presented, the mean errors obtained with 𝑆ℎ and 𝑅𝑝_𝑆ℎ are statistically smaller than the mean error obtained with 𝑅𝑝. In contrast, the mean errors obtained with 𝑅𝑒𝑑𝐸𝑥𝑝 and 𝑆ℎ_𝑅𝑒𝑑𝐸𝑥𝑝 are statistically smaller than those obtained with 𝑅𝑝, 𝑆ℎ, 𝑅𝑝_𝑆ℎ, and 𝑅𝑝_𝑅𝑒𝑑𝐸𝑥𝑝. For 𝑅𝑝_𝑅𝑒𝑑𝐸𝑥𝑝, the mean error is only statistically smaller than the mean errors of 𝑅𝑝, 𝑆ℎ, and 𝑅𝑝_𝑆ℎ. Hence, this information provides evidence that the convergence of a GA can be improved if the initial population is generated with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator alone or in conjunction with 𝑆ℎ and 𝑅𝑝.

3.3. Comparison with KMC. The "improved GA" developed in [2], which was used to evaluate the KMC strategy, considered only three mutation operators (flip, swap, and slide), and no crossover was performed. In this case, strictly speaking, the GA presented in [2] does not include all the elements of a GA. In contrast, our GA, which is presented in Figure 4, more closely resembles the "simple GA" reviewed in [2], as it considers crossover and mutation operators.

Other differences are the following:

(a) Population size and stop conditions: in [2], according to the description of the "improved GA" and the examples that were discussed, the population size was set to 3000 and the number of iterations was set to 20000. In our GA the population is smaller (500 individuals) and the number of iterations is not fixed.

(b) Construction of the initial population: in [2], once the complete set of nodes is clustered into 𝐾 groups, the GA is used to obtain the local optimal path of each group and a global optimal path of the 𝐾 groups. Then, according to the global optimal path, one edge of each local optimal path is disconnected to rewire the front and back groups. This process is repeated in order to generate the initial population. In our GA, as described in Section 2, the local optimal path of clustered and nonclustered nodes is obtained by the nearest neighbor heuristic described in Section 2.3. Then, declustering is performed by the decoding algorithm described in Section 2.4. Thus, our GA is only executed after the initial population is built, which, after the decoding stage, considers the complete set of nodes (the GA is not executed with an initial population consisting of clustered nodes).

Due to these differences, and others associated with the hardware resources used for implementation, a strictly fair comparison is difficult to perform. Nevertheless, a close comparison was performed by restricting our GA to be executed up to the average execution time of the GA presented in [2], which is very competitive. Table 4 presents the results on the TSP instances considered by [2].

As presented in Table 4, the GA with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator, when executed during the same average time as the KMC approach, can achieve a smaller mean best error (4.6078% ≤ 6.9763%). Although for small instances the KMC approach achieved very small errors (i.e., for berlin52 and kroA100), for larger instances with more than 150 nodes the GA with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator can achieve smaller errors than those obtained by the KMC approach. These results must be considered with caution due to the differences previously discussed.
Table 1: Results on main set of 41 TSP instances: errors on assessment scenarios.
Columns: TSP Library | Instance | Size | Optimal Value | TSP distance values of the solutions obtained with the GA for Rp, Sh, RedExp, Rp_Sh, Rp_RedExp, and Sh_RedExp | error values E = [(GA − optimal)/optimal] × 100% for the same six scenarios | Minimum Error.
TSPLIB95 a280 280 2579.0 3063.3 2736.0 2709.9 2769.9 2910.3 2711.6 18.8 6.1 5.1 7.4 12.8 5.1 5.1
berlin52 52 7542.0 8105.2 7850.8 7841.4 8000.8 7959.6 7850.8 7.5 4.1 4.0 6.1 5.5 4.1 4.0
bier127 127 118282.0 126507.8 123003.7 121705.6 123034.1 122001.9 121312.1 7.0 4.0 2.9 4.0 3.1 2.6 2.6
ch150 150 6528.0 7268.4 6842.4 6708.3 6746.0 6786.2 6783.0 11.3 4.8 2.8 3.3 4.0 3.9 2.8
d198 198 15780.0 16996.7 17019.6 16086.1 17111.6 16227.2 16990.3 7.7 7.9 1.9 8.4 2.8 7.7 1.9
d493 493 35002.0 38900.8 37190.3 36609.2 36807.4 36581.6 36749.3 11.1 6.3 4.6 5.2 4.5 5.0 4.5
d657 657 48912.0 55426.9 52830.9 51372.5 52469.4 52094.2 52395.7 13.3 8.0 5.0 7.3 6.5 7.1 5.0
dsj1000 1000 18659688.0 20991643.2 20760980.1 20454938.6 20923295.9 20545095.0 20156541.3 12.5 11.3 9.6 12.1 10.1 8.0 8.0
eil51 51 426.0 454.4 437.4 435.4 438.3 437.4 439.7 6.7 2.7 2.2 2.9 2.7 3.2 2.2
eil76 76 538.0 582.7 591.2 556.7 571.6 580.2 564.7 8.3 9.9 3.5 6.2 7.8 5.0 3.5
fl417 417 11861.0 12764.2 13015.7 12874.6 12796.3 12809.8 12651.8 7.6 9.7 8.5 7.9 8.0 6.7 6.7
gil262 262 2378.0 2616.3 2589.0 2539.3 2527.5 2493.4 2503.0 10.0 8.9 6.8 6.3 4.9 5.3 4.9
kroB100 100 22141.0 24484.7 22437.5 22896.9 22496.7 22971.3 22399.5 10.6 1.3 3.4 1.6 3.8 1.2 1.2
kroC100 100 20749.0 22221.5 22144.8 22093.7 21913.6 22531.1 21883.5 7.1 6.7 6.5 5.6 8.6 5.5 5.5
kroD100 100 21294.0 23996.3 23030.8 22298.0 22719.1 22255.6 21903.7 12.7 8.2 4.7 6.7 4.5 2.9 2.9
kroA150 150 26524.0 28772.8 29162.2 27208.4 28820.5 29100.3 27748.1 8.5 9.9 2.6 8.7 9.7 4.6 2.6
kroB150 150 26130.0 29195.1 27739.2 27178.0 27593.8 27412.4 27464.8 11.7 6.2 4.0 5.6 4.9 5.1 4.0
kroA200 200 29368.0 32293.2 29968.2 29790.0 29920.6 30537.1 30035.5 10.0 2.0 1.4 1.9 4.0 2.3 1.4
kroB200 200 29437.0 32004.6 31324.0 31348.9 32099.6 31124.8 30900.6 8.7 6.4 6.5 9.0 5.7 5.0 5.0
lin105 105 14379.0 15108.6 14930.6 14633.7 15029.9 14643.5 14578.2 5.1 3.8 1.8 4.5 1.8 1.4 1.4
lin318 318 42029.0 47687.1 44509.1 43973.7 46097.5 45226.4 44504.0 13.5 5.9 4.6 9.7 7.6 5.9 4.6
nrw1379 1379 56638.0 64651.8 61695.0 61192.2 61373.3 61211.2 61340.3 14.1 8.9 8.0 8.4 8.1 8.3 8.0
pcb1173 1173 56892.0 67114.3 62582.5 62004.2 62532.9 61903.0 61824.0 18.0 10.0 9.0 9.9 8.8 8.7 8.7
pr107 107 44303.0 49850.8 45179.0 45123.2 45512.6 45318.3 45033.2 12.5 2.0 1.9 2.7 2.3 1.6 1.6
pr124 124 59030.0 61408.4 61554.7 60595.2 61329.9 61281.3 60300.7 4.0 4.3 2.7 3.9 3.8 2.2 2.2
pr136 136 96772.0 103518.4 105707.1 104396.2 103833.5 103417.9 104568.2 7.0 9.2 7.9 7.3 6.9 8.1 6.9
pr264 264 49135.0 54992.3 53304.4 52560.5 52336.5 51673.2 53139.5 11.9 8.5 7.0 6.5 5.2 8.1 5.2
pr439 439 107217.0 121892.8 114895.1 113016.7 116812.5 116505.1 115850.4 13.7 7.2 5.4 8.9 8.7 8.1 5.4
pr1002 1002 259045.0 296309.0 282091.6 280868.3 280048.9 278721.5 276725.7 14.4 8.9 8.4 8.1 7.6 6.8 6.8
rat195 195 2323.0 2638.2 2424.7 2405.9 2448.1 2409.4 2393.4 13.6 4.4 3.6 5.4 3.7 3.0 3.0
rat575 575 6773.0 7606.9 7315.9 7178.1 7255.8 7264.9 7316.1 12.3 8.0 6.0 7.1 7.3 8.0 6.0
rat783 783 8806.0 10285.2 9644.8 9510.9 9547.1 9507.9 9457.4 16.8 9.5 8.0 8.4 8.0 7.4 7.4
st70 70 675.0 739.3 688.1 682.6 688.1 678.1 678.4 9.5 1.9 1.1 1.9 0.5 0.5 0.5
u724 724 41910.0 47136.2 44735.9 44448.6 44490.1 44644.2 44892.4 12.5 6.7 6.1 6.2 6.5 7.1 6.1
u1432 1432 152970.0 175842.4 166679.9 167534.6 165992.8 164462.7 166085.7 15.0 9.0 9.5 8.5 7.5 8.6 7.5
VLSITSP dca1389 1389 5085.0 6125.2 5591.0 5670.7 5557.8 5608.9 5521.5 20.5 10.0 11.5 9.3 10.3 8.6 8.6
VLSITSP dka1376 1376 4666.0 5467.6 5138.2 5121.5 5146.0 5124.6 5143.6 17.2 10.1 9.8 10.3 9.8 10.2 9.8
VLSITSP pbm436 436 1443.0 1694.7 1536.2 1539.1 1547.3 1527.9 1561.0 17.4 6.5 6.7 7.2 5.9 8.2 5.9
NTSP LU980 980 11340.0 12843.2 12180.7 12245.9 12281.0 12042.7 12146.6 13.3 7.4 8.0 8.3 6.2 7.1 6.2
NTSP UY734 734 79114.0 91652.3 85269.0 84715.4 85487.5 86227.1 85230.5 15.8 7.8 7.1 8.1 9.0 7.7 7.1
NTSP ZI929 929 95345.0 107284.6 103768.2 102227.8 103170.9 102125.1 101745.0 12.5 8.8 7.2 8.2 7.1 6.7 6.7
Average Errors = 11.7 6.9 5.5 6.7 6.3 5.7 4.9
Table 2: Results on the 20 largest TSP instances: average, best, and worst results on 10 runs of the GA with 500 generations.
Columns: Instance | Average, Best, and Worst error (%) for each scenario: Rp, Sh, RedExp, Rp_Sh, Rp_RedExp, and Sh_RedExp.
a280 219.82% 210.47% 237.19% 8.56% 7.07% 11.29% 7.23% 3.83% 9.33% 8.22% 6.04% 10.29% 8.85% 4.95% 11.97% 8.88% 6.90% 10.19%
d493 353.42% 338.64% 362.27% 10.09% 9.17% 10.95% 9.96% 8.71% 11.07% 9.27% 8.52% 11.01% 10.22% 9.44% 10.92% 9.31% 8.19% 10.67%
d657 648.07% 637.43% 659.50% 16.58% 15.87% 17.39% 12.80% 11.77% 13.45% 17.25% 15.32% 18.91% 13.42% 11.84% 14.31% 13.06% 11.61% 14.13%
dsj1000 1241.81% 1213.86% 1269.45% 23.97% 22.73% 24.52% 16.14% 14.31% 17.27% 22.59% 19.79% 23.69% 15.87% 13.96% 16.93% 16.66% 15.32% 17.59%
fl417 606.45% 567.84% 655.64% 10.46% 7.00% 11.69% 11.36% 6.93% 13.97% 10.87% 6.69% 14.17% 9.65% 8.16% 13.17% 8.61% 3.99% 12.20%
lin318 264.31% 256.94% 286.52% 11.41% 9.71% 13.01% 6.42% 5.36% 7.59% 11.80% 10.52% 14.28% 7.53% 5.00% 9.57% 7.72% 7.02% 8.86%
nrw1379 1261.95% 1248.51% 1277.91% 19.51% 18.28% 20.42% 18.38% 17.79% 19.19% 20.19% 19.11% 20.72% 18.74% 18.02% 19.50% 19.01% 18.11% 19.89%
pcb1173 1195.06% 1167.86% 1211.13% 19.44% 18.67% 20.39% 18.09% 17.14% 18.78% 19.53% 18.42% 20.04% 18.84% 17.22% 20.10% 18.78% 17.23% 21.06%
pr439 445.22% 427.80% 451.11% 13.01% 11.05% 15.80% 8.91% 7.08% 10.30% 11.96% 9.74% 14.23% 10.44% 8.35% 12.75% 10.02% 8.81% 11.38%
pr1002 1102.60% 1095.34% 1106.83% 15.32% 14.51% 16.16% 14.87% 14.22% 15.66% 15.62% 15.02% 16.79% 15.31% 15.01% 15.88% 14.99% 13.83% 15.86%
rat575 523.40% 509.95% 531.09% 14.74% 13.46% 16.12% 14.28% 13.47% 15.04% 15.25% 13.40% 16.40% 14.58% 13.20% 15.94% 14.26% 11.91% 15.34%
rat783 781.88% 770.09% 792.67% 16.47% 15.50% 17.56% 15.62% 14.98% 16.22% 16.74% 16.12% 17.48% 15.82% 14.35% 17.45% 16.19% 15.67% 17.02%
u724 764.02% 752.12% 772.07% 16.71% 15.00% 18.26% 14.85% 13.25% 16.05% 17.08% 15.28% 18.37% 15.19% 13.73% 16.86% 14.77% 13.27% 15.65%
u1432 1323.86% 1297.30% 1338.61% 20.07% 19.53% 21.11% 20.05% 18.92% 20.96% 20.17% 19.22% 21.59% 19.96% 19.16% 20.73% 19.79% 18.56% 21.14%
uy734 803.96% 774.71% 822.22% 16.91% 14.73% 17.99% 15.61% 14.92% 16.12% 16.40% 13.64% 17.69% 16.27% 14.97% 17.41% 15.51% 14.73% 16.24%
pbm436 447.86% 444.54% 450.02% 11.22% 9.77% 12.46% 10.66% 8.03% 12.20% 10.82% 9.42% 13.47% 10.86% 8.87% 12.48% 10.46% 8.65% 11.48%
dka1376 1569.52% 1546.66% 1579.51% 18.32% 17.41% 18.96% 18.77% 17.28% 20.14% 19.67% 18.30% 20.38% 18.83% 18.05% 19.59% 18.56% 17.56% 19.36%
zi929 945.79% 919.41% 971.55% 16.96% 15.92% 17.58% 13.76% 13.15% 14.64% 16.63% 14.98% 18.43% 13.70% 12.85% 14.65% 13.53% 12.13% 15.12%
lu980 1228.25% 1218.35% 1249.95% 18.60% 17.54% 19.69% 17.91% 14.52% 19.70% 18.02% 15.59% 19.58% 18.48% 16.96% 19.70% 18.75% 17.40% 20.87%
dca1389 1671.61% 1660.84% 1684.23% 17.87% 17.26% 18.32% 17.66% 16.77% 18.62% 18.00% 16.73% 18.93% 17.84% 16.78% 18.93% 17.86% 16.77% 19.61%
Average Error= 869.94% 852.93% 885.47% 15.81% 14.51% 16.98% 14.17% 12.62% 15.32% 15.80% 14.09% 17.32% 14.52% 13.04% 15.94% 14.34% 12.88% 15.68%

[Figure 5 plots the average TSP distance (y-axis) against the number of generations over 1000 generations (x-axis) for the six initial-population scenarios: Av. Rp, Av. Sh, Av. RedExp, Av. Rp_Sh, Av. Rp_RedExp, and Av. Sh_RedExp.]
Figure 5: Results on main set of 41 TSP instances: average convergence of the GA.

Table 3: Results on the 20 largest TSP instances: statistical significance test.

Rows correspond to scenario A and columns to scenario B; "Accept" indicates that H0 (the mean error of A is smaller than that of B) is accepted.

A \ B       | Rp     | Sh     | RedExp | Rp_Sh  | Rp_RedExp | Sh_RedExp
Rp          | -      | Reject | Reject | Reject | Reject    | Reject
Sh          | Accept | -      | Reject | Reject | Reject    | Reject
RedExp      | Accept | Accept | -      | Accept | Accept    | Reject
Rp_Sh       | Accept | Reject | Reject | -      | Reject    | Reject
Rp_RedExp   | Accept | Accept | Reject | Accept | -         | Reject
Sh_RedExp   | Accept | Accept | Reject | Accept | Accept    | -

Table 4: Results on set of 14 TSP instances: comparison with KMC [2] (same average execution time).

Columns: Instance | Size | Optimal Value | RedExp: distance, error (%) | Rp_RedExp: distance, error (%) | Sh_RedExp: distance, error (%) | Best Error (%) | Average Time (s) | KMC Best Error (%).
berlin52 52 7542 7903.8 4.7967 7985.8 5.8841 7841.4 3.9696 3.9696 16.3208 0.0000
kroA100 100 21282 22445.1 5.4649 22445.1 5.4649 22381.9 5.1683 5.1683 20.3318 0.1035
pr144 144 58537 61399.2 4.8896 61399.2 4.8896 61243.8 4.6241 4.6241 26.5335 2.0176
ch150 150 6528 6707.4 2.7477 6784.3 3.9264 6798.6 4.1450 2.7477 25.4257 5.1038
kroB150 150 26130 27439.0 5.0095 26787.4 2.5158 27084.9 3.6543 2.5158 25.5628 3.3544
pr152 152 73682 77133.3 4.6841 77082.1 4.6145 75852.8 2.9461 2.9461 26.4204 2.5998
rat195 195 2323 2419.8 4.1675 2411.1 3.7936 2411.8 3.8231 3.7936 30.0717 8.7133
d198 198 15780 16253.2 2.9985 16270.4 3.1075 16504.8 4.5930 2.9985 32.5780 2.8149
kroA200 200 29368 30026.6 2.2427 30290.5 3.1413 30303.1 3.1840 2.2427 31.8691 5.3725
ts225 225 126643 138526.9 9.3837 139043.0 9.7913 130054.6 2.6938 2.6938 34.1776 10.3008
pr226 226 80369 82685.6 2.8825 82113.5 2.1706 83026.1 3.3061 2.1706 36.3008 4.9741
pr299 299 48191 51740.7 7.3658 53164.5 10.3205 51731.9 7.3476 7.3476 44.6964 14.6584
lin318 318 42029 46135.6 9.7710 46507.3 10.6552 45821.8 9.0243 9.0243 47.2462 15.0669
pcb442 442 50778 57006.4 12.2660 57917.6 14.0604 57032.5 12.3173 12.2660 62.8187 22.5887
Average (mean size = 204): RedExp error 5.6193, Rp_RedExp error 6.0240, Sh_RedExp error 5.0569, Best Error 4.6078, Average Time 32.8824 s, KMC Best Error 6.9763.

Table 5: Results on set of 2 TSP instances: comparison with HNN [10] (same iterations).

Columns: Instance | Size | Optimal Value | RedExp: distance, error (%) | Rp_RedExp: distance, error (%) | Sh_RedExp: distance, error (%) | Best Error (%) | HNN Best distance | HNN Best Error (%).
eil51 51 426 437.4 2.6781 435.5 2.2318 439.7 3.2095 2.2318 429 0.7042
eil76 76 538 579.8 7.7785 571.9 6.2973 578.6 7.5446 6.2973 549 2.0446

3.4. Comparison with HNN. In [10] a Hopfield Neural Network (HNN) was considered for the creation of the initial population of a GA. That GA had the same standard structure considered by our GA, although with different reproduction operators, as it considered heuristic crossover and mutation operators. Also, it considered a small population of 50 individuals and 100 iterations for the GA. Testing in [10] was performed with only two instances (eil51 and eil76). Table 5 presents the results reported by [10] and those obtained by the proposed GA with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator. For consistency purposes, our GA was executed during 100 iterations.

As presented in Table 5, the HNN approach achieved a smaller error than our GA with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator. This is consistent with the significant differences observed for the instances berlin52, kroA100, and pr144 in Table 4. In this case, it is important to observe that the best performance of the GA with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator is observed for large instances (i.e., those with more than 150 nodes) and not for small instances.

While the HNN approach presents a very small error when compared to the GA with the 𝑅𝑒𝑑𝐸𝑥𝑝 operator, the use of the HNN may be restricted by the size of the instance. As stated in [10], the Hopfield scheme requires 𝑁² neurons for an 𝑁-node problem (for a280, for example, this already amounts to 280² = 78,400 neurons). This will be further discussed in the following section.

4. Discussion and Future Work

In this work, a reduction-expansion operator, termed 𝑅𝑒𝑑𝐸𝑥𝑝, was developed to improve the performance of Genetic Algorithms (GA) for the TSP. The application of this operator was focused on improving the initial population of the GA, as performed by other works such as [2, 10].

While the operator is based on clustering, as in [2], only pairs of the closest nodes were considered for clustering, and the number of clusters was dynamically defined by an acceptance threshold which considers the distance variation between all nodes in the network. Experiments performed with a set of 41 well-known symmetric TSP instances corroborated the suitability of the operator to improve the convergence and the quality of the final solutions obtained by a GA, with a mean best error of 4.9%.

An extended assessment was performed with the 20 largest instances of this set, and it was observed that, within 500 generations of the GA, the 𝑅𝑒𝑑𝐸𝑥𝑝 operator can improve the performance when compared to the 𝑅𝑝 and 𝑆ℎ operators.

When compared with other strategies focused on the initial population of the GA, it was observed that the proposed approach presents significant errors when tested on small instances with fewer than 150 nodes. However, this may be caused by the clustering process itself. As discussed in Section 2, the distribution patterns of the nodes may affect the performance of the clustering and declustering processes by increasing variability in the initial population. In this work, additional evidence about the role of the number of nodes was also found. Particularly, for small instances, the distribution patterns are more representative of their key features. Hence, clustering can more severely affect the integrity of the key features, even if the clustering is small. This can provide important insights regarding other logistic problems and solving methods based on clustering.

Another aspect that must be studied is the effect of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator (and, in general, of the clustering and nearest neighbor approaches) on the genetic diversity of the initial population. This is because, if the initial population is initialized with very good solutions (obtained by a nearest neighbor heuristic or by a deterministic method such as the Clarke and Wright (C&W) algorithm), many solutions are likely to share the same subsequences of genes. This may affect the diversification performance of the reproduction operators, leading to convergence to local optima.

Thus, future work is focused on addressing the limitations of the 𝑅𝑒𝑑𝐸𝑥𝑝 operator. The following are considered as research topics:

(a) Adaptation of the Schema Theorem to determine the subsequences of genes which are common to all solutions within the initial solutions of the considered scenarios: in this way, the effect of these improvement strategies on the genetic diversity of the initial populations could be preliminarily assessed.

(b) Development of more efficient metrics for the acceptance threshold 𝑑𝑐, because it has a direct effect on the clustering stage: this is focused on finding better solutions for large instances and optimal solutions for smaller instances.

(c) Integration of the HNN approach within the clustering process to analyze the performance on large instances.

(d) Extension of the operator to other routing problems such as the Capacitated Vehicle Routing Problem (CVRP).

(e) Development of a metric to assess the loss of features when applying clustering.

Data Availability

The databases which were used are publicly available on the Internet. Reference URLs are provided in the manuscript.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

[1] M. Gendreau, G. Ghiani, and E. Guerriero, "Time-dependent routing problems: A review," Computers & Operations Research, vol. 64, pp. 189–197, 2015.
[2] Y. Deng, Y. Liu, and D. Zhou, "An improved genetic algorithm with initial population strategy for symmetric TSP," Mathematical Problems in Engineering, vol. 2015, Article ID 212794, 6 pages, 2015.
[3] A. Hussain, Y. Shad-Muhammad, M. Nauman-Sajid, and I. Hussain, "Genetic Algorithm for Traveling Salesman Problem with Modified Cycle Crossover Operator," Computational Intelligence and Neuroscience, vol. 2017, Article ID 7430125, 7 pages, 2017.
[4] V. Zharfi and A. Mirzazadeh, "A novel metaheuristic for travelling salesman problem," Journal of Industrial Engineering, vol. 2013, Article ID 347825, 5 pages, 2013.
[5] M. Anantathanavit and M. Munlin, "Using K-means radius particle swarm optimization for the travelling salesman problem," IETE Technical Review, vol. 33, no. 2, pp. 172–180, 2016.
[6] A. Mohsen, "Annealing Ant Colony Optimization with Mutation Operator for Solving TSP," Computational Intelligence and Neuroscience, vol. 2016, Article ID 8932896, 13 pages, 2016.
[7] P. Larrañaga, C. M. H. Kuijpers, R. H. Murga, I. Inza, and S. Dizdarevic, "Genetic algorithms for the travelling salesman problem: a review of representations and operators," Artificial Intelligence Review, vol. 13, no. 2, pp. 129–170, 1999.
[8] V. Togan and A. T. Daloglu, "An improved genetic algorithm with initial population strategy and self-adaptive member grouping," Computers & Structures, vol. 86, pp. 1204–1218, 2008.
[9] G. Reinelt, TSPLIB 95, Universität Heidelberg, Institut für Informatik, Heidelberg, Germany, 2016, http://comopt.ifi.uni-heidelberg.de/software/TSPLIB95/.
[10] G. Vahdati, S. Y. Ghouchani, and M. Yaghoobi, "A hybrid search algorithm with Hopfield neural network and genetic algorithm for solving traveling salesman problem," in Proceedings of the 2nd International Conference on Computer and Automation Engineering (ICCAE 2010), pp. 435–439, Singapore, February 2010.
[11] P. Chen, "An improved genetic algorithm for solving the Traveling Salesman Problem," in Proceedings of the 2013 Ninth International Conference on Natural Computation (ICNC), pp. 397–401, 2013.
[12] S. S. Ray, S. Bandyopadhyay, and S. K. Pal, "Genetic operators for combinatorial optimization in TSP and microarray gene ordering," Applied Intelligence, vol. 26, no. 3, pp. 183–195, 2007.
[13] O. Abdoun, J. Abouchabaka, and C. Tajani, "Analyzing the performance of mutation operators to solve the travelling salesman problem," International Journal of Emerging Sciences, vol. 2, no. 1, pp. 61–77, 2012.
[14] G. Ücoluk, "Genetic algorithm solution of the TSP avoiding special crossover and mutation," Intelligent Automation & Soft Computing, vol. 8, no. 3, pp. 1–9, 2013.
[15] K. Puljić and R. Manger, "Comparison of eight evolutionary crossover operators for the vehicle routing problem," Mathematical Communications, vol. 18, no. 2, pp. 359–375, 2013.
[16] J. Eaton, "GNU Octave 4.2.1," 2017, https://www.gnu.org/software/octave/.
[17] W. Cook and A. Rohe, National Travelling Salesman Problems and VLSI Data Sets, University of Waterloo (Canada) and Universität Bonn (Germany), 2016, http://www.math.uwaterloo.ca/tsp/world/countries.html.