Towards a Better Understanding of Genetic Operators for Ordering
Optimization: Application to the Capacitated Vehicle Routing Problem
S. Ben Hamida¹, R. Gorsane² and K. Gorsan Mestiri²
¹ LAMSADE CNRS UMR 7243, Paris Dauphine University, PSL Research University, France
² InstaDeep, Tunisia
Keywords:
Genetic Algorithms, Ordering Optimization, CVRP, Hybridization, Exploitation/Exploration.
Abstract:
Genetic Algorithms (GA) have long been used for ordering optimization problems, with considerable effort
devoted to improving their exploration and exploitation abilities. A great number of GA implementations have
been proposed, varying from GAs applying simple or advanced variation operators to hybrid GAs combined
with different heuristics. In this work, we propose a short review of genetic operators for ordering optimization
with a classification according to the information used in the reproduction step. Crossover operators can be
position (”blind”) operators or heuristic operators. Mutation operators can be applied randomly or using
local optimization. After studying the contribution of each class to solving two benchmark instances of the
Capacitated Vehicle Routing Problem (CVRP), we explain how to combine the variation operators to allow
simultaneously a better exploration of the search space and a higher exploitation. We then propose the random
and the balanced hybridization of the operator classes. The hybridization strategies are applied to solve
24 CVRP benchmark instances. Results are analyzed and compared to demonstrate the role of each class of
operators in the evolution process.
1 INTRODUCTION
Ordering optimization is a sub-field of combinatorial
optimization that consists of searching for an arrangement
solution of an ordering problem, such as ordering tasks
in a scheduling problem or ordering cities in the traveling
salesman problem. Solutions are often encoded with an
order-based scheme using the relative order of the symbols
in the genotype. A solution is then a sequence of n ordered
objects, and a genotype might be a string of numbers
representing positions in the sequence. Given n unique
objects, n! permutations of the objects exist, which makes
the problem NP-complete and hard to solve. Interest in
using GAs for ordering applications dates back to the 1990s.
Throughout these decades, much effort has been invested in
proposing powerful variation operators as well as hybrid
methods replacing operators with heuristic or meta-heuristic
techniques. The present study focuses on genetic operators
for ordering optimization.
We propose in this paper a short review of vari-
ation operators for ordering optimization with GA.
We then propose to classify these operators according
to the information from the parent chromosomes
involved in the reproduction step. Thus, for crossover
operators, we distinguish between position operators,
using only the position of the genes (symbols) in the
chromosomes, and advanced or heuristic operators
using additional information such as the neighbor-
hood and the performance of the parents. Similarly,
we classify mutation operators into two categories:
Position mutation applying a random variation, and
local mutation applying local optimization on the mu-
tated solution.
Each class of operators is applied to solve some
selected benchmark instances of the Capacitated Ve-
hicle Routing Problem (CVRP). The performance of
each class is analyzed according to the distance of the
best solution to the optimum and the population con-
vergence speed. We then study the combination of
the different operator classes according to the exploration/exploitation
ability of each class. In addition, to
handle the constraints of the CVRP application, we propose
a simple procedure based on a segregational rank
introduced into the selection operator. This procedure
randomly gives the advantage to feasible or infeasible
solutions in order to preserve diversity in the population.
The paper is organized as follows. Section 2 re-
views the different categories of variation operators
for order-based optimization and explains the operational
mode of the operators implemented for the
experimental study. Section 3 formulates the CVRP
problem, introduces the segregational selection for
handling constraints, and summarizes the general loop
of the implemented GA and its parameters. The different
classes of operators are studied in Section 4
through a first series of tests. Then, the hybridization
strategies and their corresponding experimental
study are detailed in Section 5.
2 REVIEW AND
CLASSIFICATION OF THE
VARIATION OPERATORS
Standard order-based variation operators for GAs are
essentially ”blind” operators acting regardless of the
fitness of the solutions to cross or mutate. For order-
ing optimization, ”blind” crossover operators are in-
spired from standard binary crossovers, such as the
uniform crossover (Syswerda, 1989) and the order-
based crossover (OX) (Davis, 1985; Oliver et al., 1987;
Deep and Adane, 2011) described below. We designate
this class of operators as ”Position Crossovers” since
they need only the sequences of symbols and their
positions to be applied. Similarly, several mutation
operators for ordering optimization change randomly
one or several symbols in different positions in the
chromosome. They are designated as ”Position Mutation”
operators. Four operators in this category are
described and illustrated in section 2.3.
To increase the GA exploitation ability, a great
number of advanced and heuristic crossover operators
have been proposed. An advanced crossover operator
considers an individual as the origin of its search and
tries to produce a better solution. Such operators are designated
in this paper as ”Heuristic Crossovers”. They use some
heuristic information from the parents to hopefully
produce better offspring. The heuristic information
could be the adjacency describing gene neighbors in
the parents, the objective function, the neighborhood
map, distance between genes, etc. Three crossovers
are selected from this class for the experimental study:
the Distance Preserving crossover (DPX) (Freisleben
and Merz, 1996), the Alternating Edges Crossover
(AEX) (Grefenstette et al., 1985) and the Edge Re-
combination crossover (ERX) (Whitley et al., 1989).
As with heuristic crossovers, advanced mutation operators
use a local search algorithm that starts from a
solution and ends up in a local minimum where no
further improvement is possible. It aims at intensify-
ing the search by exploiting search paths determined
by the neighborhood of the corresponding solution
(Neri et al., 2012). The classical local search λ-opt
is studied in this work with λ=2 or 3. For further
intensification of the search around the best area, a
stochastic Hill-climbing technique is implemented as
a local search operator.
The following subsection describes the variation
operators implemented for the experimental study.
The set of these operators and their corresponding
classes are summarized in table 1.
Table 1: Variation operators’ classes.
Operator Class Selected Operators
Position Crossover (PosCross): OX1, OX4, PMX
Heuristic Crossover (HeurCross): DPX, AEX, ERX
Position Mutation (PosMut): ISM, SWM, SHM, DM
Local Mutation (LocalMut): λ-Opt(λ=2/3), Hill-Climbing
2.1 Position or ”Blind” Crossovers
Position or ”blind” crossovers recombine the genetic
material of the parents into a new configuration without
considering their initial performance, with the purpose
of exploring new search directions. Thus, they
can be seen as more explorative than exploitative operators.
The oldest operator in this category is the
uniform order-based crossover (Syswerda, 1989) that
generates a random binary template to decide from
which parent the gene is selected at each position.
Several operators were proposed in the same category
whose main purpose is to produce feasible solutions
without considering their performance, such as
the order based crossover and the partially matched
crossover implemented for this work.
The Order-based Crossover (OX): OX, proposed
by Davis (Davis, 1985), is a variation of the uniform
crossover introduced with the purpose of preserving the
relative order of symbols in the sequences to be combined.
The first implementation of the OX operator
(OX1) generates two cut points for both parents. The
symbols between the two cut points are copied to the
children. Then, starting from the second cut point
of one parent, the symbols from the other parent are
copied to one offspring in the same order, omitting
those which already exist.
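
As an illustration, a minimal Python sketch of OX1 is given below; it assumes tours are plain lists of symbols and is not the authors' implementation:

import random

def ox1(parent1, parent2):
    """Order crossover (OX1): keep a slice from parent1, then fill the
    remaining positions with the missing symbols in the order they appear
    in parent2, starting after the second cut point."""
    size = len(parent1)
    a, b = sorted(random.sample(range(size), 2))
    child = [None] * size
    child[a:b + 1] = parent1[a:b + 1]            # copy the slice from parent1
    kept = set(child[a:b + 1])
    # symbols of parent2, read from the position after the second cut point
    order = [parent2[(b + 1 + i) % size] for i in range(size)]
    fill = [s for s in order if s not in kept]   # omit symbols already present
    # empty positions, also visited starting after the second cut point
    positions = [(b + 1 + i) % size for i in range(size)
                 if child[(b + 1 + i) % size] is None]
    for pos, sym in zip(positions, fill):
        child[pos] = sym
    return child

For example, ox1([1, 2, 3, 4, 5, 6], [2, 4, 6, 1, 3, 5]) returns a new permutation of the same six symbols.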
Deep and Mebrahtu (Deep and Adane, 2011) pro-
posed three other variations of OX. In the first variant,
the cut points in the two parents are at different posi-
tions, but the size of the substring between the cut
points is the same for both parents. The symbols are
copied to the offspring respecting the same rules de-
fined for the original OX. This variant might also be
applied with different substring sizes (called OX4).
Partially Matched (or Mapped) Crossover (PMX):
The PMX crossover is a well-known operator pro-
posed by Goldberg and Lingle (Goldberg and Lin-
gle, 1985). It has two tasks: cross two parents and
prevent the repetition of symbols in the offspring. It
operates in two main steps: swap and map. The
first step is quite similar to the OX. Two crossover
points are selected randomly. The sub-strings be-
tween these two points are exchanged to create two
incomplete offspring. In the second step, each off-
spring is completed using the remaining elements in
the corresponding parent. If the element is already
present, then a correction is performed according to
the mapping defined with the matching section.
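
A possible Python sketch of this swap-and-map logic, producing a single child and assuming permutations encoded as lists (illustrative only):

import random

def pmx(parent1, parent2):
    """Partially matched crossover (PMX): swap a matching section, then
    repair duplicates through the mapping defined by that section."""
    size = len(parent1)
    a, b = sorted(random.sample(range(size), 2))
    child = parent1[:]
    child[a:b + 1] = parent2[a:b + 1]            # swap step: copy the section from parent2
    mapping = {parent2[i]: parent1[i] for i in range(a, b + 1)}
    for i in list(range(0, a)) + list(range(b + 1, size)):
        sym = parent1[i]
        # map step: follow the mapping until the symbol no longer collides
        while sym in mapping:
            sym = mapping[sym]
        child[i] = sym
    return child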
Other Position Crossovers: Several other crossover
operators for ordering optimization could be classified
as position crossovers, such as the Maximal Preservative
crossover (MPX) (Mühlenbein et al., 1988), the voting
recombination crossover (Mühlenbein, 1989) and the
Cycle Crossover (Oliver et al., 1987). A description of
these operators can be found in (Pétrowski and Ben-Hamida, 2017).
In addition, variants of most of the position operators
were proposed, either for comparative purposes or to be
adjusted to the handled ordering problem. For example,
the Non-Wrapping Order Crossover (Cicirello, 2006) is a
variant of OX and the Extended PMX (Tao, 2008) is a
variant of PMX. A short review with a comparative study
can be found in (Karakatič and Podgorelec, 2015).
2.2 Heuristic Crossovers
Distance Preserving Crossover (DPX): DPX
(Freisleben and Merz, 1996) generates an offspring
close to its parents by preserving some common
sub-strings. It starts by detecting common sub-paths
of two parents using the Hamming distance. Then, it
uses a greedy method to reconnect them and produce
a child.
Alternating Edges Crossover (AEX): AEX (Grefen-
stette et al., 1985) considers a genotype as a directed
cycle of arcs, where an arc is defined by two succes-
sive symbols. To create a child from two parents, arcs
are chosen alternately from the two parents, with
some additional random choices in case of infeasibility.
AEX starts by choosing the arc (g1, g2) from
the first parent. Next, the arc of the second parent
whose first gene is g2 is selected, and its second gene
is added to the child. AEX proceeds in the same way
until the child is complete.
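
The following Python sketch is one way to realize AEX under the assumption that tours are lists of unique symbols; the random fallback for infeasible arcs follows the description above:

import random

def aex(parent1, parent2):
    """Alternating edges crossover (AEX): follow arcs alternately from the
    two parents; fall back to a random unused symbol when the arc would
    introduce a duplicate."""
    size = len(parent1)
    # successor of each symbol in each parent (tours seen as directed cycles)
    succ = [
        {p[i]: p[(i + 1) % size] for i in range(size)}
        for p in (parent1, parent2)
    ]
    child = [parent1[0]]
    used = {parent1[0]}
    turn = 0                                    # 0 -> arc from parent1, 1 -> arc from parent2
    while len(child) < size:
        nxt = succ[turn][child[-1]]
        if nxt in used:                         # infeasible arc: pick a random unused symbol
            nxt = random.choice([s for s in parent1 if s not in used])
        child.append(nxt)
        used.add(nxt)
        turn = 1 - turn
    return child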
Genetic Edge Recombination Crossover (ERX):
ERX (Whitley et al., 1989) considers a genotype as a
list of edges and is based on gene adjacency. The aim is
to keep as many neighbors as possible in the offspring.
ERX starts by building the ”edge-map” that gives for
each vertex the set of edges that start or finish on it.
To produce an offspring, ERX starts from a random
symbol designated as the current symbol xc. It finds
the element in the neighbor set of xc having the lowest
neighbor-set cardinality. This element is added to the
child, removed from the ”edge-map” and becomes the
new current element. The process continues until all
entries are visited.
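
A simplified Python sketch of the ERX procedure described above, under the same list encoding assumption; ties and dead ends are resolved randomly:

import random

def erx(parent1, parent2):
    """Edge recombination crossover (ERX): build an edge map of the
    neighbours of each symbol in both parents, then repeatedly move to the
    unvisited neighbour with the smallest remaining neighbour set."""
    size = len(parent1)
    edge_map = {s: set() for s in parent1}
    for p in (parent1, parent2):
        for i, s in enumerate(p):
            edge_map[s].update({p[(i - 1) % size], p[(i + 1) % size]})
    current = random.choice(parent1)
    child = [current]
    while len(child) < size:
        # remove the current symbol from every neighbour set
        for neighbours in edge_map.values():
            neighbours.discard(current)
        candidates = edge_map.pop(current)
        if candidates:
            # prefer the neighbour whose own neighbour set is the smallest
            current = min(candidates, key=lambda s: (len(edge_map[s]), random.random()))
        else:
            # dead end: restart from a random unvisited symbol
            current = random.choice(list(edge_map.keys()))
        child.append(current)
    return child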
Other Heuristic Crossovers: Grefenstette proposed
in (Grefenstette, 1987) a set of heuristic crossovers for
the TSP: HGreX, HRndX, HpPoX. Like AEX, HGreX
considers the genotype as a directed cycle of arcs. The
Inver-Over operator (Tao and Michalewicz, 1998) can
be classified in the same category as it considers a
solution as a directed graph. It inverts a substring
of a given parent in the hope of improving the fitness.
2.3 Position or ”Blind” Mutation
Random or ”blind” mutation is widely used for combinatorial
order-based optimization for explorative
purposes and to maintain the population diversity
(Pétrowski and Ben-Hamida, 2017). Four operators
are selected and implemented for this work: exchange
or swap mutation (SWM) (Oliver et al., 1987),
displacement mutation (DM) (Michalewicz, 1992),
shifting mutation (SHM) (Fogel, 1988) and insertion
mutation (ISM). SWM selects two random positions
along the parent string and exchanges the corresponding
symbols. SHM chooses randomly two positions
on the parent genotype; the symbol in the first
position is then shifted from its current position to the
second position. DM displaces a randomly selected
sub-sequence in the parent genotype.
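
Minimal Python sketches of the SWM, SHM and DM operators, assuming tours are lists (illustrative, not the authors' code):

import random

def swap_mutation(tour):
    """SWM: exchange the symbols at two random positions."""
    t = tour[:]
    i, j = random.sample(range(len(t)), 2)
    t[i], t[j] = t[j], t[i]
    return t

def shift_mutation(tour):
    """SHM: move the symbol at one random position to another position."""
    t = tour[:]
    i, j = random.sample(range(len(t)), 2)
    t.insert(j, t.pop(i))
    return t

def displacement_mutation(tour):
    """DM: remove a random sub-sequence and reinsert it at a random position."""
    t = tour[:]
    a, b = sorted(random.sample(range(len(t)), 2))
    segment = t[a:b + 1]
    del t[a:b + 1]
    pos = random.randint(0, len(t))
    return t[:pos] + segment + t[pos:]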
Note that several other mutation operators having
the same goal have been proposed for ordering op-
timization such as the scramble operator (Syswerda,
1991) that selects a random sub-sequence in the par-
ent genotype and scrambles its symbols, or the in-
version mutation operator (Fogel, 1988) that inverts a
randomly selected sub-string in the parent genotype.
2.4 Local Mutation
The idea of introducing a local search technique into the
GA engine for combinatorial problems has been supported
by several researchers for decades. Several
works in the literature have demonstrated that such hybridization
improves the exploitation of the best solutions
and might speed up convergence to the optimal or to a
near-optimal solution (Neri et al., 2012). We present
below two simple techniques to implement a local
search in the mutation step, the λ-Opt local search and
the stochastic Hill-climbing.
λ-Opt Mutation: λ-Opt is a simple local search al-
gorithm. The main idea behind it is to take a route
that crosses over itself and reorder it so that it does
not. It works as follows. λ edges are removed from
the tour of the corresponding solution. The remain-
ing segments are reconnected in all possible ways. If
any improvement is identified, then the corresponding
movements are applied to the solution.
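
A basic 2-opt sketch in Python, assuming a symmetric distance function distance(a, b) is available (the λ=3 variant removes three edges and is analogous):

def two_opt(tour, distance):
    """2-opt local search: repeatedly remove two edges and reconnect the
    tour by reversing the segment in between, as long as the total length
    improves. `distance(a, b)` is assumed to return the travel cost."""
    best = tour[:]
    improved = True
    while improved:
        improved = False
        n = len(best)
        for i in range(n - 1):
            for j in range(i + 2, n):
                a, b = best[i], best[i + 1]
                c, d = best[j], best[(j + 1) % n]
                # gain of replacing edges (a,b) and (c,d) by (a,c) and (b,d)
                delta = distance(a, c) + distance(b, d) - distance(a, b) - distance(c, d)
                if delta < -1e-9:
                    best[i + 1:j + 1] = reversed(best[i + 1:j + 1])
                    improved = True
    return best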
Stochastic Hill-Climbing: The simplest way to per-
form a local search in a combinatorial space is to per-
turb a trial solution S and to replace it by the new solu-
tion if it performs better. This procedure is known as
Hill-Climbing (HC). It tries to explore the neighbor-
hood of a given solution and to move it to the eventu-
ally best direction. HC can be applied using the ran-
dom genetic mutation operators to generate solutions
in the neighborhood of the trial solution. However, for
an ordering problem, the neighborhood is very large
and a full exploration is impossible. The stochastic
Hill-Climbing is widely applied in this case. It picks
a single random neighbor and accepts it if it is better
than S. The algorithm terminates upon a computa-
tional limit.
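
A minimal sketch of stochastic hill-climbing, where the neighbour function could be one of the random mutation operators of Section 2.3 and cost evaluates a solution; the names and the budget parameter are illustrative:

import random

def stochastic_hill_climbing(solution, cost, neighbour, max_trials=1000):
    """Stochastic hill-climbing: repeatedly draw one random neighbour
    (e.g. with a random genetic mutation operator) and accept it only if
    it is better; stop after a fixed computational budget."""
    best, best_cost = solution, cost(solution)
    for _ in range(max_trials):
        candidate = neighbour(best)             # e.g. swap_mutation(best)
        candidate_cost = cost(candidate)
        if candidate_cost < best_cost:
            best, best_cost = candidate, candidate_cost
    return best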
3 APPLICATION TO THE
CAPACITATED VEHICLE
ROUTING PROBLEM
Ordering optimization problems in the real world are
generally subject to a set of constraints. For this
reason, we chose for our study the Capacitated Vehi-
cle Routing Problem (CVRP) described in the follow-
ing subsection. The rest of the section describes the
GA implemented for the experimental study.
3.1 CVRP Formulation
The vehicle routing problem (VRP) is an important
problem class in the field of operations research and
transport logistics optimization. Its original formulation
was defined over 60 years ago by Dantzig and
Ramser (Dantzig and Ramser, 1959): a fleet of identical
vehicles with a given capacity serves, from a single
depot, a set of customers each having a certain demand.
This formulation is referred to as the Capacitated
Vehicle Routing Problem (CVRP).
The mathematical formulation of the CVRP can
be defined as follows: let $G = (N, E)$ be an undirected
graph, where $N = \{0, 1, \cdots, n\}$ is the set of nodes
and $E = \{(i, j) : i, j \in N, i \neq j\}$ is the set of edges
joining the nodes. Node 0 refers to the depot from
which the vehicles start and to which they come back. The
other nodes represent the customers, each customer $i$
having a known non-negative demand (request) $R_i$. A set
of $K$ vehicles $V = \{V_1, V_2, \cdots, V_K\}$ is provided, each
having a maximum capacity $Q$. The travel distance between
nodes $i$ and $j$ is defined by $d_{ij} > 0$.
The CVRP can be stated as follows:

$$\min \sum_{i=0}^{N} \sum_{j=0}^{N} \sum_{k=1}^{K} d_{ij}\, X^{k}_{i,j} \qquad (1)$$

where

$$X^{k}_{i,j} = \begin{cases} 1 & \text{if } V_k \text{ travels from node } i \text{ to node } j \\ 0 & \text{otherwise} \end{cases} \qquad (2)$$
Subject to:

$$\sum_{k=1}^{K} \sum_{i=0}^{N} X^{k}_{i,j} = 1, \quad \forall j \in \{1, \cdots, N\},\ i \neq j \qquad (3)$$

$$\sum_{k=1}^{K} \sum_{j=0}^{N} X^{k}_{i,j} = 1, \quad \forall i \in \{1, \cdots, N\},\ i \neq j \qquad (4)$$

$$\sum_{i=0}^{N} \sum_{j=0}^{N} X^{k}_{i,j}\, R_i \leq Q_k, \quad \forall k \in \{1, \cdots, K\} \qquad (5)$$

$$\sum_{j=1}^{N} X^{k}_{0,j} - \sum_{j=1}^{N} X^{k}_{j,0} = 0, \quad \forall k \in \{1, \cdots, K\} \qquad (6)$$

$$\sum_{k=1}^{K} X^{k}_{0,j} \leq K \qquad (7)$$
The objective function defined in (1) minimizes the
total distance travelled by all the vehicles. Constraint
sets (3) and (4) guarantee that each customer is visited
exactly once. Constraint set (5) ensures that the total
request of the customers served by a vehicle does
not exceed the vehicle capacity. Constraint set (6)
guarantees that all routes start and finish at the depot,
and constraint (7) requires that at most K routes
are used to serve the customers.
To encode a VRP solution, several representations
are possible: vector, matrix, graph, etc. (Pétrowski
and Ben-Hamida, 2017). The most common encoding
technique is the intuitive path encoding, in which each
vehicle route is represented by a sequence of numbers
corresponding to the request indexes in the request
vector R. In the present work, to encode a CVRP
solution S, two strings of positive numbers are used:
S = (P, L), where P denotes the global path of the solution,
defined by the concatenation of the different vehicle
routes without including the depot, and L is a
vector giving the route lengths of all the vehicles.
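
For illustration, the following Python sketch decodes such an S = (P, L) solution and evaluates its total distance and capacity overflow; the demand list, the distance matrix dist and the capacity value are assumed to be provided by the benchmark instance (customers are numbered 1..n, the depot is node 0):

def split_routes(P, L):
    """Decode a solution S = (P, L): cut the global path P into one route
    per vehicle according to the route lengths stored in L."""
    routes, start = [], 0
    for length in L:
        routes.append(P[start:start + length])
        start += length
    return routes

def evaluate(P, L, demand, dist, capacity):
    """Return (total distance, capacity violation) of a CVRP solution.
    `demand[i]` is the request of customer i, `dist[i][j]` the travel
    distance, node 0 is the depot and `capacity` is the vehicle capacity Q."""
    total, violation = 0.0, 0.0
    for route in split_routes(P, L):
        if not route:
            continue
        load = sum(demand[c] for c in route)
        violation += max(0.0, load - capacity)   # capacity overflow of this route
        legs = [0] + route + [0]                 # each route starts and ends at the depot
        total += sum(dist[a][b] for a, b in zip(legs, legs[1:]))
    return total, violation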
3.2 Handling Constraints
To handle constraints without implementing costly
tools or adding new parameters, we propose the segregational
selection for the parental selection. The main
idea of the segregational selection is to sort the population
(or a set of individuals for tournament selection)
either first according to fitness (total distance for
CVRP) and then according to violation rate (capacity overflow for
CVRP), or first according to violation rate and then
total distance. Segregational selection aims to balance
between the best feasible solutions and the infeasible
solutions having the lowest total distance.
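
One possible realization of this segregational tournament in Python, assuming each individual exposes fitness (total distance) and violation (capacity overflow) attributes; the 50/50 coin and the attribute names are assumptions, not the authors' exact implementation:

import random

def segregational_tournament(population, k=8):
    """Segregational tournament selection: draw k candidates and rank them
    either by (fitness, violation) or by (violation, fitness), chosen at
    random, so that feasible solutions and short infeasible solutions are
    both given a chance."""
    candidates = random.sample(population, k)
    if random.random() < 0.5:
        key = lambda ind: (ind.fitness, ind.violation)    # favour short tours first
    else:
        key = lambda ind: (ind.violation, ind.fitness)    # favour feasible tours first
    return min(candidates, key=key)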
3.3 The Algorithm
For the experimental study, the implemented genetic
loop follows the Steady-State GA (SSGA) scheme.
Indeed, with the SSGA, at each iteration,
only two parents are selected to generate a new offspring
that replaces the worst individual in the population.
In our case, the new offspring is generated with
a reproduction procedure using the different variation
operators described in Section 2.
Algorithm 1: SSGA.
1: Initialize population with µ feasible solutions
2: Evaluate initial population
3: while non STOP condition do
4:    Select two parents (P1, P2) using the segregational tournament selection
5:    C ← Reproduction(P1, P2)
6:    Set W the worst solution according to either fitness or violation rate
7:    Replace W with C in the parent population
8: end while
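
A compact sketch of this steady-state loop in Python; select, reproduce and badness are placeholders for the segregational tournament, the reproduction procedure of Section 5 and the ranking criterion (fitness or violation rate):

def ssga(population, select, reproduce, badness, max_iterations):
    """Steady-state GA loop (Algorithm 1): two parents are selected, one
    child is produced and it replaces the worst individual."""
    for _ in range(max_iterations):
        p1 = select(population)
        p2 = select(population)
        child = reproduce(p1, p2)
        # replace the individual with the highest badness (worst) by the child
        worst_index = max(range(len(population)), key=lambda i: badness(population[i]))
        population[worst_index] = child
    return population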
3.4 Benchmark Data and Parameters
Setting
SSGA with the variation operators is implemented in
Python. Its performance is then validated on selected
benchmark instances from the set of CVRPs proposed
by (Augerat et al., 1995), available on the ”VRP Web”
site (http://www.bernabe.dorronsoro.es/vrp/). The set A
contains 27 instances with a number of requests n varying
from 32 to 80 and a number of vehicles k varying from 5 to
10. The vehicle capacity is set to 100 for all the problems.
For comparison purposes, the optimal or best-known solution
(BKS) is given with the SSGA results. A set of preliminary
tests has been applied to tune the SSGA parameters; the
retained values are summarized in Table 2. As shown in the
table, some parameters may have different values from one
experimental study to another.
Table 2: Parameters used for SSGA.
Hyperparameter                    Value
Crossover probability             0.8
Mutation probability              0.4
Population size                   200
Tournament size                   8
Maximum nb of iterations          50000 (1st study), 300000 (2nd study)
Iterations without improvement    10000
Runs per instance                 20 (1st study), 30 (2nd study)
4 COMPARING OPERATORS
CLASSES
To study the SSGA behavior when applying one of the
operator classes introduced above, two benchmark
instances are selected: the first has 34 requests and 5
vehicles (n34-K5) and the second has 39 requests and
5 vehicles (n39-K5). The corresponding BKS are 778
and 822, respectively. The GA parameters applied for
this study are set as given in Table 2. A first series of
tests is run using a single operator class at a time. In
the second series of tests, each crossover class is ap-
plied with one operator selected randomly from the
two mutation classes. Four combinations are then
studied: (PosCross+PosMut), (PosCross+LocalMut),
(HeurCross+PosMut), (HeurCross+LocalMut).
To compare the different combinations, we extract
some summary statistics from the obtained
results. First, we compute the minimum and average
relative distance (loss) to the optimum:
$\%loss = 100 \times (Best - optimum)/optimum$. Results are
illustrated in Figure 1. Second, we record the relative
convergence speed according to the maximum number of
iterations defined in the parameters' settings. The aim
is to visualize whether the evolution stopped due to
stagnation or because the maximum number of iterations
was reached. A high value generally reflects rapid, often
premature, convergence. Results are illustrated in Figure 2.
In addition, to visualize the exploration/exploitation ability
of the operator classes along the evolution, the best fitness
of the best run of each test case is plotted as a curve in
Figure 3.
Figures 1 (a) and (b) show clearly that the performance
of each class of operators depends on the
problem to solve. For the n34-K5 case, mutation operators,
either from the PosMut or the LocalMut class, were able
to bring the population near the optimal solution.
[Figure 1: four bar charts. The recovered values (minimum loss / average loss) are:
(a) n34-K5, single class: PosCross 1.8% / 6.0%; HeurCross 4.2% / 7.9%; PosMut 0.9% / 5.5%; LocalMut 0.9% / 5.5%.
(b) n39-K5, single class: PosCross 9.2% / 14.2%; HeurCross 4.5% / 26.0%; PosMut 4.7% / 27.4%; LocalMut 6.9% / 27.3%.
(c) n34-K5, combined classes: PosC+PosM 0.9% / 5.2%; PosC+LocalM 1.2% / 4.6%; HeurC+PosM 0.0% / 3.4%; HeurC+LocalM 1.8% / 4.7%.
(d) n39-K5, combined classes: PosC+PosM 1.0% / 7.6%; PosC+LocalM 0.5% / 9.2%; HeurC+PosM 1.8% / 12.1%; HeurC+LocalM 0.5% / 9.2%.]
Figure 1: Minimum and average loss according to BKS obtained by each class or combination of classes of operators
for the CVRP instances n34-K5 and n39-K5.
This was not the case with n39-k5, which is more difficult
to solve than n34-k5. For the difficult cases, the use
of both crossover and mutation operators is needed
in the evolutionary search. Indeed, Figures 1(c) and (d)
show that the deviation to the optimum is greatly
improved with the combined classes. For example,
with the combination (PosCross+LocalMut), the deviation
of the best solution is only about 0.5% from the optimum
(Fig. 1(d)), while it is about 9.2% for PosCross
and 6.2% for LocalMut (Fig. 1(b)) when the classes are
applied separately. Similarly, combining operator classes
for the n34-K5 case allows a better exploration of
the search space, especially for the HeurCross class,
whose combination with the PosMut class allowed the GA
to easily find the optimum.
[Figure 2: two bar charts, ”Average convergence speed of the different operator classes” and ”Average convergence speed of different operator classes' combinations”, for the instances n34-K5 and n39-K5.]
Figure 2: Convergence speed according to the maximum number of iterations for each class or combination of classes
of operators for the CVRP instances n34-K5 and n39-K5.
In addition, it can be noted from Figures 1
and 2 that some classes of operators have a great ability
to perform local search around the best solutions.
This strong exploitation capacity, applied with a high
probability from the first steps of the evolution, brings
the population close to local optima. The SSGA diversification
process helps in some cases to explore other
regions, but the search ends up falling back into the trap of
premature convergence. This is the case for the HeurCross
class operators.
Other classes of operators have the opposite problem.
Their exploration capacity allows the population
to continuously explore new regions of the search
space and avoid local optima. This is the case for the
PosMut class. However, the convergence is too slow and
does not necessarily lead to the optimum region. This
is due to the lack of effective exploitation of the best
solutions. The question now is how to take advantage
of the strengths of each class and create a proper balance
between exploitation and exploration along the
evolution.
[Figure 3: best-fitness curves. Panels (a) n34-K5 and (b) n39-K5 show the evolution of the best fitness with one operator class (PosMut, LocalMut, HeurCross, PosCross); panel (d) n39-K5 shows the evolution of the best fitness with operator class combinations (PosC+LocalM, PosC+PosM, HeurC+PosM, HeurC+LocalM).]
Figure 3: Evolution of the best fitness of the best run in the
series of tests for solving the cases n34-k5 and n39-k5.
Combining Genetic Algorithms with local search
techniques in the mutation step is widely used and
has proved to be efficient in speeding up the evolution
process for ordering optimization. However, when
applied with heuristic crossover, this advantage can
become a difficulty. Indeed, if the two steps of reproduction
share the common objective of exploiting the
best solutions to produce more competitive ones, the
exploration of the search space is reduced and the
population is attracted to local optima. This is the famous
dilemma between exploitation and exploration
for genetic algorithms. We believe that it is important,
for ordering optimization, to control the simultaneous
use of different classes of operators. Thus, we propose
and test in the following section two strategies
allowing the simultaneous use of all classes of operators.
The second strategy additionally introduces a restriction
on the combination of classes to control the
population convergence.
5 COMBINING VARIATION
OPERATORS
5.1 Hybridization Strategies
As concluded in the first comparative study, the exploitation/exploration
abilities of the variation operators
differ from one class to another. Each operator
class can help the population evolve towards better
areas of the search space, but in a different way. We
then make a first hypothesis: ”Using variation operators
from different classes improves the GA efficiency
for ordering optimization”. However, a heuristic crossover
applied together with a local mutation can lead to premature
convergence. To prevent this phenomenon, we propose
to combine heuristic crossover with random mutation and,
similarly, position crossover with local mutation. The
objective is that random operators control the convergence
speed of heuristic/local operators and vice versa. Thus, we
make a second hypothesis: ”An operator class with
high exploitation ability could be controlled with an
operator class with a high exploration ability”.
To test the two hypotheses, we propose below two
hybridization strategies. The first one is the random
hybridization, where the crossover and mutation operators
are chosen randomly from the set of all the operator
classes. The second strategy is the balancing
hybridization, which is a conditional hybridization: if
a heuristic (respectively position) class is used for crossover,
the mutation operator is taken from the position (respectively
local) mutation class. Note that the operator
within the given class is chosen randomly.
5.1.1 Random Hybridization (RH)
Random hybridization is quite similar to the multi-operator
mechanism proposed in (Puljić and Manger,
2013), with the additional condition that the set of variation
operators must include some operators
from each class. To produce an offspring with the RH
strategy, a crossover operator is selected randomly
from the PosCross or HeurCross classes. The produced
offspring is then mutated using a random operator
from the PosMut or LocalMut classes (Algo. 2).
There is no additional restriction on the choice of operators,
either for crossover or for mutation.
Algorithm 2: RH Reproduction(P1, P2).
1: Select randomly opX in PosCross ∪ HeurCross
2: With a probability pc, apply opX to (P1, P2) and get the new child C
3: Select randomly opM in LocalMut ∪ PosMut
4: With a probability pm, apply opM to C
5.1.2 Balancing Hybridization (BH)
Balancing hybridization aims to balance the GA ex-
ploration and exploitation abilities by mixing heuris-
tic operators with position operators. Thus, if an
exploitative offspring is generated with a heuristic
crossover, it can be mutated only with a position mu-
tation operator to add an exploratory dimension. Sim-
ilarly, if an exploratory child is generated with a po-
sition crossover, it can be mutated only with a lo-
cal search based mutation for an exploitative purpose
(Algo. 3).
Algorithm 3: BH Reproduction(P1, P2).
1: Select randomly opX in PosCross ∪ HeurCross
2: With a probability pc, apply opX to (P1, P2) and get the new child C
3: if opX ∈ PosCross then
4:    Select opM from the LocalMut class
5: else
6:    Select opM from the PosMut class
7: end if
8: With a probability pm, apply opM to C
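
Both reproduction procedures can be sketched in Python as follows; the four operator pools are assumed to be lists of functions such as those sketched in Section 2 (local mutations wrapped so that they take a single tour argument), and pc, pm correspond to the probabilities of Table 2:

import random

def rh_reproduction(p1, p2, pos_cross, heur_cross, pos_mut, local_mut,
                    pc=0.8, pm=0.4):
    """Random hybridization (Algorithm 2): crossover drawn from the union
    of both crossover classes, mutation drawn from the union of both
    mutation classes."""
    child = list(p1)                     # assumed fallback: copy of a parent if no crossover is applied
    if random.random() < pc:
        child = random.choice(pos_cross + heur_cross)(p1, p2)
    if random.random() < pm:
        child = random.choice(pos_mut + local_mut)(child)
    return child

def bh_reproduction(p1, p2, pos_cross, heur_cross, pos_mut, local_mut,
                    pc=0.8, pm=0.4):
    """Balancing hybridization (Algorithm 3): a position crossover is paired
    with a local mutation, a heuristic crossover with a position mutation."""
    op_x = random.choice(pos_cross + heur_cross)
    child = op_x(p1, p2) if random.random() < pc else list(p1)
    pool = local_mut if op_x in pos_cross else pos_mut
    if random.random() < pm:
        child = random.choice(pool)(child)
    return child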
5.2 Results and Discussion
For each series of 30 runs, we record: the cost of the
best obtained solution (Best), the average cost over
the thirty runs (Average) and the percentage of the
relative deviation (loss) from the best-known solution
(BKS): $\%Loss = 100 \times (Best - BKS)/BKS$ (%Deviation).
Results for the second experimental study are
given in Table 3. The relative deviation values (loss)
are also illustrated in Figure 4 to ease the comparison
between the RH and BH procedures.
The different values recorded for each bench-
mark instance demonstrate that the SSGA efficiency
is greatly improved with the hybridization strategies,
either random or balanced hybridization. Indeed, the
deviation or loss to the optimum is less than 1% for
11 instances with the ”RH Reproduction” and for 10
instances with the ”BH Reproduction”, as illustrated
in Figure 4. These results confirm our first hypothesis
that the application of the multi-operator mechanism
including operators from the four classes enhances the
efficiency of the Genetic Algorithm. However, except
for some cases, ”BH Reproduction” was not able to
outperform ”RH Reproduction” as we supposed in
the second hypothesis. As illustrated in Figure 4, with
”RH Reproduction”, the GA is able to get closer
to the optimum than with ”BH Reproduction”, especially
for the cases with a number of requests greater
than 42 (except for the n65-k9 case).
Table 3: Results with CVRP A.
Instance  BKS  Random Hybridization: Best, Average, %Deviation  Balancing Hybridization: Best, Average, %Deviation
A-n32-k5.vrp 784 784 806 0 784 803 0
A-n33-k5.vrp 661 661 689 0 661 694 0
A-n33-k6.vrp 742 742 757 0 743 761 0.13
A-n34-k5.vrp 778 778 797 0 778 800 0
A-n36-k5.vrp 799 805 831 0.75 814 840 1.88
A-n37-k5.vrp 669 670 706 0.15 670 703 0.15
A-n37-k6.vrp 949 953 1009 0.42 953 1036 0.42
A-n38-k5.vrp 730 731 763 0.14 733 759 0.41
A-n39-k5.vrp 822 834 863 1.46 825 864 0.36
A-n39-k6.vrp 831 839 866 0.96 839 875 0.96
A-n44-k7.vrp 937 947 984 1.07 957 981 2.13
A-n45-k7.vrp 1146 1166 1203 1.75 1176 1206 2.62
A-n46-k7.vrp 914 953 986 4.27 923 978 0.98
A-n48-k7.vrp 1073 1115 1164 3.91 1107 1150 3.17
A-n53-k7.vrp 1010 1031 1100 2.08 1038 1189 2.77
A-n54-k7.vrp 1167 1193 1362 2.23 1222 1548 4.71
A-n55-k9.vrp 1073 1075 1168 0.19 1104 1507 2.89
A-n60-k9.vrp 1408 1414 1596 0.43 1405 1619 3.77
A-n63-k10.vrp 1315 1378 1846 4.79 1349 1456 4.57
A-n63-k9.vrp 1634 1685 2441 3.12 1717 2960 5.08
A-n64-k9.vrp 1402 1472 1903 4.99 1459 2136 4.07
A-n65-k9.vrp 1177 1289 2698 9.52 1219 1754 3.37
A-n69-k9.vrp 1168 1186 1491 1.54 1219 1549 4.37
Through these results, we can understand that there is a
synergy between the different classes of operators, so it is
useful to combine the different classes without restriction.
However, the HeurCross and LocalMut classes are
helpful to refine the best solutions in the second stage
of the evolution, but could penalize the search process
at the beginning of the evolution. Thus, we think that
it is advantageous to control the application of the different
combinations of operator classes with a
dynamic probability.
[Figure 4: bar chart, ”% deviation of best obtained solution for each benchmark case” (RH vs BH), for the instances A-n32-k5 through A-n69-k9.]
Figure 4: Relative deviation to the optimum given by the
best solution obtained with RH and BH reproduction.
6 CONCLUSION
This paper proposes a classification of the variation
operators used for evolutionary ordering optimiza-
tion into four classes: Position crossover, Heuris-
tic crossover, Position mutation and Local mutation.
After studying the exploitation/exploration ability of
each class on solving two benchmark instances of
the Capacitated Vehicle Routing Problem, we pro-
posed two hybridization strategies between operators
classes: Random Hybridization and Balancing Hy-
bridization. Results obtained for 24 CVRP bench-
mark instances showed clearly that combining vari-
ation operators from different classes increases the
GA robustness when solving constrained ordering op-
timization problems. However, the balancing hybridization
needs further control on the application of
the different class combinations. Future work aims
to introduce this control with a dynamic or adaptive
probability and to apply the implemented GA to
other CVRP benchmark series.
REFERENCES
Augerat, P., Belenguer, J. M., Benavent, E., Corberán, A.,
Naddef, D., and Rinaldi, G. (1995). Computational
results with a branch and cut code for the capacitated
vehicle routing problem. IMAG.
Cicirello, V. A. (2006). Non-wrapping order crossover:
An order preserving crossover operator that respects
absolute position. In Proceedings of the 8th annual
conference on Genetic and evolutionary computation,
pages 1125–1132. ACM.
Dantzig, G. B. and Ramser, J. H. (1959). The truck dis-
patching problem. Management science, 6(1):80–91.
Davis, L. (1985). Job shop scheduling with genetic algo-
rithms. In Proceedings of the 1st International Con-
ference on Genetic Algorithms, pages 136–140, Hills-
dale, NJ, USA. L. Erlbaum Associates Inc.
Deep, K. and Adane, H. M. (2011). New variations of order
crossover for travelling salesman problem. Interna-
tional Journal of Combinatorial Optimization Prob-
lems and Informatics, 2(1):2.
Fogel, D. B. (1988). An evolutionary approach to the
traveling salesman problem. Biological Cybernetics,
60(2):139–144.
Freisleben, B. and Merz, P. (1996). A genetic local search
algorithm for solving symmetric and asymmetric trav-
eling salesman problems. In Evolutionary Computa-
tion, 1996., Proceedings of IEEE International Con-
ference on, pages 616–621. IEEE.
Goldberg, D. E. and Lingle, R. (1985). Alleles, loci, and
the traveling salesman problem. In Proceedings of
an International Conference on Genetic Algorithms
and Their Applications, volume 154, pages 154–159.
Lawrence Erlbaum, Hillsdale, NJ.
Grefenstette, J., Gopal, R., Rosmaita, B., and Van Gucht,
D. (1985). Genetic algorithms for the traveling sales-
man problem. In Proceedings of the first International
Conference on Genetic Algorithms and their Applica-
tions, pages 160–168. Lawrence Erlbaum, New Jersey
(160-168).
Grefenstette, J. J. (1987). Incorporating problem specific
knowledge into genetic algorithms. Genetic algo-
rithms and simulated annealing, 4:42–60.
Karakatič, S. and Podgorelec, V. (2015). A survey of ge-
netic algorithms for solving multi depot vehicle rout-
ing problem. Applied Soft Computing, 27:519 – 532.
Oliver, I. M., Smith, D. J., and Holland, J. R. C. (1987). A
study of permutation crossover operators on the trav-
eling salesman problem. In Proceedings of the Second
International Conference on Genetic Algorithms and
Their Application, pages 224–230, Hillsdale, USA.
Michalewicz, Z. (1992). Genetic algorithms+ data struc-
tures= evolution programs. Springer Verlag.
Mühlenbein, H. (1989). Parallel genetic algorithms, pop-
ulation genetics and combinatorial optimization. In
Workshop on Parallel Processing: Logic, Organiza-
tion, and Technology, pages 398–406. Springer.
Mühlenbein, H., Gorges-Schleuter, M., and Krämer, O.
(1988). Evolution algorithms in combinatorial opti-
mization. Parallel Computing, 7(1):65–85.
Neri, F., Cotta, C., and Moscato, P. (2012). Handbook of
memetic algorithms, volume 379. Springer.
Pétrowski, A. and Ben-Hamida, S. (2017). Evolutionary
Algorithms. John Wiley & Sons.
Puljić, K. and Manger, R. (2013). Comparison of eight evo-
lutionary crossover operators for the vehicle routing
problem. Mathematical Communications, 18(2):359–
375.
Syswerda, G. (1989). Uniform crossover in genetic algo-
rithms. In Proceedings of the 3rd International Con-
ference on Genetic Algorithms, pages 2–9, San Fran-
cisco, CA, USA. Morgan Kaufmann Publishers Inc.
Syswerda, G. (1991). Schedule optimization using genetic
algorithms. In Davis, L., editor, Handbook of Genetic
Algorithms. Van Nostrand Reinhold, New York, NY.
Tao, G. and Michalewicz, Z. (1998). Inver-over operator
for the tsp. In 5th International Conference Parallel
Problem Solving from Nature Amsterdam, The Nether-
lands, pages 803–812. Springer.
Tao, Z. (2008). Tsp problem solution based on improved
genetic algorithm. In Natural Computation, 2008.
ICNC’08. Fourth International Conference on, vol-
ume 1, pages 686–690. IEEE.
Whitley, L. D., Starkweather, T., and Fuquay, D. (1989).
Scheduling problems and traveling salesmen: The ge-
netic edge recombination operator. In ICGA, vol-
ume 89, pages 133–40.