Experimental Study of Hybrid Genetic Algorithms for the Maximum Scatter Travelling Salesman Problem

We consider the maximum scatter travelling salesman problem (MSTSP), a variant of the travelling salesman problem (TSP). The problem aims to maximize the length of the shortest edge in a tour that visits each city of the given network exactly once. It is an NP-hard problem, and hence exact solutions are obtainable for small sizes only. For large sizes, heuristic algorithms must be applied, and genetic algorithms (GAs) have proved very successful at such problems. In our study, a simple GA (SGA) and four hybrid GAs (HGAs) are proposed for the MSTSP. The SGA starts with an initial population produced by a sequential sampling approach and improved by 2-opt search, and then gradually improves the population through a proportionate selection procedure, sequential constructive crossover, and adaptive mutation. A stopping condition of maximum generation is adopted. The hybrid genetic algorithms (HGAs) add a selected local search and a perturbation procedure to the proposed SGA. Each HGA uses one of three local search procedures based on the insertion, inversion and swap operators, applied directly or randomly. An experimental study has been carried out among the proposed SGA and HGAs by solving TSPLIB asymmetric and symmetric instances of various sizes. Our computational experience reveals that the suggested HGAs are very effective. Finally, our best HGA is compared with a state-of-the-art algorithm by solving TSPLIB symmetric instances of many sizes, and is found to perform better.

Keywords—Hybrid genetic algorithm; maximum scatter travelling salesman problem; sequential constructive crossover; adaptive mutation; local search; perturbation procedure


I. INTRODUCTION
The travelling salesman problem (TSP) is a popular problem which finds the shortest tour of a salesman who starts from a headquarters city, visits each of the remaining n cities (nodes) exactly once, and then returns to the headquarters. The TSP is NP-hard [1], and several good procedures have been suggested to solve it. However, some circumstances impose different constraints for accepting a tour as a solution. One such constraint is to maximize the shortest edge in the tour, and the TSP with this constraint is called the maximum scatter TSP (MSTSP). So, the MSTSP finds a Hamiltonian cycle/circuit that maximizes the shortest edge; that is, each city in the Hamiltonian circuit is far from (scattered with respect to) its preceding and succeeding cities. The problem is also known as the max-min 1-neighbour TSP. In general, the max-min m-neighbour TSP aims to maximize the shortest distance between a city and all of its m neighbouring cities in the Hamiltonian cycle/circuit. The bottleneck TSP (BTSP) is very close to the MSTSP; the BTSP aims to minimize the longest edge [2]. Further, the maximum TSP (MaxTSP), which finds a Hamiltonian cycle/circuit of maximum total length, is also closely related to the MSTSP [3]. Fig. 1 shows the difference between the TSP and the MSTSP on an instance of 29 cities [4]. It is clear from the figure that the TSP aims to decrease the total distance covered by the salesman, whereas the MSTSP aims to maximize the shortest edge by keeping any two successive cities in its tour as scattered as possible.
The problem may be converted to the BTSP by setting cij = M − dij, where C = [cij] is the corresponding BTSP's distance (or cost) matrix and M is a large number [5]. The MSTSP was first defined in [6] and has several applications ([1], [7]). The MSTSP is NP-hard [1], and no polynomial-time algorithm is available for solving it, so finding optimal solutions of large-sized instances by exact methods is not practical. Thus, to find good solutions to such problems within acceptable computational effort, heuristic/metaheuristic algorithms are generally applied. Tabu search [8], simulated annealing [9], ant colony algorithms [10], insertion heuristics [11], variable neighbourhood search [12], discrete differential evolution [13], genetic algorithms [14], etc., are some popular metaheuristic approaches. Among them, genetic algorithms (GAs) are widely used, and so we use GAs to solve the MSTSP.
GAs are based on simulating the Darwinian survival-of-the-fittest theory of environmental biology [14]. They are robust, parallel, global-search metaheuristics that can solve large-sized problems quickly. They can automatically obtain and accumulate knowledge throughout the search procedure and can adaptively manage the search to obtain the optimal/best solution. They have been effectively applied to various complex optimization problems. For any problem, each feasible solution may be encoded as a string called a chromosome or individual, whose fitness is its objective function value [15]. Chromosomes are collections of genes. Simple GAs start from a chromosome set known as the initial population and then apply mainly three basic operators (selection, crossover, and mutation) to generate improved populations in the following generations. The selection operator probabilistically copies some chromosomes to the following generation.
Crossover arbitrarily selects two parent chromosomes and mates them to produce offspring chromosome(s). Mutation picks a position in a chromosome at random and changes its value. Crossover, along with selection, is the major procedure in GAs. Mutation diversifies the search space and defends against loss of genetic material. Thus, the crossover probability is usually set very high, whereas the mutation probability is set very low [14]. As crossover is such an important operator, better crossover operators yield better GAs. Normally, crossover methods designed for the usual TSP are applied to its variants as well. A computational study comparing eight crossover methods for the MSTSP proved that the sequential constructive crossover is the best operator [16].
Though simple GAs using the three basic operators can solve complex optimization problems quickly, they very often converge prematurely and get trapped in local optima [17]. So, one must apply additional techniques to overcome the premature convergence issue and to enhance the solution obtained by a simple GA. This paper therefore develops a simple GA (SGA) and four hybrid GAs (HGA1, HGA2, HGA3 and HGA4) for finding solutions to the MSTSP. Our proposed SGA uses a sequential sampling algorithm along with 2-opt search for initial population generation, sequential constructive crossover, and adaptive mutation. The hybrid genetic algorithms (HGAs) add a selected local search and a perturbation procedure to the proposed SGA. Each HGA uses one of the local search procedures based on the insertion, inversion, and swap operators. The perturbation procedure is used to overcome the premature convergence issue; it applies the partially mapped crossover [15] along with swap mutation to find better quality solutions to the MSTSP.
The usefulness of our proposed HGAs has been examined amongst themselves, and the percentage of improvement of their solutions over the solution obtained by the SGA has been calculated for asymmetric and symmetric TSPLIB problem instances. The experimental results show a very good improvement of the HGA solutions over the SGA solutions. Further, it is seen that for asymmetric instances, HGA3 is placed in 2nd position and HGA4 is the best; for symmetric instances, HGA2 is placed in 2nd position and HGA4 is the best. Overall, for both categories of instances, HGA4 is the best, HGA2 is the 2nd best and HGA3 is the 3rd best. Finally, our HGA4 is compared against the multi-start iterated local search (MS-ILS(h1+h2)) [4] by solving some TSPLIB symmetric instances of different sizes. Our computational experience reveals that our HGA4 is better than MS-ILS(h1+h2). This paper is arranged as follows: a literature survey for the MSTSP is provided in Section II; Section III develops simple and hybrid GAs for the problem, while Section IV reports computational experience with the developed algorithms; finally, Section V provides conclusions and future research directions.

II. LITERATURE REVIEW
The MSTSP is a difficult NP-hard problem. Methods to solve this kind of optimization problem are grouped into two broad groups: exact and heuristic methods ([18], [19]). There is very little literature on the MSTSP. The first procedure for solving the problem was developed by Arkin et al. [1]. They proved that the MSTSP is NP-hard and that no constant-factor approximation procedure can be devised unless NP = P. A factor-2 approximation procedure (claimed to be best possible) was developed for the max-min 1-neighbour TSP satisfying the triangle inequality, for both path and cycle versions. Further, they developed procedures for the max-min 2-neighbour TSP satisfying the triangle inequality for cycle and path versions. Finally, the procedures were extended to find approximate solutions of the max-min m-neighbour TSP for the path version.
Approximation procedures for the max-min 2-neighbour TSP with triangle inequality were developed by Chiang [20] for the cycle and path versions by improving the procedures in [1]. As reported, both procedures are very simple. Some studies on the MSTSP and its related versions are reported by John [7].
An approximation procedure for the MSTSP satisfying triangle inequality was developed by Kabadi and Punnen [21] that claimed to find the best bound for this case.
An improvement of the procedure in [1] was proposed for points on a line and on a regular m×n grid by Hoffmann et al. [22], which is claimed to obtain optimal solutions. They further claimed that the procedure takes linear computational effort to obtain an optimal tour in some cases.
The multi-salesmen MSTSP, called the multiple MSTSP (MMSTSP), was proposed by Dong et al. [23]. They proposed three improved GAs for the problem, using greedy initialization, simulated annealing, and hill-climbing algorithms. As reported, their algorithms are effective and can expose various characteristics of the problem's solutions. In [4], two local search procedures based on insertion were proposed within a multi-start iterated local search; as reported, that algorithm found very good results on some symmetric TSPLIB instances.
In [16], eight GAs were developed using eight crossover methods for the MSTSP, and a comparative study was reported on some TSPLIB asymmetric and symmetric instances. It was shown that the sequential constructive crossover (SCX) is the best, the partially mapped crossover (PMX) is the second best, and the greedy crossover (GX) is the worst.
As mentioned, the BTSP is very close to the MSTSP; lexisearch approaches were developed for the BTSP in ([24], [25]), and hybrid algorithms in ([26], [27]). The MaxTSP is also close to the MSTSP, and a hybrid GA has been developed for finding solutions to it [28].
Since there is little literature on hybrid algorithms for the MSTSP, we propose to develop hybrid genetic algorithms to show their efficiency in solving the problem.

III. HYBRID GENETIC ALGORITHMS FOR THE MSTSP
Genetic algorithms (GAs) are established to be effective for the traditional TSP and some of its variants. Though they do not guarantee the optimality of the solutions they obtain, they normally obtain near-optimal solutions rapidly. In this section, we develop a simple GA (SGA) and four hybrid GAs (HGAs) for the MSTSP.

A. Chromosome Representation
The first job in GAs is to determine a chromosome representation for the solutions of a problem so that the GA operators can produce feasible chromosome(s). For the TSP and its variants, mainly the path representation is used, which lists the cities so that no city is duplicated in a chromosome. We adopt this path representation for the MSTSP. As an example, let {1, 2, 3, 4, 5, 6, 7, 8} be the cities of an 8-city problem; the chromosome (1,3,2,7,8,6,4,5) denotes the tour {1→3→2→7→8→6→4→5→1}, whose objective (and fitness) value is the length of the shortest edge in this tour. As the MSTSP is a maximization problem, a higher fitness value is better than a lower one.
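The fitness evaluation just described can be sketched as follows; `mstsp_fitness` and the small matrix in the usage note are illustrative names and data, not the paper's.

```python
def mstsp_fitness(tour, d):
    """Return the MSTSP objective of a closed tour: the length of its
    shortest edge (to be maximized). Cities are 1-based labels; the
    distance matrix d is 0-indexed."""
    n = len(tour)
    # The generator includes the closing edge from the last city back
    # to the first one, so all n edges of the cycle are considered.
    return min(d[tour[i] - 1][tour[(i + 1) % n] - 1] for i in range(n))
```

For instance, on a hypothetical 4-city matrix, the tour (1,3,2,4) is scored by the minimum over its four edges, including the closing edge 4→1.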

B. Improved Initial Population
Starting with an improved initial population can provide good solutions quickly. We use the sequential sampling approach [26] to generate the initial population for our GAs, which was successfully applied to other TSP variants ([27], [28]). In the sequential sampling approach, an alphabet table is first constructed from the given distance (cost) matrix; then, in each row, a probability of visiting every unvisited city is allocated such that the first unvisited city receives a higher probability than the second, the second a higher probability than the third, and so forth. For each unvisited city in that row, the cumulative probability is also calculated, and the city whose cumulative probability interval contains a randomly generated number is accepted. This process is repeated until a valid chromosome is created, and in this way a population of the given size is generated. However, it is observed that this approach cannot explore the whole search space. So, to improve the initial population, we apply 2-opt search to every chromosome: if the newly obtained chromosome is better than the old one, the old one is replaced by the new one; otherwise, no action is taken. Due to the strong capability of the 2-opt local search, it improves the starting points of our proposed algorithm.
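A minimal sketch of the 2-opt improvement step under the max-min objective is given below. The acceptance rule (keep a segment reversal only when it raises the shortest edge) is our reading of the text, and the names are illustrative.

```python
def two_opt_improve(tour, d):
    """Repeatedly reverse tour segments, keeping a reversal only when
    it increases the shortest-edge objective; otherwise the current
    tour is kept unchanged, as described in the text."""
    def obj(t):
        return min(d[t[k] - 1][t[(k + 1) % len(t)] - 1]
                   for k in range(len(t)))
    best, best_obj = tour[:], obj(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, len(best) - 1):      # keep city 1 at position 1
            for j in range(i + 1, len(best)):
                cand = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
                if obj(cand) > best_obj:
                    best, best_obj, improved = cand, obj(cand), True
    return best
```

The search restarts after every accepted move, so it returns a tour that no single segment reversal can improve further.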

C. Selection Strategy
The selection strategy is the procedure of choosing parents from the current population for the next operation. In the selection operation, no new chromosome is created; only some of the fitter chromosomes are passed to the breeding pool for the subsequent operation/generation. By selecting a greater share of the fitter chromosomes, this operation simulates the Darwinian survival-of-the-fittest hypothesis of biology. Normally, proportionate selection is used, where a chromosome is chosen according to its probability of selection. We use the stochastic remainder selection procedure [29] for the proposed GAs. In this procedure, the 'expected count' of every individual is first computed by dividing its fitness value by the average fitness value. Then each individual is copied as many times as the integer part of its expected count, and the integer parts are subtracted from the expected counts, leaving remainder values less than one. If a randomly generated number is less than the remaining expected count of a selected individual, then the individual is inserted into the mating pool. This procedure is repeated until the number of chromosomes equals the population size, where the population size is the number of chromosomes in the population.
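The stochastic remainder procedure can be sketched as below; the function and variable names are ours, and the way remaining slots are sampled is an assumption where the text leaves the details open.

```python
import random

def stochastic_remainder_selection(population, fitnesses, rng=random):
    """Copy each chromosome deterministically as many times as the
    integer part of its expected count (fitness / average fitness),
    then fill the remaining slots by accepting chromosome i whenever
    a random number falls below its fractional remainder."""
    n = len(population)
    avg = sum(fitnesses) / n
    expected = [f / avg for f in fitnesses]
    pool = []
    for chrom, e in zip(population, expected):
        pool.extend([chrom] * int(e))            # deterministic copies
    remainders = [e - int(e) for e in expected]  # all below one
    while len(pool) < n:
        i = rng.randrange(n)
        if rng.random() < remainders[i]:
            pool.append(population[i])
    return pool[:n]
```

With fitnesses (4, 3, 2, 1) the expected counts are 1.6, 1.2, 0.8 and 0.4, so the first two individuals are copied once for certain and the rest of the pool is filled by chance in proportion to the remainders.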

D. Crossover Operator
The crossover operator plays a major role in GAs: two parent chromosomes and a crossover point within the chromosome length are chosen, and the chromosomes' data after the crossover point are exchanged. Quite a few good crossover methods are available in the literature for the traditional TSP that can be applied to the MSTSP. Ahmed [16] applied eight crossover operators, namely, the ordered crossover [30], partially mapped crossover [31], cycle crossover [32], alternating edges crossover [33], generalized N crossover [34], greedy crossover [33], edge recombination crossover [35], and sequential constructive crossover [15] to the MSTSP, and reported a comparative study among them. As reported, the sequential constructive crossover (SCX) was observed to be the best method, so we also apply SCX in our proposed GAs. The steps of the SCX algorithm are as follows [16]:

Step 1: Start from 'city 1' (i.e., current city p = 1).
Step 2: Search both parent chromosomes sequentially and take the first unvisited city appearing after 'city p' in each parent. If no unvisited city after 'city p' is available in a parent chromosome, search from the beginning of that chromosome and take the first unvisited city. Go to Step 3.
Step 3: Suppose 'city α' and 'city β' are the cities found in the 1st and 2nd parents, respectively; then go to Step 4 to choose the following city.

Step 4: If dpα > dpβ, then choose 'city α', otherwise 'city β', as the subsequent city and append it to the current offspring. If the offspring is a full chromosome, stop; else, rename the current city as 'city p' and go to Step 2.
Let us illustrate the SCX using a 7-city instance with the distance matrix provided in Table I. Let P1: (1,5,3,2,7,4,6) and P2: (1,5,7,3,6,2,4) be parent chromosomes with costs 3 and 2, respectively. The computation starts from city 1 (the headquarters). After city 1, city 5 is the first unvisited city in both P1 and P2, so city 5 is added, producing the offspring (1,5). After city 5, the unvisited cities are 3 in P1 and 7 in P2, with costs c53 = 3 and c57 = 7; since c57 > c53, city 7 is added, producing (1,5,7). After city 7, the unvisited cities are 4 in P1 and 3 in P2, with costs c74 = 7 and c73 = 3; since c74 > c73, city 4 is added, producing (1,5,7,4). After city 4, city 6 follows in P1 with cost c46 = 13, but no unvisited city follows in P2; so, for P2, searching from the beginning finds the unvisited city 3 with c43 = 11. Since c46 > c43, city 6 is added, producing (1,5,7,4,6). After city 6, no unvisited city follows in P1, while the unvisited city 2 follows in P2 with cost c62 = 13; for P1, searching from the beginning finds the unvisited city 3 with c63 = 8. Since c62 > c63, city 2 is added, producing (1,5,7,4,6,2). Finally, after city 2, the only remaining city is 3, which is added, producing the offspring (1,5,7,4,6,2,3) with cost 7. Fig. 2 shows the parents (P1 and P2) and the offspring (O). In general, a crossover operator that preserves the better characteristics of the parents in their children is expected to perform better, and SCX is expected to do well in this regard: in Fig. 2(c), the five bold edges come from one of the parents. Though SCX is observed to be the best method, it sometimes creates bad offspring. So, to maintain a mixture of offspring and parents in a population, we replace the 1st parent by the offspring only if the offspring is better; in addition, the 2-opt local search is applied to the better offspring to improve it further. Since the SCX operator produces only one offspring, to keep the population size the same in all generations, when selecting the next pair for crossover, the present 2nd chromosome is selected as the 1st parent and the 3rd chromosome as the 2nd parent, and so forth.
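The SCX steps can be sketched as follows. Since Table I is not reproduced here, the usage below relies on a small hypothetical 5-city matrix rather than the paper's 7-city instance; `scx` and `next_legitimate` are illustrative names.

```python
def scx(p1, p2, d):
    """Sequential constructive crossover for the MSTSP: starting from
    city 1, repeatedly take the first unvisited city after the current
    city in each parent (wrapping to the parent's start if needed) and
    keep the candidate FARTHER from the current city, since the
    shortest-edge objective is maximized."""
    n = len(p1)
    visited, offspring, p = {1}, [1], 1

    def next_legitimate(parent):
        idx = parent.index(p)
        for c in parent[idx + 1:] + parent[:idx]:  # wrap-around scan
            if c not in visited:
                return c

    while len(offspring) < n:
        a, b = next_legitimate(p1), next_legitimate(p2)
        nxt = a if d[p - 1][a - 1] > d[p - 1][b - 1] else b
        offspring.append(nxt)
        visited.add(nxt)
        p = nxt
    return offspring
```

Because every appended city is unvisited, the offspring is always a valid permutation starting at city 1.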

E. Mutation Operator
As some weaker chromosomes are discarded during the selection and crossover processes in any generation, some stronger chromosome structures may be lost forever. So, mutation is normally applied to regain them. In traditional mutation operations, a gene (position) is chosen arbitrarily in a chromosome and its allele (city) is altered. Some common mutation operators are inversion mutation, insertion mutation, swap mutation, and adaptive mutation [36]. The adaptive mutation is implemented for our GAs. To perform this mutation, information from the chromosomes in a population is gathered to identify a structure among them; if mutation is to be performed, then the chromosomes which do not match the structure are mutated. The steps of adaptive mutation are as follows:

Step 1: Take all chromosomes in the current population.
Step 2: Construct a one-dimensional array of order n, let A, by adding a city which appears least time in the present position of all chromosomes.
Step 3: If mutation is allowed, two genes are selected arbitrarily such that neither is the same as the element in the corresponding position of the array A, and then they are exchanged. Since city 1 is always fixed in the 1st position, we exclude the 1st position as well as city 1 from this procedure. For example, suppose the chromosome P: (1, 5, 7, 4, 6, 2, 3) is chosen for the mutation operation, and let the array be A: [2,6,3,5,6,4,7]. Let the 3rd and 5th positions be chosen arbitrarily. The 3rd position's gene '7' differs from the corresponding element of A, but the 5th position's gene '6' is the same as the corresponding element of A. So, we choose another position arbitrarily; say the 7th position is chosen, which allows the swap. Hence, the mutated chromosome is P': (1, 5, 3, 4, 6, 2, 7), as shown in Fig. 3. New edges in the mutated chromosome are shown in boldface in Fig. 3(b).
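The allowed-swap check of Step 3 can be sketched as below, reproducing the worked example; `adaptive_swap` is an illustrative name, and the random picking of positions is left to the caller.

```python
def adaptive_swap(chrom, A, pos1, pos2):
    """Swap the genes at two 1-based positions if the move is allowed:
    neither chosen gene may equal the element of the structure array A
    at its own position, and position 1 (city 1) is never touched.
    Returns the mutated chromosome, or None if the pair is disallowed
    so that the caller can pick other positions."""
    if pos1 == 1 or pos2 == 1:
        return None
    if chrom[pos1 - 1] == A[pos1 - 1] or chrom[pos2 - 1] == A[pos2 - 1]:
        return None
    out = chrom[:]
    out[pos1 - 1], out[pos2 - 1] = out[pos2 - 1], out[pos1 - 1]
    return out
```

With P = (1,5,7,4,6,2,3) and A = [2,6,3,5,6,4,7], the pair (3, 5) is rejected because gene '6' matches A at position 5, while the pair (3, 7) yields (1,5,3,4,6,2,7), matching the example.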

F. Local Search
There are various local search procedures in the literature; amongst them, the combined mutation is seen as a good local search procedure ([2], [27], [28]). It merges the insertion, inversion, and swap mutations, each applied with probability 1.00. Insertion mutation selects a city in a chromosome and inserts it into an arbitrary position. Inversion mutation selects two positions in a chromosome and inverts the sub-chromosome between them. Swap mutation selects two cities (genes) arbitrarily and exchanges them. We define these three mutations as local search procedures in our HGAs as follows. Suppose (1, 2, 3, ..., n) is a chromosome.

1) Insertion search:

Step 0: For i := 2 to n−1 perform the next step.

Step 1: For j := i+1 to n perform the next step.

Step 2: If inserting city i after city j improves the tour's objective value, then insert city i after city j.

2) Inversion search:

Step 0: For i := 2 to n−1 perform the next step.

Step 1: For j := i+1 to n perform the next step.

Step 2: If inverting the sub-chromosome between cities i and j improves the tour's objective value, then invert the sub-chromosome.

3) Swap search:

Step 0: For i := 2 to n−1 perform the next step.

Step 1: For j := i+1 to n perform the next step.

Step 2: If exchanging cities i and j improves the tour's objective value, then swap them.

In our local search procedure, one of these three searches is selected for each HGA.
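As one concrete instance, the swap search can be sketched as follows (the other two searches differ only in the move applied). The shortest-edge objective and the first-improvement acceptance are our reading of Step 2; the names are illustrative.

```python
def swap_search(tour, d):
    """Swap local search: for every pair of positions (i, j), exchange
    the two cities whenever the exchange raises the shortest-edge
    objective of the tour. City 1 stays fixed at position 1."""
    def obj(t):
        return min(d[t[k] - 1][t[(k + 1) % len(t)] - 1]
                   for k in range(len(t)))
    best = tour[:]
    for i in range(1, len(best) - 1):       # 0-based; position 1 skipped
        for j in range(i + 1, len(best)):
            cand = best[:]
            cand[i], cand[j] = cand[j], cand[i]
            if obj(cand) > obj(best):
                best = cand                  # accept improving swap
    return best
```

The insertion and inversion searches replace the swap move by removing-and-reinserting city i after city j, or by reversing the sub-chromosome between positions i and j, respectively.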

G. Perturbation Procedure
Though GAs are very good methods, they sometimes get stuck in local optima. This may be due to a nearly identical population, and so the population must be diversified. A perturbation procedure is useful for escaping from local optima. If (Best Solution − Average Solution) < 0.10 × Best Solution, then we apply the partially mapped crossover (PMX), swap mutation and combined mutation operators. The PMX selects two crossover points, defines swap mappings in the segment between these points, and delivers two offspring. Further, mutation can assist the other operators in overcoming the local optima issue and can thus find better solutions.
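A common formulation of PMX with fixed cut points is sketched below; one offspring is shown, the second being obtained by exchanging the parents' roles. The names and the 0-based cut convention are ours.

```python
def pmx(p1, p2, cut1, cut2):
    """Partially mapped crossover: copy p1's segment [cut1, cut2) into
    the child, then fill the remaining positions from p2, following
    the segment's swap mapping whenever a city is already taken."""
    n = len(p1)
    child = [None] * n
    child[cut1:cut2] = p1[cut1:cut2]
    for i in list(range(cut1)) + list(range(cut2, n)):
        c = p2[i]
        while c in child[cut1:cut2]:
            # c is blocked by the copied segment: map it to p2's city
            # at the position where c sits in p1.
            c = p2[p1.index(c)]
        child[i] = c
    return child
```

Calling `pmx(p2, p1, cut1, cut2)` with the parents swapped produces the second offspring.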

H. Hybrid GAs
In our study, a simple genetic algorithm (SGA) and four hybrid genetic algorithms (HGAs) are proposed for the MSTSP. The SGA starts with an initial population generated by the sequential sampling approach, further improved by 2-opt search, and then gradually improves the population through stochastic remainder selection, sequential constructive crossover, and adaptive mutation. A stopping condition of maximum generation is adopted. The hybrid genetic algorithms (HGAs) add a selected local search and the perturbation procedure to the proposed SGA; when the stopping condition is satisfied, a near-optimal solution is produced. The selected local search defines each proposed HGA, each HGA using one of the procedures based on the insertion, inversion and swap operators directly or randomly. The common structure of our proposed HGAs is presented in Fig. 4.

IV. COMPUTATIONAL EXPERIENCE

The average percentage of improvement of a HGA over the SGA is calculated as

Avg. Imp(%) = 100(S1 − S2)/S2,

where S1 and S2 are the average solutions obtained by the HGA and the SGA, respectively.
Table III summarizes the results for asymmetric instances of sizes from 34 to 403. From the table, it is observed that the SGA could not find either the best solution or the best average solution for any instance. All four HGAs together obtained the best average solutions with the least S.D. for three instances: ftv33, ftv35 and p43. In addition, HGA2, HGA3 and HGA4 together obtained the best average solutions with the lowest S.D. for the instance kro124p; HGA3 and HGA4 together obtained the best average solutions for ftv44, ftv70 and rbg403; HGA1 obtained the best average solution with the least S.D. for ftv38; HGA2 obtained the best average solution with the lowest S.D. for ft53; HGA3 obtained the best average solutions for ftv47, ry48p, ft70 and rbg323; and HGA4 obtained the best average solutions with the least S.D. for ftv55, ftv64, ftv170 and rbg358. From this experiment we can say that HGA3 and HGA4 are competitive; however, HGA4 is observed to be the best algorithm.
Looking at the average improvement (%) of the average solutions by the HGAs, we reach the same conclusion. The average improvements (%) are shown in Fig. 6, which also indicates the suitability of the HGAs, especially HGA3 and HGA4. Looking at the overall results on the asymmetric instances, one can conclude that HGA4 is the best and HGA3 is the 2nd best. It is very apparent from the above experiments that the HGAs achieve very good improvements over the SGA for the TSPLIB asymmetric instances; HGA4 is observed to be the best algorithm and the SGA the worst. Now, to verify whether the average solutions obtained by HGA4 are significantly and statistically different from the average solutions obtained by the remaining HGAs, Student's t-test is conducted using the following formula [38], where 20 runs have been conducted for each instance:

t = (x̄1 − x̄2) / √(s1²/n1 + s2²/n2), with n1 = n2 = 20.

Here, x̄2 and s2 are the mean and standard deviation of the solutions obtained by HGA4, and x̄1 and s1 those obtained by the other HGA under comparison.
The t-statistic results are provided in Table IV. The t-values may be negative or positive. As the MSTSP is a maximization problem, a negative value indicates that HGA4 obtained a better solution than its competitor HGA, and a positive value indicates that the competitor HGA obtained a better solution than HGA4. Here a 95% confidence level (t0.05 = 1.73) is applied; so, if the magnitude of the t-value is bigger than 1.73, the difference is significant. In that case, if the t-value is negative then HGA4 is better, else the competitor HGA is better. If the magnitude of the t-value is smaller than 1.73, then there is no statistically significant difference. The table further reports the name of the better HGA.
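The comparison can be sketched with the standard unpaired two-sample t-statistic over run means and standard deviations; the exact form used in the paper is our assumption, with n = 20 runs per algorithm.

```python
import math

def t_statistic(mean1, sd1, mean2, sd2, n=20):
    """Unpaired two-sample t-statistic with equal run counts n:
    t = (mean1 - mean2) / sqrt(sd1^2/n + sd2^2/n). In a maximization
    setting, a negative value favours the second algorithm."""
    return (mean1 - mean2) / math.sqrt(sd1 ** 2 / n + sd2 ** 2 / n)
```

A difference is then treated as significant when |t| exceeds the tabulated critical value (1.73 at the 95% confidence level used here).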

On ten instances, HGA4 and HGA1 have no statistically significant difference; on the other seven instances, HGA4 is better than HGA1. On twelve instances, HGA4 and HGA2 have no statistically significant difference; on the instance ft53, HGA2 is better than HGA4, and on the remaining four instances, HGA4 is better than HGA2. On fourteen instances, HGA4 and HGA3 have no statistically significant difference, and on the remaining three instances, HGA4 is better than HGA3. Considering the seventeen instances overall, HGA4 is found to be better than the other HGAs. From this experiment we can say that HGA4 is the best for asymmetric instances.
Table V summarizes the results for symmetric instances of sizes from 21 to 318. From the table, it is observed that the SGA could obtain the best solution only for the instance gr21. All HGAs obtained the best average solutions with the lowest S.D. for four instances: gr21, fri26, bayg29 and berlin52. In addition, HGA2 and HGA4 together obtained the best average solutions with the lowest S.D. for kroA150 and a280; HGA3 and HGA4 together obtained the best average solutions with the lowest S.D. for si175; HGA2 obtained the best average solutions for three instances (gr48, st70 and pr76); HGA3 obtained the best average solution for dantzig42; and HGA4 obtained the best average solutions for six instances (eil51, lin105, ch130, d198, pr226 and lin318). From this study we can say that HGA4 is the best. Looking at the average improvement (%) of the average solutions by the HGAs, we reach the same conclusion. These results are shown in Fig. 7, which also shows the usefulness of the HGAs, especially HGA2 and HGA4. Looking at the overall results on the symmetric instances, one can conclude that HGA4 is the best and HGA2 is the second best.
From the experiment we can say that the HGAs achieve very good improvements over the SGA for the TSPLIB symmetric instances; HGA4 is found to be the best and the SGA the worst. Now, to verify whether the average solutions obtained by HGA4 are significantly and statistically different from the average solutions obtained by the other HGAs, Student's t-test is conducted, and the results are provided in Table VI. From Table VI, on five instances, HGA4 and HGA1 have no statistically significant difference, and on the other twelve instances, HGA4 is better than HGA1. On nine instances there is no statistically significant difference between HGA4 and HGA2; on the instance pr76, HGA2 is better than HGA4; and on the remaining seven instances, HGA4 is better than HGA2. On seven instances there is no statistically significant difference between HGA4 and HGA3, and on the remaining ten instances, HGA4 is better than HGA3. Considering the seventeen instances overall, HGA4 is found to be better than the other HGAs.
From this experiment, we can say that HGA4 is the best algorithm for symmetric TSPLIB instances as well. Hence, for all asymmetric and symmetric instances, HGA4 is the best algorithm. To decide the second best algorithm over both categories of instances, we performed Student's t-test between HGA2 and HGA3, reported in Table VII. From the table, it is found that out of thirty-four instances, on nineteen, HGA2 and HGA3 have no statistically significant difference; on nine instances HGA2 is better than HGA3, and on six instances HGA3 is better than HGA2. Hence, HGA2 is the second best and HGA3 is the third best.
We now compare our proposed HGA4 with a state-of-the-art algorithm, namely, the multi-start iterated local search (MS-ILS(h1+h2)) [4], on some TSPLIB symmetric instances of sizes from 21 to 318. We record the best solution (BS), worst solution (WS), average solution (AS), and computational time (Time, in seconds) for each problem instance in Table VIII. Better solutions are shown in boldface.
Looking at the average solutions, for the four instances, namely, dantzig42, gr48, lin105 and lin318 our HGA4 could find better solutions than solutions found by MS-ILS(h1+h2).
For another two instances, namely st70 and pr226, the solutions by MS-ILS(h1+h2) are better. For the remaining instances, the solutions are the same. Of course, MS-ILS(h1+h2) takes less computational time. Overall, looking at the solution quality, our suggested HGA4 is found to be better.

V. CONCLUSION AND FUTURE WORK

In this paper, a simple GA (SGA) and four hybrid GAs (HGA1, HGA2, HGA3 and HGA4) have been proposed for solving the MSTSP. The SGA used an initial population built by sequential sampling, a proportionate selection, sequential constructive crossover, and adaptive mutation. Three local search procedures, based on the inversion, insertion and swap mutations, and a perturbation procedure have been used in the different HGAs. The usefulness of the HGAs has been examined amongst themselves, and the percentage of improvement of their solutions over the solution by the SGA has been calculated for asymmetric and symmetric TSPLIB problem instances. The results show a very good improvement of the HGA solutions over the SGA solutions. Further, it is seen that for asymmetric instances, HGA3 is placed in 2nd position and HGA4 is the best; for symmetric instances, HGA2 is placed in 2nd position and HGA4 is the best. Overall, for both categories of instances, HGA4 is the best, HGA2 is the second best and HGA3 is the third best. Further, a comparative study was carried out between HGA4 and the multi-start iterated local search (MS-ILS(h1+h2)); looking at the solution quality, our suggested HGA4 is found to be better.
Though our proposed HGAs obtained very efficient solutions, with slight differences between best and average solutions, we admit that there is still an opportunity to improve the solutions by combining better local search procedures, other heuristic methods and perturbation procedures, which is under our study.