Optimisation problems exist in a variety of scientific fields, varying from medicine to agriculture. While conventional optimisation algorithms are popular, they suffer from drawbacks such as getting stuck in local optima and being sensitive to the initial state. To tackle these problems, population-based metaheuristic algorithms such as particle swarm optimisation (PSO) offer a powerful alternative thanks to their well-recognised characteristics, such as self-adaptation and being derivative-free.
The performance of differential evolution (DE) is directly related to its operators. Among them, the mutation operator plays a crucial role in generating new promising candidate solutions, and significant recent work has focused on developing effective mutation operators. One approach proposes a multi-population DE which combines three different mutation strategies, namely current-to-pbest/1, current-to-rand/1, and rand/1. Another employs three trial vector generation strategies and three control parameter settings and randomly selects between them to create new vectors. Elsewhere, k-tournament selection is used to introduce selection pressure when selecting the base vector, while another work proposes a neighbourhood-based mutation that is performed within each Euclidean neighbourhood. Finally, a competition scheme for generating new candidate solutions has been introduced in which candidate solutions are divided into two groups, losers and winners. Winners create new candidate solutions based on standard mutation and crossover operators, while losers try to learn from winners.
In this paper, we propose a novel DE algorithm, Clu-DE, which employs a novel clustering-based mutation operator. Inspired by the clustering operator in the human mental search (HMS) optimisation algorithm, Clu-DE clusters the current population into groups and selects a promising region as the cluster with the best mean objective function value. The best candidate solution in the promising region is selected as the base vector in the mutation operator. An updating strategy is then employed to include the new candidate solutions in the current population. Experimental results on the CEC-2017 benchmark functions with dimensionalities of 30, 50, and 100 confirm that Clu-DE yields improved performance compared to DE.
II-A Differential Evolution
Differential evolution (DE)  is a simple but effective population-based optimisation algorithm based on three main operators: mutation, crossover, and selection.
The mutation operator generates a mutant vector $v_i$ for each candidate solution $x_i$ as

\[ v_i = x_{r_1} + F \cdot (x_{r_2} - x_{r_3}), \]

where $x_{r_1}$, $x_{r_2}$, and $x_{r_3}$ are three distinct candidate solutions randomly selected from the current population and $F$ is a scaling factor.
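As a concrete illustration, this rand/1 mutation can be sketched in a few lines of NumPy; the array layout and names such as `de_mutation` are our own, not taken from the paper:

```python
import numpy as np

def de_mutation(population, i, F=0.5, rng=None):
    """rand/1 mutation: v_i = x_r1 + F * (x_r2 - x_r3)."""
    if rng is None:
        rng = np.random.default_rng(0)
    NP = len(population)
    # choose three distinct indices, all different from the parent index i
    candidates = [j for j in range(NP) if j != i]
    r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
    return population[r1] + F * (population[r2] - population[r3])
```

Note that the mutant is a linear combination of existing population members, so it inherits the population's scale: the difference vector $x_{r_2} - x_{r_3}$ shrinks automatically as the population converges.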
Crossover combines the mutant vector with the parent vector. For this, binomial crossover, defined as

\[ u_{i,j} = \begin{cases} v_{i,j} & \text{if } \mathrm{rand}_j \leq CR \text{ or } j = j_{\mathrm{rand}} \\ x_{i,j} & \text{otherwise}, \end{cases} \]

is employed, where $u_i$ is called a trial vector, $CR$ is the crossover rate, and $j_{\mathrm{rand}}$ is a random integer between 1 and the number of dimensions.
Finally, the selection operator selects the better candidate solution from the new candidate solution and its parent to be passed to the new population.
II-B Clustering

Clustering is an unsupervised pattern recognition technique that partitions samples into groups so that the members of a cluster share more resemblance with each other than with members of different clusters. k-means is the most popular clustering algorithm; it is based on a similarity measure (typically Euclidean distance), requires the number of clusters k to be defined in advance, and proceeds as outlined in Algorithm 1.
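The k-means procedure can be sketched as follows; this is a generic implementation under our own naming, not the paper's pseudo-code, alternating between assigning points to their nearest centroid and recomputing each centroid as the mean of its members:

```python
import numpy as np

def kmeans(X, k, iters=20, rng=None):
    """Plain k-means clustering on the rows of X."""
    if rng is None:
        rng = np.random.default_rng(0)
    # initialise centroids with k distinct samples
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # distance of every point to every centroid, shape (n, k)
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = X[labels == c].mean(axis=0)
    return labels, centroids
```

In practice a convergence check (labels no longer changing) replaces the fixed iteration count, but the structure above is all Clu-DE needs from the algorithm.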
III Proposed Clu-DE Algorithm
In this paper, we improve DE using a novel clustering-based mutation and updating scheme. Our proposed algorithm, Clu-DE, is given in the form of pseudo-code in Algorithm 2, while in the following we describe its main contributions.
Table I: Summary of the CEC-2017 benchmark functions.

Unimodal functions
F1   Shifted and Rotated Bent Cigar Function
F2   Shifted and Rotated Sum of Different Power Function
F3   Shifted and Rotated Zakharov Function

Multimodal functions
F4   Shifted and Rotated Rosenbrock's Function
F5   Shifted and Rotated Rastrigin's Function
F6   Shifted and Rotated Expanded Schaffer's Function
F7   Shifted and Rotated Lunacek Bi-Rastrigin Function
F8   Shifted and Rotated Non-Continuous Rastrigin's Function
F9   Shifted and Rotated Levy Function
F10  Shifted and Rotated Schwefel's Function

Hybrid multimodal functions
F11  Hybrid Function 1
F12  Hybrid Function 2
F13  Hybrid Function 3
F14  Hybrid Function 4
F15  Hybrid Function 5
F16  Hybrid Function 6
F17  Hybrid Function 7
F18  Hybrid Function 8
F19  Hybrid Function 9
F20  Hybrid Function 10

Composition functions
F21  Composition Function 1
F22  Composition Function 2
F23  Composition Function 3
F24  Composition Function 4
F25  Composition Function 5
F26  Composition Function 6
F27  Composition Function 7
F28  Composition Function 8
F29  Composition Function 9
F30  Composition Function 10
III-A Clustering-based Mutation
For our improved mutation operator, Clu-DE first identifies a promising region in the search space. Similar to the HMS algorithm, this is performed using a clustering algorithm. We employ the well-known k-means clustering algorithm to group the current population into k clusters, so that each cluster represents a region in the search space. The number of clusters is selected randomly between 2 and a predefined upper bound [4, 16].
After clustering, the mean objective function value of each cluster is calculated, and the cluster with the best mean objective function value is then taken as the promising region in the search space. Fig. 1 illustrates this for a toy problem with 17 candidate solutions divided into three clusters.
Finally, our novel clustering-based mutation is conducted as

\[ v_i = x_{\mathrm{best}}^{\mathrm{cl}} + F \cdot (x_{r_1} - x_{r_2}), \]

where $x_{r_1}$ and $x_{r_2}$ are two distinct, randomly selected candidate solutions and $x_{\mathrm{best}}^{\mathrm{cl}}$ is the best candidate solution in the promising region. It is worth noting that the best candidate solution in the winning cluster is not necessarily the best candidate solution in the current population. Clustering-based mutation is performed a number of times after standard mutation and crossover.
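Selecting the base vector $x_{\mathrm{best}}^{\mathrm{cl}}$ can be sketched as below; this is our own illustrative helper (names like `promising_region_base` are not from the paper), assuming minimisation and cluster labels already produced by k-means:

```python
import numpy as np

def promising_region_base(pop, fit, labels):
    """Return the best member of the cluster with the best mean fitness."""
    clusters = np.unique(labels)
    # mean objective value per cluster; lower is better (minimisation)
    means = np.array([fit[labels == c].mean() for c in clusters])
    winner = clusters[means.argmin()]
    members = np.where(labels == winner)[0]
    best = members[fit[members].argmin()]  # best solution inside the winner cluster
    return pop[best]
```

The returned vector then replaces the random base vector $x_{r_1}$ of standard rand/1 mutation, biasing search towards the promising region while the difference vector still comes from the whole population.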
III-B Population Update
After generating new offspring using clustering-based mutation, the population is updated using a scheme based on the generic population-based algorithm (GPBA). In particular, the population is updated in the following manner:
Selection: a number of candidate solutions are selected randomly. These correspond to the initial seeds for k-means clustering.
Generation: new candidate solutions are created as set A. This is done via the clustering-based mutation.
Replacement: candidate solutions are selected randomly from the current population as set B.
Update: from A ∪ B, the best candidate solutions are selected as set C. The new population is then obtained as (P \ B) ∪ C, where P denotes the current population.
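The replacement and update steps can be sketched as follows; this is a simplified interpretation in which, as an assumption of ours, |B| equals the number of offspring |A| and lower fitness is better:

```python
import numpy as np

def gpba_update(pop, fit, A_pop, A_fit, rng=None):
    """GPBA-style update: replace a random set B with the best |B| of A ∪ B."""
    if rng is None:
        rng = np.random.default_rng(0)
    r = len(A_pop)                                   # |B| = |A| in this sketch
    B = rng.choice(len(pop), size=r, replace=False)  # random replacement set
    cand_pop = np.vstack([A_pop, pop[B]])
    cand_fit = np.concatenate([A_fit, fit[B]])
    keep = np.argsort(cand_fit)[:r]                  # set C: best r candidates
    pop[B], fit[B] = cand_pop[keep], cand_fit[keep]
    return pop, fit
```

Because B is drawn at random rather than consisting of the worst individuals, the scheme preserves diversity: a good offspring can only displace a randomly chosen member, not deterministically crowd out the tail of the population.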
IV Experimental Results
To verify the efficacy of Clu-DE, we perform experiments on the CEC-2017 benchmark functions, a set of 30 functions with different characteristics, including unimodal, multi-modal, hybrid multi-modal, and composition functions, summarised in Table I.
In all experiments, the maximum number of function evaluations is set to 10,000 · D, where D is the dimensionality of the search space.
The population size, crossover rate, and scaling factor are set to 50, 0.9, and 0.5, respectively, while the additional Clu-DE parameter is set to 10. Each algorithm is run 25 times independently, and we report the mean and standard deviation over these runs.
To evaluate whether there is a statistically significant difference between two algorithms, a Wilcoxon signed-rank test is performed on each function at the 95% confidence level.
Table II gives the results of Clu-DE compared to standard DE for D = 30. From the table, we can see that Clu-DE statistically outperforms DE for 16 of the 30 functions, while obtaining equivalent performance for 12 functions. Only for two of the multi-modal functions does Clu-DE yield inferior results.
Wins/ties/losses for Clu-DE: 16/12/2
When increasing the number of dimensions to 50, for which the results are listed in Table III, Clu-DE retains its efficacy. As can be seen, it statistically outperforms standard DE for 12 of the 30 functions, while giving similar results for 16 functions.
Wins/ties/losses for Clu-DE: 12/16/2
For D = 100, the results are given in Table IV. As we can see, Clu-DE obtains better or similar results for 24 of the 30 functions, thus clearly outperforming DE also for high-dimensional problems.
Wins/ties/losses for Clu-DE: 14/10/6
Last but not least, Fig. 2 shows convergence curves of our proposed algorithm compared to DE for F10 and F15, as representative examples, over all dimensionalities. As can be observed, Clu-DE converges faster than standard DE.
In this paper, we have proposed a novel differential evolution algorithm, Clu-DE, based on a novel clustering-based mutation operator. A promising region in the search space is found using k-means clustering, and new candidate solutions are generated using the proposed clustering-based mutation. A population update scheme is introduced to include the new candidate solutions in the current population. Extensive experiments on the CEC-2017 benchmark functions for dimensionalities of 30, 50, and 100 verify that Clu-DE is a competitive variant of DE. In future work, we intend to extend Clu-DE to multi-objective optimisation problems.
References

- (2018) Wavelets optimization method for evaluation of fractional partial differential equations: an application to financial modelling. Advances in Difference Equations 2018 (1), pp. 8.
- Differential evolution-based neural network training incorporating a centroid-based strategy and dynamic opposition-based learning. In IEEE Congress on Evolutionary Computation, pp. 2958–2965.
- (2019) Adaptive k-tournament mutation scheme for differential evolution. Applied Soft Computing 85, pp. 105776.
- (2011) A clustering-based differential evolution for global optimization. Applied Soft Computing 11 (1), pp. 1363–1379.
- (2009) Automatic image pixel clustering with an improved differential evolution. Applied Soft Computing 9 (1), pp. 226–236.
- (2005) A population-based algorithm-generator for real-parameter optimization. Soft Computing 9 (4), pp. 236–253.
- (2011) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm and Evolutionary Computation 1 (1), pp. 3–18.
- (2020) Self-adaptive collective intelligence-based mutation operator for differential evolution algorithms. The Journal of Supercomputing 76 (2), pp. 876–896.
- (1995) Particle swarm optimization (PSO). In IEEE International Conference on Neural Networks, pp. 1942–1948.
- (1967) Some methods for classification and analysis of multivariate observations. In 5th Berkeley Symposium on Mathematical Statistics and Probability, pp. 281–297.
- (2017) Human mental search: a new population-based metaheuristic optimization algorithm. Applied Intelligence 47 (3), pp. 850–887.
- (2020) Many-level image thresholding using a center-based differential evolution algorithm. In Congress on Evolutionary Computation.
- (2019) Differential evolution algorithm based on a competition scheme. In 14th International Conference on Computer Science and Education.
- (2020) CenPSO: a novel center-based particle swarm optimization algorithm for large-scale optimization. In International Conference on Systems, Man, and Cybernetics.
- (2020) Evolving feedforward neural networks using a quasi-opposition-based differential evolution for data classification. In IEEE Symposium Series on Computational Intelligence.
- (2021) RDE-OP: a region-based differential evolution algorithm incorporating opposition-based learning for optimising the learning process of multi-layer neural networks. In 24th International Conference on the Applications of Evolutionary Computation.
- (2019) A global-best guided human mental search algorithm with random clustering strategy. In International Conference on Systems, Man and Cybernetics, pp. 3174–3179.
- (2012) Differential evolution with neighborhood mutation for multimodal optimization. IEEE Transactions on Evolutionary Computation 16 (5), pp. 601–614.
- (1997) Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization 11 (4), pp. 341–359.
- (2019) A differential evolution-oriented pruning neural network model for bankruptcy prediction. Complexity 2019.
- (2011) Differential evolution with composite trial vector generation strategies and control parameters. IEEE Transactions on Evolutionary Computation 15 (1), pp. 55–66.
- (2016) Problem definitions and evaluation criteria for the CEC 2017 competition on constrained real-parameter optimization. Technical report, Nanyang Technological University, Singapore.
- (2016) Differential evolution with multi-population based ensemble of mutation strategies. Information Sciences 329, pp. 329–345.