Many optimisation problems can be formulated as searching for the best element or solution among all feasible solutions with respect to one or several criteria. While conventional optimisation algorithms such as gradient-based methods are rooted in a solid mathematical formulation, they often fail to provide satisfactory results and stagnate in local optima.
Population-based metaheuristic algorithms are able to address this issue [5]. Evolutionary algorithms, e.g. genetic algorithms (GAs), are inspired by biological evolutionary processes. Swarm-based algorithms, e.g. particle swarm optimisation (PSO) and artificial bee colony (ABC), imitate the social behaviour of animals, while physics-based algorithms, including the gravitational search algorithm (GSA) and the mine blast algorithm (MBA), are based on the laws of physics.
Human Mental Search (HMS) is a relatively recent population-based metaheuristic algorithm based on the concept of exploring the search space of online auctions. Here, mental search, a main part of the algorithm, employs Levy flight to explore the vicinity of candidate solutions. HMS has been shown to effectively solve a wide range of optimisation problems, including unimodal, multi-modal, high-dimensional, rotated, shifted, and complex functions, as well as various machine vision applications including multi-level thresholding [11, 12], colour quantisation [13, 14], image segmentation, and image clustering.
The HMS algorithm has three main operators: mental search, which explores the vicinity of candidate solutions based on a Levy flight distribution; grouping, which clusters the current population to find a promising region; and movement, which moves candidate solutions towards the promising region. Several improvements to HMS have recently been introduced, including leveraging a random clustering strategy and grouping in both search and objective space.
In the grouping phase of HMS, a clustering algorithm is employed and the cluster with the best mean objective function value is selected as the winner cluster. However, this approach has a tendency to get stuck in a local optimum. To address this issue, in this paper we propose a novel HMS algorithm, MCS-HMS, which is based on multi-cluster selection. Here, the best candidate solution in each cluster has a chance of being selected. We further employ a more efficient one-step $k$-means algorithm for clustering. Extensive experiments on various benchmark functions with different characteristics show that MCS-HMS outperforms HMS as well as other population-based metaheuristic algorithms.
II. Human Mental Search
HMS is a metaheuristic algorithm inspired by the way the human mind searches. Each candidate solution, called a bid in HMS, is initially randomly generated. HMS then proceeds iteratively based on three main operators: mental search, grouping, and movement.
During mental search, a number of new bids are obtained for each bid as

\[ NS = x + S, \]

where $x$ is the current bid, and

\[ S = \left(2 - \frac{\mathit{NFE}}{\mathit{MaxNFE}}\right) \times 0.01 \times \frac{u}{|v|^{1/\beta}} \otimes (x - x_{\mathrm{best}}), \]

where $\mathit{NFE}$ represents the number of objective function evaluations thus far, $\mathit{MaxNFE}$ is the maximum number of function evaluations, the number of new bids generated per bid is a random integer, and $x_{\mathrm{best}}$ is the best bid found so far. $u$ and $v$ are two random numbers with normal distributions

\[ u \sim N(0, \sigma_u^2), \qquad v \sim N(0, \sigma_v^2), \]

with

\[ \sigma_u = \left[ \frac{\Gamma(1+\beta)\,\sin(\pi\beta/2)}{\Gamma\!\left(\frac{1+\beta}{2}\right)\beta\, 2^{(\beta-1)/2}} \right]^{1/\beta}, \qquad \sigma_v = 1, \]

where $\Gamma(\cdot)$ is the standard gamma function.
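As an illustration, the mental search operator can be sketched in Python as below. This is a minimal sketch under stated assumptions: the Levy exponent beta = 1.5 and the 0.01 step-scaling factor are common choices in Levy-flight implementations rather than values given here, and all function and variable names are illustrative.

```python
import math
import numpy as np

def levy_step(beta, size, rng):
    # Mantegna's algorithm for a Levy-distributed step:
    # s = u / |v|^(1/beta), with u ~ N(0, sigma_u^2) and v ~ N(0, 1).
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def mental_search(bid, best_bid, nfe, max_nfe, q, beta=1.5, rng=None):
    # Generate q new bids around the current bid via Levy flight;
    # the step size shrinks as more function evaluations are consumed.
    rng = np.random.default_rng() if rng is None else rng
    scale = (2 - nfe / max_nfe) * 0.01
    return [bid + scale * levy_step(beta, bid.size, rng) * (bid - best_bid)
            for _ in range(q)]
```

Each call generates a batch of new bids whose spread decreases over the run, which matches the intended shift from exploration to exploitation.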
The grouping operator uses the $k$-means algorithm to cluster similar bids in the population. Then, for each cluster, the mean objective function value is calculated and the cluster with the best value is selected as the winner cluster to represent a promising area in search space.
Bids in the other clusters then move towards the identified promising area by

\[ x_{n+1,t} = x_{n,t} + C \left( r \times \mathit{winner}_t - x_{n,t} \right), \]

where $\mathit{winner}$ is the best solution in the promising area, $C$ is a constant, $r$ is a random number in $[0,1]$, $n$ indicates the current iteration, and subscript $t$ indicates the $t$-th element of a bid.
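The movement step can be sketched as below; this is a minimal illustration in which the function name is ours and the value of the constant C is an assumed placeholder, with the random number drawn per element following the elementwise form of the update.

```python
import numpy as np

def movement(bid, winner, C=2.0, rng=None):
    # Move a bid towards the winner bid of the promising area:
    # x_{n+1,t} = x_{n,t} + C * (r * winner_t - x_{n,t}), r ~ U(0, 1).
    rng = np.random.default_rng() if rng is None else rng
    r = rng.random(bid.shape)
    return bid + C * (r * winner - bid)
```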
III. Proposed MCS-HMS Algorithm
One of the drawbacks of the standard HMS algorithm is that selecting the cluster with the best mean objective function value does not necessarily give the best choice for the promising area, since that cluster might be located far away from the global optimum. Choosing the best cluster and its best bid may thus limit diversification and may mislead the next generations to a less successful area. In this paper, we therefore introduce a novel modification of HMS to tackle this issue and find a better promising area.
Our idea is to store a number of promising bids in a memory and select from these. Specifically, instead of selecting the best bid in the cluster with the best mean objective function value, we consider the best bid of each cluster, as illustrated in Fig. 1. The memory keeps only the bid with the best objective function value of each cluster, and thus the length of the memory equals the number of clusters.
In the next step, we randomly select a bid from the memory as the target for the movement operator, and thus the corresponding cluster as the promising area in search space. In contrast to standard HMS, this mechanism leads to improved exploration since the target can be selected from the whole search space, and can also prevent premature convergence.
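The multi-cluster selection can be sketched as follows. This is a minimal illustration assuming minimisation; all names are ours rather than from the paper.

```python
import numpy as np

def select_target(bids, fitness, labels, rng=None):
    # The memory holds the index of the best (lowest-fitness) bid of each
    # cluster; one entry is then drawn uniformly at random as the target.
    rng = np.random.default_rng() if rng is None else rng
    memory = [idx[np.argmin(fitness[idx])]
              for idx in (np.where(labels == c)[0] for c in np.unique(labels))]
    choice = rng.choice(memory)
    return bids[choice], labels[choice]
```

Because every cluster contributes one candidate to the memory, the chosen target can come from anywhere in the search space rather than only from the single best-mean cluster.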
In addition, we address the relative inefficiency of HMS that is due to its application of $k$-means in the grouping process. Inspired by previous work, in MCS-HMS we replace $k$-means with a one-step $k$-means algorithm, that is, only a single iteration of $k$-means is conducted, leading to quicker execution.
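A one-step $k$-means pass can be sketched as below: a single assignment of bids to randomly chosen centroids, with no centroid-update iterations. The initialisation scheme (sampling centroids from the population) is our assumption, and the function name is illustrative.

```python
import numpy as np

def one_step_kmeans(bids, k, rng=None):
    # Pick k random bids as centroids, then assign every bid to its
    # nearest centroid; no further refinement iterations are performed.
    rng = np.random.default_rng() if rng is None else rng
    centroids = bids[rng.choice(len(bids), size=k, replace=False)]
    dists = np.linalg.norm(bids[:, None, :] - centroids[None, :, :], axis=2)
    return np.argmin(dists, axis=1)
```

Skipping the refinement loop trades cluster quality for speed, which is acceptable here since the clusters only serve to propose candidate promising regions.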
IV. Experimental Results
For evaluation, we use the 30 CEC 2017 benchmark functions, which include unimodal (F1 to F3), multi-modal (F4 to F10), hybrid multi-modal (F11 to F20), and composite (F21 to F30) test functions. Besides comparing MCS-HMS with standard HMS, we also benchmark it against a number of other population-based metaheuristic algorithms, namely particle swarm optimisation (PSO), the covariance matrix adaptation evolution strategy (CMA-ES), artificial bee colony (ABC), the grey wolf optimiser (GWO), moth flame optimisation (MFO), and the whale optimisation algorithm (WOA).
As dimensionality of the search space, we use $D = 30$, 50, and 100. The employed parameters of the various algorithms are given in Table I.
| Algorithm | Parameter | Value |
| PSO | inertia constant | 1 to 0 |
| HMS | number of clusters | 5 |
|  | minimum mental processes | 2 |
|  | maximum mental processes | 5 |
| MCS-HMS | same settings as HMS |  |
Since the employed methods are stochastic, we run all algorithms 25 times and report the mean over these runs in terms of the difference between the optimal value and the value returned by an algorithm. The obtained results are given in Tables II, III, and IV for $D = 30$, 50, and 100, respectively, and show that MCS-HMS not only provides superior performance compared to standard HMS but also outperforms all the other algorithms.
Its superiority over standard HMS is further illustrated in Figs. 2, 3, and 4, which compare the rankings of HMS and MCS-HMS. From these, we can see that in the vast majority of cases MCS-HMS is ranked first or second, and that in particular MCS-HMS gives the best result for 24, 25, and 27 functions for $D = 30$, 50, and 100, respectively.
Tables V, VI, and VII show, for each algorithm, the average rank, best rank, worst rank, and standard deviation of the ranks over the 30 benchmark functions for $D = 30$, 50, and 100, respectively.
As can be seen from there, MCS-HMS clearly yields the best performance for all dimensionalities, and by a significant margin. Table VIII further illustrates this with a pairwise comparison of MCS-HMS against all other methods. It is evident that in the large majority of cases MCS-HMS outperforms the other approaches.
Last but not least, we conduct a two-sided Wilcoxon signed rank test to assess whether the differences between MCS-HMS and its competitors are statistically significant; the results are given in Table IX. With all $p$-values below 0.05, it is apparent that MCS-HMS statistically significantly outperforms all other algorithms.
| Comparison | $D = 30$ | $D = 50$ | $D = 100$ |
| MCS-HMS vs. PSO | 0.0148 | 0.0034 | 0.0132 |
| MCS-HMS vs. CMA-ES | 1.73E-06 | 4.73E-06 | 2.35E-06 |
| MCS-HMS vs. ABC | 2.13E-06 | 2.88E-06 | 4.29E-06 |
| MCS-HMS vs. GWO | 0.0082 | 0.0057 | 8.31E-04 |
| MCS-HMS vs. MFO | 1.73E-06 | 1.92E-06 | 1.73E-06 |
| MCS-HMS vs. WOA | 1.73E-06 | 1.73E-06 | 0.003 |
| MCS-HMS vs. HMS | 5.29E-04 | 1.15E-04 | 7.51E-05 |
V. Conclusions
In this paper, we have introduced MCS-HMS, an enhanced version of the human mental search (HMS) optimisation algorithm. MCS-HMS employs a multi-cluster strategy in HMS's grouping phase which allows for improved exploration ability, while also using a one-step $k$-means algorithm for clustering to speed up the algorithm's execution. Based on a set of experiments carried out on the CEC 2017 test functions, we have shown that MCS-HMS not only significantly improves the standard HMS algorithm but also outperforms a number of other population-based metaheuristics.
-  L. Cui, G. Li, Z. Zhu, Q. Lin, K.-C. Wong, J. Chen, N. Lu, and J. Lu, “Adaptive multiple-elites-guided composite differential evolution algorithm with a shift mechanism,” Information Sciences, vol. 422, pp. 122–143, 2018.
-  S. J. Mousavirad and S. Rahnamayan, “Differential evolution algorithm based on a competition scheme,” in 14th International Conference on Computer Science Education, 2019, pp. 929–934.
-  S. J. Mousavirad, A. A. Bidgoli, H. E. Komleh, and G. Schaefer, “A memetic imperialist competitive algorithm with chaotic maps for multi-layer neural network training,” International Journal of Bio-Inspired Computing, vol. 14, no. 4, pp. 227–236, Jan. 2019.
-  K. Hussain, M. N. M. Salleh, S. Cheng, and Y. Shi, “Metaheuristic research: a comprehensive survey,” Artificial Intelligence Review, vol. 52, no. 4, pp. 2191–2233, 2019.
-  D. Whitley, “A genetic algorithm tutorial,” Statistics and Computing, vol. 4, no. 2, pp. 65–85, 1994.
-  Y. Shi and R. Eberhart, “A modified particle swarm optimizer,” in IEEE International Conference on Evolutionary Computation, 1998, pp. 69–73.
-  D. Karaboga and B. Basturk, “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm,” Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
-  E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, “GSA: a gravitational search algorithm,” Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
-  A. Sadollah, A. Bahreininejad, H. Eskandar, and M. Hamdi, “Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems,” Applied Soft Computing, vol. 13, no. 5, pp. 2592–2612, 2013.
-  S. J. Mousavirad and H. Ebrahimpour-Komleh, “Human mental search: a new population-based metaheuristic optimization algorithm,” Applied Intelligence, vol. 47, no. 3, pp. 850–887, 2017.
-  S. J. Mousavirad and H. Ebrahimpour-Komleh, “Human mental search-based multilevel thresholding for image segmentation,” Applied Soft Computing, vol. 97, p. 105427, 2020.
-  L. Esmaeili, S. J. Mousavirad, and A. Shahidinejad, “An efficient method to minimize cross-entropy for selecting multi-level threshold values using an improved human mental search algorithm,” Expert Systems with Applications, vol. 182, p. 115106, 2021.
-  S. J. Mousavirad, G. Schaefer, H. Fang, X. Liu, and I. Korovin, “Colour quantisation by human mental search,” in International Conference on Swarm Intelligence, 2020, pp. 130–141.
-  S. J. Mousavirad, G. Schaefer, M. E. Celebi, H. Fang, and X. Liu, “Colour quantisation using human mental search and local refinement,” in IEEE International Conference on Systems, Man, and Cybernetics, 2020, pp. 3045–3050.
-  S. J. Mousavirad, H. Ebrahimpour-Komleh, and G. Schaefer, “Automatic clustering using a local search-based human mental search algorithm for image segmentation,” Applied Soft Computing, vol. 96, p. 106604, 2020.
-  S. J. Mousavirad, H. Ebrahimpour-Komleh, and G. Schaefer, “Effective image clustering based on human mental search,” Applied Soft Computing, vol. 78, pp. 209–220, 2019.
-  S. J. Mousavirad, G. Schaefer, and I. Korovin, “A global-best guided human mental search algorithm with random clustering strategy,” in IEEE International Conference on Systems, Man and Cybernetics, 2019, pp. 3174–3179.
-  S. J. Mousavirad, G. Schaefer, I. Korovin, D. Oliva, M. Helali Moghadam, and M. Saadatmand, “HMS-OS: Improving the human mental search optimisation algorithm by grouping in both search and objective space,” in IEEE Symposium Series on Computational Intelligence, 2021.
-  S. J. Mousavirad, G. Schaefer, L. Esmaeili, and I. Korovin, “On improvements of the human mental search algorithm for global optimisation,” in IEEE Congress on Evolutionary Computation, 2020, pp. 1–8.
-  G. Wu, R. Mallipeddi, and P. Suganthan, “Problem definitions and evaluation criteria for the CEC 2017 competition on constrained real-parameter optimization,” National University of Defense Technology, Changsha, Hunan, PR China and Kyungpook National University, Daegu, South Korea and Nanyang Technological University, Singapore, Technical Report, 2017.
-  N. Hansen and A. Ostermeier, “Completely derandomized self-adaptation in evolution strategies,” Evolutionary Computation, vol. 9, no. 2, pp. 159–195, 2001.
-  S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Grey wolf optimizer,” Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
-  S. Mirjalili, “Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm,” Knowledge-Based Systems, vol. 89, pp. 228–249, 2015.
-  S. Mirjalili and A. Lewis, “The whale optimization algorithm,” Advances in Engineering Software, vol. 95, pp. 51–67, 2016.