QPSO-CD: Quantum-behaved Particle Swarm Optimization Algorithm with Cauchy Distribution

by   Amandeep Singh Bhatia, et al.

Motivated by particle swarm optimization (PSO) and quantum computing theory, we present a quantum variant of PSO mutated with a Cauchy operator and a natural selection mechanism drawn from evolutionary computation (QPSO-CD). The performance of the proposed hybrid quantum-behaved particle swarm optimization with Cauchy distribution (QPSO-CD) is investigated and compared with its counterparts on a set of benchmark problems. Moreover, QPSO-CD is employed on well-studied constrained engineering problems to investigate its applicability. Further, the correctness and time complexity of QPSO-CD are analysed and compared with the classical PSO. It is shown that QPSO-CD handles such real-life problems efficiently and attains superior solutions in most of the problems. The experimental results show that QPSO associated with the Cauchy distribution and the natural selection strategy outperforms other variants in terms of stability and convergence.




I Introduction

In the late 19th century, the theory of classical mechanics experienced several issues in describing the physical phenomena of light masses and high-velocity microscopic particles. In the 1920s, Bohr's atomic theory bohr1928quantum , Heisenberg's discovery of quantum mechanics robertson1929uncertainty and Schrödinger's discovery of wave mechanics wessels1979schrodinger influenced the conception of a new field, quantum mechanics. In 1982, Feynman feynman1982simulating stated that quantum mechanical systems could be simulated efficiently by quantum computers, whereas classical computers appear to require exponential resources. Until then, the concept of quantum computing was thought to be only a theoretical possibility, but over the last three decades research has evolved so as to make quantum computing applications a realistic possibility wang2012handbook .

In the last two decades, the field of swarm intelligence has received an overwhelming response from the research community. It is inspired by nature and aims to build decentralized, self-organized systems through the collective behavior of individual agents interacting with each other and with their environment. The research foundation of swarm intelligence is constructed mostly upon two families of optimization algorithms: ant colony optimization (Dorigo et al. dorigo1999ant and Colorni et al. colorni1992distributed 1992) and particle swarm optimization (PSO) (Kennedy & Eberhart kennedy1995particle 1995). Swarm intelligence was originally inspired by natural behaviors such as the flocking of birds and the swarming of ants.

In the mid-1990s, the particle swarm optimization technique was introduced for continuous optimization, motivated by the flocking of birds. PSO-based bio-inspired techniques have developed rapidly over the last two decades, attracting attention from fields such as inventory planning wang2014modified , power systems alrashidi2009survey , manufacturing yildiz2009novel , communication networks latiff2007energy , support vector machines, the estimation of binary inspiral signals wang2010particle , gravitational waves normandin2018particle and many more. Similar to evolutionary genetic algorithms, PSO is inspired by the simulation of social behavior, where each individual is called a particle and a group of individuals is called a swarm. In a multi-dimensional search space, the position and velocity of each particle represent a probable solution. Particles fly around the search space seeking potential solutions. At each iteration, each particle adjusts its position according to its own goal and those of its neighbors, and each particle in a neighborhood shares information with the others sun2004particle . Each particle keeps a record of the best solution it has experienced so far and uses it to update its position and adjust its velocity accordingly.

Figure 1: Particles movement in PSO and QPSO algorithm

Since the first PSO algorithm was proposed, several PSO variants have been introduced with a plethora of alterations. Recently, the combination of quantum computing, mathematics and computer science has inspired the creation of new optimization techniques. Narayanan and Moore narayanan1996quantum introduced the quantum-inspired genetic algorithm (QGA) in 1996. Later, Sun et al. sun2004particle applied the laws of quantum mechanics to PSO and proposed quantum-behaved particle swarm optimization (QPSO). This marked the commencement of quantum-behaved optimization algorithms, which have subsequently made a significant impact on the academic and research communities alike.

Recently, Yuanyuan and Xiyu yuanyuan2018quantum proposed a quantum evolutionary algorithm to discover communities in complex social networks; its applicability was tested on five real social networks and the results were compared with classical algorithms. It has been shown that PSO lacks the ability to escape local optima, i.e. it is difficult for PSO to leave a locally optimal region once confined there. QPSO with a mutation operator (QPSO-MO) was proposed to enhance diversity and escape from local optima during search liu2005quantum . Protopopescu and Barhen protopopescu2002solving solved a set of global optimization problems efficiently using quantum algorithms. In the future, the proposed algorithm could be integrated with a matrix-product-state-based quantum classifier for supervised learning 40 ; 44 .

In this paper, we combine QPSO with a Cauchy mutation operator, which adds a long-jump ability for global search, and a natural selection mechanism for the elimination of particles. The results show that the hybrid has a strong tendency to escape local optima. Consequently, the proposed hybrid QPSO strengthens both local and global search ability and, owing to its fast convergence, outperforms the other variants of QPSO and PSO.

The movement of particles in the PSO and QPSO algorithms is illustrated in Fig. 1. The big circle at the center denotes the particle at the global best position, and the other circles are the remaining particles. Particles located away from the global best position are lagged particles. The blue arrows signify the directions of the other particles, and the big red arrows point in the direction a particle moves with high probability. During iterations, if a lagged particle is unable to find a better position than the present global best in PSO, its impact on the other particles is null. In QPSO, however, the lagged particles move with higher probability in the direction of the gbest position. Thus, lagged particles contribute more to the solution in QPSO than in PSO.

The rest of this paper is organized as follows: Sect. 2 is devoted to prior work. In Sect. 3, quantum particle swarm optimization is described. In Sect. 4, the proposed hybrid QPSO algorithm with Cauchy distribution and natural selection mechanism is presented. In Sect. 5, experimental results are plotted for a set of benchmark problems and compared with several QPSO variants. The correctness and time complexity are analyzed in Sect. 6. QPSO-CD is applied to three constrained engineering design problems in Sect. 7. Finally, Sect. 8 concludes the paper.

II Prior Work

Since quantum-behaved particle swarm optimization was proposed, various revised variants have emerged. Initially, Sun et al. sun2004particle applied the concept of quantum computing to PSO and developed a quantum Delta potential well model for classical PSO sun2004global . It has been shown that the convergence and performance of QPSO are superior to those of classical PSO; the selection and control of its parameters, which can further improve performance, was posed as an open problem. Sun et al. sun2007using tested the performance of QPSO on constrained and unconstrained problems and claimed that QPSO is a promising optimization algorithm that performs better than classical PSO algorithms. In 2011, Sun et al. sun2011quantum proposed QPSO with a Gaussian distribution (GAQPSO) using a local attractor point and compared its results with several PSO and QPSO counterparts. GAQPSO was shown to be efficient and stable, with superior solution quality and robustness.

Further, Coelho dos2010gaussian applied GQPSO to constrained engineering problems and showed that its simulation results are much closer to the optimal solution, with small standard deviation. Li et al. li2012improved presented a cooperative QPSO using the Monte Carlo method (CQPSO), in which particles cooperate with each other to enhance the performance of the original algorithm; implemented on several representative functions, it performed better than other QPSO algorithms in terms of computational cost and solution quality. Peng et al. peng2013quantum introduced QPSO with the Levy probability distribution and claimed that it is far less likely to become stuck in a local optimum.

Researchers have applied PSO and QPSO to real-life problems and achieved better solutions than existing algorithms. Ali et al. 90 performed energy-efficient clustering in mobile ad-hoc networks (MANETs) with PSO; a similar approach can be followed to analyse and execute mobility over MANETs with QPSO-CD 88 . Zhisheng zhisheng2010quantum used QPSO for economic load dispatch in power systems and proved it superior to other existing PSO optimization algorithms. Sun et al. sun2006qpso applied QPSO to QoS multicast routing: the routing problem is first converted into a constrained integer problem and then effectively solved by QPSO with a loop-deletion task. Its performance was investigated on random network topologies, and QPSO proved more powerful than PSO and the genetic algorithm. Geis and Middendorf proposed a PSO with a helix structure for finding ribonucleic acid (RNA) secondary structures of given structure and low energy geis2011particle . The QPSO-CD algorithm can be used with two-way quantum finite automata to model RNA secondary structures bhatia2018modeling . Bagheri et al. bagheri2014financial applied QPSO to tune the parameters of an adaptive network-based fuzzy inference system (ANFIS) for forecasting financial prices in futures markets. Davoodi et al. davoodi2014hybrid introduced a hybrid improved QPSO with the Nelder-Mead simplex method (IQPSO-NM), where the NM method is used to fine-tune solutions; applied to load flow problems of power systems, it converged accurately with efficient search ability. Omkar omkar2009quantum proposed QPSO for multi-objective design problems and compared the results with PSO. Recently, Fatemeh et al. fatemeh2019shuffled proposed QPSO with shuffled complex evolution (SP-QPSO) and demonstrated its performance on five engineering design problems.
Prithi and Sumathi prithi2020ld2fa integrated classical PSO with deterministic finite automata for data transmission and intrusion detection. The proposed QPSO-CD algorithm can likewise be used with quantum computational models for wireless communication 10 ; 20 ; 30 .

III Quantum Particle Swarm Optimization

Before explaining our hybrid QPSO-CD algorithm mutated with the Cauchy operator and the natural selection method, it is useful to define the notion of quantum PSO. We assume that the reader is familiar with classical PSO; otherwise, the reader can refer to kennedy2010particle ; shi2001particle . The principle of quantum PSO is as follows:

In QPSO, the state of a particle is represented by a wave function ψ(x, t) instead of by a position and a velocity. The probability density function |ψ(x, t)|² is used to determine the probability of the particle appearing at position x at any time t sun2004particle ; sun2006qpso . The positions of the particles are updated according to the equations:

X_{i,j}(k+1) = p_{i,j} ± β |mbest_j − X_{i,j}(k)| ln(1/u)    (1)

p_{i,j} = φ P_{i,j} + (1 − φ) G_j,  φ = c1 r1 / (c1 r1 + c2 r2)    (2)

where each particle must converge to its local attractor p_i = (p_{i,1}, …, p_{i,D}); D is the dimension; N and M are the numbers of particles and iterations, respectively; P_i and G denote the previous (personal) best and optimal (global) best position vectors, respectively; c1 and c2 are the acceleration coefficients; r1, r2 and u are uniformly distributed random numbers in (0, 1); β is the contraction-expansion coefficient; and mbest defines the mean of the best positions of the particles as:

mbest_j = (1/N) Σ_{i=1}^{N} P_{i,j}    (3)

In Eq. (1), β denotes the contraction-expansion coefficient, which is set manually to control the speed of convergence; it can be decreased linearly or held fixed. In QPSO, β < 1.782 ensures convergence of the particle. In QPSO-CD, the value of β is determined by β = 1 − (1.0 − 0.5)k/M, i.e. β decreases linearly from 1.0 to 0.5 to attain good performance, where k is the present iteration and M is the maximum number of iterations.
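As a concrete illustration, the update rules (1)-(3) can be sketched in a few lines of NumPy. This is our own vectorized sketch, not the authors' code; the array names X, P and G are assumptions:

```python
import numpy as np

def qpso_update(X, P, G, beta, rng):
    """One QPSO position update, following Eqs. (1)-(3).

    X    : (N, D) current particle positions
    P    : (N, D) personal best positions
    G    : (D,)   global best position
    beta : contraction-expansion coefficient
    rng  : np.random.Generator
    """
    N, D = X.shape
    mbest = P.mean(axis=0)                        # Eq. (3): mean best position
    phi = rng.random((N, D))                      # phi in (0, 1)
    p = phi * P + (1.0 - phi) * G                 # Eq. (2): local attractor
    u = rng.random((N, D))                        # u ~ U(0, 1)
    sign = np.where(rng.random((N, D)) < 0.5, 1.0, -1.0)
    # Eq. (1): sample the new position around the local attractor
    return p + sign * beta * np.abs(mbest - X) * np.log(1.0 / u)
```

Note that with β = 0 the update collapses onto the local attractor p, which lies coordinate-wise between the personal best and the global best.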

IV Hybrid Particle Swarm Optimization

The hybrid quantum-behaved PSO algorithm with Cauchy distribution and natural selection strategy (QPSO-CD) is described as follows:

The QPSO-CD algorithm begins with the standard QPSO using equations (1), (2) and (3). The position and velocity of a particle cannot be determined exactly because of its varying dynamic behavior, so they can only be learned through the probability density function. Each particle can be mutated with a Gaussian or a Cauchy distribution; we mutate QPSO with the Cauchy operator because of its ability to make larger perturbations, which gives a higher probability of escaping a local-optimum region than the Gaussian distribution. The QPSO algorithm is mutated with the Cauchy distribution to increase its diversity, where mbest or the global best position is mutated with a fixed mutation probability (Pr). The probability density function of the standard Cauchy distribution is given as:

f(x) = 1 / (π(1 + x²)),  −∞ < x < ∞
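Because the Cauchy density has heavy tails, samples drawn from it occasionally make very large jumps. A small sketch (our own helper names; inverse-transform sampling is one standard way to draw Cauchy variates):

```python
import math
import random

def cauchy_pdf(x):
    """Density of the standard Cauchy distribution: 1 / (pi * (1 + x^2))."""
    return 1.0 / (math.pi * (1.0 + x * x))

def cauchy_sample(rng=random):
    """Standard Cauchy variate via inverse-transform sampling."""
    return math.tan(math.pi * (rng.random() - 0.5))

def cauchy_mutate(position, pr, scale=1.0, rng=random):
    """Mutate each coordinate with probability pr by adding a scaled
    Cauchy value; the heavy tails give the long-jump ability described above."""
    return [x + scale * cauchy_sample(rng) if rng.random() < pr else x
            for x in position]
```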


It should be noted that the mutation operation is executed on each vector by independently adding a Cauchy-distributed random value D(·) such that

x′ = x + D(·)

where x′ is the new location after x is mutated with the random value. Finally, the position of each particle is selected and the particles of the swarm are sorted by their fitness values after each iteration; the group of particles with the worst fitness values is then substituted by the best ones, and the optimal solution is determined. The main objective of using the natural selection mechanism is to refine the capability and accuracy of the QPSO algorithm.

1:  Initialize the swarm with uniformly distributed random positions: X_i = random(X_min, X_max)
2:  Initialize the personal best positions P_i = X_i and the global best position G
3:  β decreases linearly from 1.0 to 0.5
4:  for k = 1 to M do
5:      Calculate the mbest of the swarm using Eq. (3)
6:      if rand() < Pr then
7:          Mutate mbest by adding a Cauchy random value
8:      end if
9:      for i = 1 to N do
10:         if Fitness(X_i) < Fitness(P_i) then
11:             P_i = X_i
12:         end if
13:         for j = 1 to D do
14:             φ = rand(); u = rand()
15:             p_{i,j} = φ P_{i,j} + (1 − φ) G_j
16:             if rand() < 0.5 then
17:                 X_{i,j} = p_{i,j} + β |mbest_j − X_{i,j}| ln(1/u)
18:             else
19:                 X_{i,j} = p_{i,j} − β |mbest_j − X_{i,j}| ln(1/u)
20:             end if
21:         end for
22:         f_i = Fitness(X_i)
23:     end for
24:     Update the global best position G
25:     S = 2 ; S is the selection parameter
26:     Sort the particles from best to worst position and replace the worst N/S positions with the best N/S
27: end for
28: until the termination criterion is met
Algorithm 1 QPSO-CD algorithm
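Algorithm 1 can be condensed into a short Python sketch. This is a minimal interpretation under stated assumptions, not the authors' implementation: positions are clipped to the search bounds, the Cauchy mutation is applied to mbest with probability pr, and natural selection copies the best half of the positions over the worst half (S = 2):

```python
import numpy as np

def qpso_cd(f, bounds, n_particles=20, n_iter=200, pr=0.2, seed=0):
    """Minimal QPSO-CD sketch: QPSO update + Cauchy mutation of mbest
    + natural-selection replacement of the worst half each iteration.
    f      : objective to minimize, maps a (D,) array to a float
    bounds : (low, high) arrays defining a box-shaped search space
    """
    rng = np.random.default_rng(seed)
    low, high = bounds
    D = len(low)
    X = rng.uniform(low, high, size=(n_particles, D))
    P = X.copy()                                   # personal best positions
    fP = np.apply_along_axis(f, 1, P)
    g = int(np.argmin(fP))                         # global best index
    for k in range(n_iter):
        beta = 1.0 - 0.5 * k / n_iter              # decreases 1.0 -> 0.5
        mbest = P.mean(axis=0)
        if rng.random() < pr:                      # Cauchy mutation of mbest
            mbest = mbest + rng.standard_cauchy(D)
        phi = rng.random((n_particles, D))
        p = phi * P + (1.0 - phi) * P[g]           # local attractors
        u = rng.random((n_particles, D))
        sign = np.where(rng.random((n_particles, D)) < 0.5, 1.0, -1.0)
        X = p + sign * beta * np.abs(mbest - X) * np.log(1.0 / u)
        X = np.clip(X, low, high)                  # keep particles in bounds
        fX = np.apply_along_axis(f, 1, X)
        improved = fX < fP
        P[improved], fP[improved] = X[improved], fX[improved]
        g = int(np.argmin(fP))
        # natural selection (S = 2): worst half copies the best half
        order = np.argsort(fP)
        half = n_particles // 2
        X[order[half:]] = X[order[:half]]
    return P[g], fP[g]
```

On the 2-D Sphere function this sketch typically reaches values far below 1 within a few hundred iterations.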

The natural selection method is used to enhance the convergence characteristics of the proposed QPSO-CD algorithm, where the fitter solutions are carried over to the next iteration. The selection procedure for N particles is as follows:

F_i(t) = f(X_i(t)),  i = 1, …, N

where X_i(t) is the position vector of particle i at time t and f is the fitness function of the swarm. The next step is to sort the particles according to their fitness values from the best to the worst position, such that

F_{(1)}(t) ≤ F_{(2)}(t) ≤ … ≤ F_{(N)}(t)

In Algorithm 1, the sorting is applied to the fitness values and the positions, respectively. On the basis of the natural selection parameter and the fitness values, the positions of the swarm particles are updated for the next iteration by replacing the worst positions with copies of the best ones, where Z = N/S, S denotes the selection parameter, and Z signifies the number of best positions selected according to their fitness values. The selection parameter S is generally set to 2, replacing the worse half of the positions with the better half. This improves the precision of the particles' directions, protects the global searching capability, and speeds up convergence.
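The replacement step can be sketched as a small standalone helper (our own function name; minimization is assumed, so lower fitness is better):

```python
def natural_selection(positions, fitnesses, S=2):
    """Sort particles from best (lowest fitness) to worst and replace the
    worst N/S positions with copies of the best N/S positions (Z = N/S)."""
    order = sorted(range(len(fitnesses)), key=fitnesses.__getitem__)
    Z = len(positions) // S
    out = [positions[i][:] for i in order]           # best-to-worst order
    out[-Z:] = [positions[i][:] for i in order[:Z]]  # copy best over worst
    return out
```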

V Experimental Results

The performance of the proposed QPSO-CD algorithm is investigated on the representative benchmark functions given in Table 1. The results are compared with classical PSO, standard QPSO, QPSO with a delta potential well (QDPSO) and QPSO with a mutation operator (QPSO-MO).

Test function          Initial range
Sphere function        (-100, 100)
Rosenbrock function    (-5.12, 5.12)
Griewank function      (-600, 600)
Rastrigin function     (-5.12, 5.12)
Table 1: Details of benchmark functions
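For reference, the four benchmark functions in Table 1 have the following standard definitions (each attains its minimum value of 0; a NumPy sketch):

```python
import numpy as np

def sphere(x):
    """Sphere function: minimum 0 at x = 0."""
    return float(np.sum(x ** 2))

def rosenbrock(x):
    """Rosenbrock function: minimum 0 at x = (1, ..., 1)."""
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                        + (x[:-1] - 1.0) ** 2))

def griewank(x):
    """Griewank function: minimum 0 at x = 0."""
    i = np.arange(1, x.size + 1)
    return float(np.sum(x ** 2) / 4000.0
                 - np.prod(np.cos(x / np.sqrt(i))) + 1.0)

def rastrigin(x):
    """Rastrigin function: minimum 0 at x = 0."""
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))
```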

The performance of QPSO has been widely tested on various test functions. We consider four representative benchmark functions to determine the reliability of the QPSO-CD algorithm. For all experiments, the population sizes are 20, 40 and 80 and the dimension sizes are 10, 20 and 30. The parameters for the QPSO-CD algorithm are as follows: β decreases linearly from 1.0 to 0.5, the natural selection parameter is S = 2, and the acceleration coefficients are both set equal to 2.

The mean best fitness values of PSO, QPSO, QDPSO, QPSO-MO and QPSO-CD are recorded over 1000, 1500 and 2000 generations for each function. Figs. 2 to 5 depict the performance of the functions in terms of mean best fitness against the number of iterations. In Tables 2 and 3, P denotes the population size, D the dimension and G the number of generations. The numerical results show that QPSO-CD attains optimal solutions with fast convergence speed and high accuracy. QPSO-CD performs better on the Rosenbrock function than its counterparts in some cases: when the population size is 20 and the dimension is 30, its results are not better than QPSO-MO's, but QPSO-CD still performs better than PSO, QPSO and QDPSO. The performance of QPSO-CD is significantly better than its variants on the Griewank and Rastrigin functions; it outperformed the other algorithms and obtained near-zero optimal solutions for the Griewank function. In most cases, QPSO-CD is more efficient and outperforms the other algorithms.

Figure 2: Effectiveness of QPSO-CD for the Sphere function
Figure 3: Effectiveness of QPSO-CD for the Rosenbrock function
Figure 4: Effectiveness of QPSO-CD for the Griewank function
Figure 5: Effectiveness of QPSO-CD for the Rastrigin function
                Sphere function                                            Rosenbrock function
P  D  G     PSO    QPSO      QDPSO      QPSO-MO    QPSO-CD     PSO     QPSO    QDPSO    QPSO-MO  QPSO-CD
20 10 1000  0.0    4.01e-40  1.513e-49  1.508e-48  1.738e-50   95.10   58.41   14.22    22.18    34.67
20 20 1500  0.0    2.58e-21  1.339e-30  1.296e-31  1.032e-30   204.38  110.5   175.31   68.40    54.76
20 30 2000  0.0    2.08e-13  1.953e-21  1.918e-21  1.808e-21   314.46  148.5   242.37   113.30   122.5
40 10 1000  0.0    2.73e-67  1.087e-73  1.146e-51  1.154e-72   70.28   10.42   15.86    7.985    8.843
40 20 1500  0.0    4.84e-28  1.397e-42  1.417e-42  1.237e-41   178.98  48.45   112.46   52.93    41.77
40 30 2000  0.0    2.02e-25  2.850e-30  2.471e-28  1.946e-23   288.58  58.32   76.42    64.19    58.04
80 10 1000  0.0    7.66e-95  5.553e-90  4.872e-71  6.437e-72   36.29   8.853   36.34    5.715    7.419
80 20 1500  0.0    1.62e-60  1.654e-54  1.677e-58  1.609e-62   84.78   34.88   23.54    24.45    21.78
80 30 2000  0.0    2.05e-44  1.042e-40  1.131e-42  1.128e-41   202.58  52.17   70.81    45.22    40.97
Table 2: Comparison results of the Sphere and Rosenbrock functions
                Griewank function                                 Rastrigin function
P  D  G     PSO     QPSO    QDPSO   QPSO-MO  QPSO-CD    PSO     QPSO    QDPSO   QPSO-MO  QPSO-CD
20 10 1000  0.089   0.078   0.1003  0.0732   0.072      5.526   5.349   4.969   4.478    4.051
20 20 1500  0.0300  0.2001  0.0086  0.0189   0.0078     23.17   21.28   17.08   15.63    13.22
20 30 2000  0.0181  0.0122  0.0544  0.0103   0.0026     46.29   32.57   48.61   27.80    31.48
40 10 1000  0.0826  0.055   0.048   0.0520   0.041      3.865   3.673   2.032   3.383    2.100
40 20 1500  0.0272  0.0149  0.0004  0.0247   0.0106     15.68   14.37   10.94   11.01    10.77
40 30 2000  0.0125  0.0117  0.0009  0.0105   0.0102     37.13   23.01   21.37   21.01    21.19
80 10 1000  0.0723  0.0341  0.0     0.0542   0.0702     2.562   2.234   0.923   2.183    1.943
80 20 1500  0.0274  0.0189  0.0     0.0194   0.0161     12.35   9.66    6.955   8.075    7.021
80 30 2000  0.0123  0.0118  0.0     0.0082   0.0031     26.89   17.48   18.13   14.99    11.73
Table 3: Comparison results of the Griewank and Rastrigin functions

VI Correctness and Time Complexity Analysis of the QPSO-CD Algorithm

In this section, the correctness and time complexity of the proposed QPSO-CD algorithm are analyzed and compared with the classical PSO algorithm.

Theorem 1.

The sequence of random variables generated by QPSO with the Cauchy distribution converges to zero in probability as n approaches infinity.

Proof. Recall that the probability density function of the standard Cauchy distribution and its convergence properties rudolph1997local are given by

f(x) = 1 / (π(1 + x²)),  −∞ < x < ∞

Consider the random variable Y = cX, where c denotes a fixed positive constant and X is standard Cauchy. By a change of variables, its probability density function can be calculated as

f_Y(y) = c / (π(c² + y²))

i.e. the probability density function of the scaled random variable. As c approaches zero, this density concentrates at the origin, so the corresponding sequence of random variables converges to zero in probability. This completes the proof of the theorem. ∎

Definition 1.

Let (s_n) be a sequence of random variables. It converges to a random variable s with probability 1 if, for every ε > 0 and δ > 0, there exists n₀ such that P(|s_n − s| > ε) < δ for all n ≥ n₀.


The efficiency of the QPSO-CD algorithm is evaluated by number of steps needed to reach the optimal region . The method is to evaluate the distribution of number of steps needed to hit

by comparing the expected value and moments of distribution. The total number of stages to reach the optimal region is determined as

. The variance

and expectation value are determined as


In fact, the depends upon the convergence of . It is needed that , so that QPSO-CD can converge globally. The number of objective function evaluations are used to measure time. The main benefit of this approach is that it shows relationship between processor and measure time as the complexity of objective function increases. We used Sphere function with a linear constraint to compute the time complexity. It has minimum value at 0. The value of optimal region is set as To determine the time complexity, the algorithms PSO and QPSO-CD are executed 40 times on with initial scope [-10, 10], where N denotes the dimension. We determine the mean number of objective function evaluations (), the variance (), the standard deviation (SD) (

), the standard error (SE) (

) and ratio of mean and dimension (). The contraction coefficient is used for QPSO-CD and constriction coefficient for PSO with acceleration factors =2.25.
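The summary statistics reported in Tables 4 and 5 can be computed from the per-run evaluation counts as follows (a small helper we introduce for illustration; the paper does not state whether sample or population variance is used, so sample variance is assumed):

```python
import math

def complexity_stats(evals, dim):
    """Summary statistics over repeated runs.

    evals : list of objective-function evaluation counts, one per run
    dim   : problem dimension N
    """
    n = len(evals)
    mean = sum(evals) / n
    var = sum((e - mean) ** 2 for e in evals) / (n - 1)  # sample variance
    sd = math.sqrt(var)                                  # standard deviation
    se = sd / math.sqrt(n)                               # standard error
    return {"mean": mean, "var": var, "sd": sd, "se": se, "mean/N": mean / dim}
```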

Dimension (N) Mean Variance SD SE Mean/N
2 302.38 4164.3 64.532 10.203 151.19
3 452.18 4541.9 67.394 10.655 150.72
4 621.29 5208.2 72.168 11.410 155.32
5 755.88 6675.0 81.701 12.918 151.17
6 879.13 8523.7 92.324 14.597 146.52
7 1022.06 9575.4 97.854 15.472 146.00
8 1158.52 10269.7 101.341 16.023 144.81
9 1308.17 12053.4 109.788 17.359 145.35
10 1459.3 12648.3 112.465 17.782 145.93
Table 4: Results of the time complexity for QPSO-CD algorithm
Dimension (N) Mean Variance SD SE Mean/N
2 691.4 17297.5 131.52 20.795 345.7
3 979.1 22281.5 149.27 23.601 326.3
4 1167.2 24282.9 155.72 24.638 291.8
5 1328.7 21853.7 147.83 23.373 265.7
6 1489.9 32008.7 178.91 28.288 248.3
7 1744.3 50229.7 224.12 35.436 249.1
8 1978.5 41233.3 203.06 31.106 247.3
9 2259.1 36217.8 190.31 30.090 251.0
10 2604.2 43559.8 208.71 32.999 260.4
Table 5: Results of the time complexity for PSO algorithm
Figure 6: Time complexity results for PSO and QPSO-CD
Figure 7: Comparison of the correlation coefficients of PSO and QPSO-CD

Tables 4 and 5 show the statistical results of the time complexity test for the QPSO-CD and PSO algorithms, respectively. Fig. 6 indicates that the time complexity of the PSO algorithm increases non-linearly as the dimension increases, whereas that of the proposed algorithm increases almost linearly. Thus, the time complexity of QPSO-CD is lower than that of the PSO algorithm. In Fig. 7, QPSO-CD shows a strong linear correlation between the mean number of evaluations and N, with a correlation coefficient of 0.9996. For PSO, the linear correlation coefficient is 0.9939, which is not as high as that of QPSO-CD. The relationship between the mean and the dimension clearly shows that the correlation coefficient is fairly stable for QPSO-CD compared with the PSO algorithm.

VII QPSO-CD for Constrained Engineering Design Problems

There exist several approaches for handling constrained optimization problems. The basic principle is to convert the constrained optimization problem into an unconstrained one by combining the objective function with a penalty function, and then to minimize the newly formed objective function with any unconstrained algorithm. Generally, a constrained optimization problem can be described as in Eq. (13).

The objective is to minimize the objective function f(x) subject to equality and inequality constraint functions within the upper and lower bounds of the search space. Strict inequalities of the form g(x) < 0 can be converted into g(x) ≤ 0, and equality constraints h(x) = 0 can be replaced by the pair of inequality constraints h(x) ≤ 0 and −h(x) ≤ 0. Sun et al. sun2007using adopted a non-stationary penalty function to address non-linear programming problems using QPSO. Coelho dos2010gaussian used a penalty function with a positive constant set to 5000. We adopt the same approach but replace the constant with a dynamically allocated penalty value.


Usually, the procedure is to find values of the design variables that lie within the upper- and lower-bound constraints of the search space. If a solution violates one of the bound constraints, the violating component is reset to a value chosen with the uniformly distributed function rand[0, 1] between the bounds. Finally, the unconstrained optimization problem is solved using dynamically modified penalty values according to the inequality constraints. Thus, the objective function is evaluated as

F(x, t) = f(x) + h(t) H(x)

where f(x) is the main objective function of the optimization problem in Eq. (13), t is the iteration number, h(t) represents the dynamically allocated penalty value, and H(x) aggregates the violations of the inequality constraints.
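A minimal sketch of this dynamic-penalty conversion. The schedule h(t) = t·sqrt(t) is an assumption (one common choice in the literature); the paper only states that the penalty value is dynamically allocated:

```python
import math

def penalized(f, inequality_constraints, t):
    """Wrap objective f with a dynamic penalty on violated inequality
    constraints g_i(x) <= 0, evaluated at iteration t."""
    def F(x):
        h = t * math.sqrt(t)  # assumed dynamic penalty schedule h(t)
        violation = sum(max(0.0, g(x)) ** 2 for g in inequality_constraints)
        return f(x) + h * violation
    return F
```

A feasible point incurs no penalty, while an infeasible point is penalized more heavily as the iterations progress.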

In this section, QPSO-CD is tested on the three-bar truss, tension/compression spring and pressure vessel design problems, which consist of different members and constraints. The performance of QPSO-CD is compared and analyzed against the results of the PSO, QPSO and SP-QPSO algorithms as reported in the literature.

VII.1 Three-bar truss design problem

The three-bar truss is a constrained design optimization problem that has been widely used to test optimization methods. It takes the cross-section areas of the three bars as design variables. The aim of this problem is to minimize the weight of the truss subject to stress constraints on the bars. The structure should be symmetric and subjected to two constant loadings, as shown in Fig. 8. The mathematical formulation with two design variables (x1, x2) and three restrictive constraint functions is described as:
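Since the explicit formulation is not reproduced here, a sketch of the standard three-bar truss benchmark from the literature may help (the usual settings l = 100 cm and P = σ = 2 kN/cm² are assumptions):

```python
import math

def three_bar_truss(x, l=100.0, P=2.0, sigma=2.0):
    """Standard three-bar truss formulation, as commonly stated in the
    literature. Returns (weight, [g1, g2, g3]); feasible when all g_i <= 0."""
    x1, x2 = x
    weight = (2.0 * math.sqrt(2.0) * x1 + x2) * l
    d = math.sqrt(2.0) * x1 ** 2 + 2.0 * x1 * x2
    g1 = (math.sqrt(2.0) * x1 + x2) / d * P - sigma  # stress in bar 1
    g2 = x2 / d * P - sigma                          # stress in bar 2
    g3 = 1.0 / (x1 + math.sqrt(2.0) * x2) * P - sigma  # stress in bar 3
    return weight, [g1, g2, g3]
```

Evaluating this at design values close to those in Table 6 reproduces weights near 263.9, which supports this being the intended formulation.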

Figure 8: Structure of Three-bar truss
        PSO          QPSO         SP-QPSO      QPSO-CD
x1      0.78911058   0.788649     0.788796     0.788658
x2      0.40702683   0.408322     0.407898     0.40828488
g1      -6.6720e-06  1.6313e-07   6.4748e-06   9.00037e-06
g2      -1.4655      -1.4640      -1.4644      -1.4640
g3      -0.5345      -0.5359      -0.5354      -0.5359
f(x)    263.89686    263.89584    263.89500    263.89465
Table 6: Comparison of optimal results for the three-bar truss problem

The results obtained by QPSO-CD are compared with its counterparts in Table 6. For the three-bar truss problem, QPSO-CD is superior to the optimal solutions previously reported in the literature. The difference between the best solution obtained by QPSO-CD and those of the other algorithms is shown in Fig. 9.

Figure 9: Optimal results of the PSO, QPSO, SP-QPSO and QPSO-CD algorithms for the three-bar truss problem

VII.2 Tension/compression spring design problem

The main aim is to minimize the volume V of a spring subjected to a constant tension load, as shown in Fig. 10. Owing to the symmetry of the structure, there are practically three design variables (x1, x2, x3), where x1 is the wire diameter, x2 is the coil diameter and x3 denotes the total number of active coils. The mathematical formulation of this problem is described as:
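As with the truss, a sketch of the standard literature formulation of this benchmark may help (the constraint constants are the usual ones from the literature, assumed here):

```python
def spring_design(x):
    """Tension/compression spring design, standard literature formulation.
    x1 = wire diameter, x2 = coil diameter, x3 = number of active coils.
    Returns (volume, [g1, g2, g3, g4]); feasible when all g_i <= 0."""
    x1, x2, x3 = x
    volume = (x3 + 2.0) * x2 * x1 ** 2
    g1 = 1.0 - (x2 ** 3 * x3) / (71785.0 * x1 ** 4)       # deflection
    g2 = ((4.0 * x2 ** 2 - x1 * x2)
          / (12566.0 * (x2 ** 3 * x1 - x1 ** 4))
          + 1.0 / (5108.0 * x1 ** 2) - 1.0)               # shear stress
    g3 = 1.0 - 140.45 * x1 / (x2 ** 2 * x3)               # surge frequency
    g4 = (x1 + x2) / 1.5 - 1.0                            # outer diameter
    return volume, [g1, g2, g3, g4]
```

Evaluating it at a design close to the well-known best solution gives an objective value near 0.01267, and the last constraint value is close to the -0.73 to -0.80 range reported in Table 7.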

Figure 10: Structure of tension/compression spring
        PSO          QPSO         SP-QPSO      QPSO-CD
x1      0.0516       0.0524       0.05         0.0513
x2      0.3542       0.2505       0.25         0.2502
x3      11.7942      2            2            2
g1      -2.3006e-02  0.93145095   0.93034756   4.11004e-06
g2      -5.6059e-03  -0.17471558  -0.16568318  -0.17352479
g3      -3.9057      -50.67       -48.180      -49.561
g4      -0.7294      -0.79986567  -0.80        -0.799
f(x)    0.01305      0.00275      0.00250      0.00263
Table 7: Comparison of optimal results for the tension spring design problem

It has been observed that the QPSO algorithm with the Cauchy distribution and natural selection strategy is robust and obtains better solutions than PSO and QPSO, as shown in Table 7. The difference between the best solutions found by QPSO-CD and the other algorithms for the tension spring design problem is reported in Fig. 11.

Figure 11: Results of PSO, QPSO, SP-QPSO and QPSO-CD methods for tension spring design problem

VII.3 Pressure vessel design problem

Initially, Kannan and Kramer kannan1994augmented studied the pressure vessel design problem, with the main aim of reducing the total fabrication cost. Pressure vessels can be of any shape, but for engineering purposes a cylindrical design capped by hemispherical heads at both ends is widely used sandgren1990nonlinear . Fig. 12 shows the structure of the pressure vessel design problem. It has four design variables (x1, x2, x3, x4), where x1 denotes the shell thickness (Ts), x2 the head thickness (Th), x3 the inner radius (R) and x4 the length of the vessel (L). The objective function and constraint equations are described as:
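A sketch of the standard pressure vessel formulation as commonly restated in the literature (the cost coefficients and constraint constants below are the usual benchmark values, assumed here):

```python
import math

def pressure_vessel(x):
    """Pressure vessel design, standard literature formulation.
    x1 = shell thickness Ts, x2 = head thickness Th,
    x3 = inner radius R, x4 = length L.
    Returns (cost, [g1, g2, g3, g4]); feasible when all g_i <= 0."""
    x1, x2, x3, x4 = x
    cost = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4 + 19.84 * x1 ** 2 * x3)
    g1 = -x1 + 0.0193 * x3                 # minimum shell thickness
    g2 = -x2 + 0.00954 * x3                # minimum head thickness
    g3 = (-math.pi * x3 ** 2 * x4
          - (4.0 / 3.0) * math.pi * x3 ** 3 + 1296000.0)  # minimum volume
    g4 = x4 - 240.0                        # maximum length
    return cost, [g1, g2, g3, g4]
```

Evaluating it at the classical design (0.8125, 0.4375, 42.0984, 176.6365) reproduces the cost of about 6059.71 and the length-constraint value of about -63.36 reported in the first column of Table 8.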

Figure 12: Design of pressure vessel
        PSO           QPSO          SP-QPSO        QPSO-CD
x1      0.8125        0.7783        0.7782         0.7776
x2      0.4375        0.3849        0.3845         0.3848
x3      42.0984       40.3289       40.3206        40.3278
x4      176.6365      199.8899      199.9988       199.8865
g1      -4.500e-15    4.777e-05     -1.242e-05     7.2654e-04
g2      -0.035880     -1.62294e-04  1.58523e-04    -7.2787e-05
g3      -1.164e-10    -97.39720071  -63.63686942   -0.734359
g4      -63.3634      -40.1100      -40.0012       -40.1135
f(x)    6059.714      5886.189      5885.268       5886.137
Table 8: Comparison of optimal results for the pressure vessel design problem
Figure 13: Optimal results of PSO, QPSO, SP-QPSO and QPSO-CD techniques for pressure vessel design problem

The optimal results of QPSO-CD are compared with the best SP-QPSO, QPSO and PSO results reported in previous work in Table 8. The best solution obtained by QPSO-CD is better than those of the other algorithms, as shown in Fig. 13.

VIII Conclusion

In this paper, a new hybrid quantum particle swarm optimization algorithm with a natural selection method and the Cauchy distribution is proposed. The performance of the proposed algorithm is evaluated on four benchmark functions, and the optimal results are compared with existing algorithms. Further, QPSO-CD is applied to solve engineering design problems; its efficiency, with results superior to preceding ones, is demonstrated on three engineering design problems: the three-bar truss, tension/compression spring and pressure vessel. The efficiency of the QPSO-CD algorithm is evaluated by the number of steps needed to reach the optimal region, and its time complexity is shown to be lower than that of classical PSO. In the context of convergence, the experimental outcomes show that QPSO-CD converges to results closer to the superior solution.

Additional information

Competing interests: The authors declare no competing interests.


S.Z. acknowledges support in part from the National Natural Science Foundation of China (Nos. 61602532), the Natural Science Foundation of Guangdong Province of China (No. 2017A030313378), and the Science and Technology Program of Guangzhou City of China (No. 201707010194).


  • (1) N. Bohr, et al., The quantum postulate and the recent development of atomic theory, Vol. 3, Printed in Great Britain by R. & R. Clarke, Limited, 1928.
  • (2) H. P. Robertson, The uncertainty principle, Physical Review 34 (1) (1929) 163.
  • (3) L. Wessels, Schrödinger’s route to wave mechanics, Studies in History and Philosophy of Science Part A 10 (4) (1979) 311–340.
  • (4) R. P. Feynman, Simulating physics with computers, International journal of theoretical physics 21 (6-7) (1982) 467–488.
  • (5) J. Wang, Handbook of Finite State Based Models and Applications, CRC press, 2012.
  • (6) M. Dorigo, G. Di Caro, Ant colony optimization: a new meta-heuristic, in: Proceedings of the 1999 Congress on Evolutionary Computation (CEC99), Vol. 2, IEEE, 1999, pp. 1470–1477.
  • (7) A. Colorni, M. Dorigo, V. Maniezzo, et al., Distributed optimization by ant colonies, in: Proceedings of the first European conference on artificial life, Vol. 142, Cambridge, MA, 1992, pp. 134–142.
  • (8) J. Kennedy, R. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 1995, pp. 1942–1948.
  • (9) S.-C. Wang, M.-F. Yeh, A modified particle swarm optimization for aggregate production planning, Expert Systems with Applications 41 (6) (2014) 3069–3077.
  • (10) M. R. AlRashidi, M. E. El-Hawary, A survey of particle swarm optimization applications in electric power systems, IEEE transactions on evolutionary computation 13 (4) (2009) 913–918.
  • (11) A. R. Yıldız, A novel particle swarm optimization approach for product design and manufacturing, The International Journal of Advanced Manufacturing Technology 40 (5-6) (2009) 617.
  • (12) N. A. Latiff, C. C. Tsimenidis, B. S. Sharif, Energy-aware clustering for wireless sensor networks using particle swarm optimization, in: 2007 IEEE 18th International Symposium on Personal, Indoor and Mobile Radio Communications, IEEE, 2007, pp. 1–5.
  • (13) S.-W. Lin, K.-C. Ying, S.-C. Chen, Z.-J. Lee, Particle swarm optimization for parameter determination and feature selection of support vector machines, Expert Systems with Applications 35 (4) (2008) 1817–1824.
  • (14) Y. Wang, S. D. Mohanty, Particle swarm optimization and gravitational wave data analysis: Performance on a binary inspiral testbed, Physical Review D 81 (6) (2010) 063002.
  • (15) M. E. Normandin, S. D. Mohanty, T. S. Weerathunga, Particle swarm optimization based search for gravitational waves from compact binary coalescences: Performance improvements, Physical Review D 98 (4) (2018) 044029.
  • (16) J. Sun, B. Feng, W. Xu, Particle swarm optimization with particles having quantum behavior, in: Proceedings of the 2004 congress on evolutionary computation (IEEE Cat. No. 04TH8753), Vol. 1, IEEE, 2004, pp. 325–331.
  • (17) A. Narayanan, M. Moore, Quantum-inspired genetic algorithms, in: Proceedings of IEEE international conference on evolutionary computation, IEEE, 1996, pp. 61–66.
  • (18) M. Yuanyuan, L. Xiyu, Quantum inspired evolutionary algorithm for community detection in complex networks, Physics Letters A 382 (34) (2018) 2305–2312.
  • (19) J. Liu, W. Xu, J. Sun, Quantum-behaved particle swarm optimization with mutation operator, in: 17th IEEE International Conference on Tools with Artificial Intelligence (ICTAI'05), IEEE, 2005, 4 pp.
  • (20) V. Protopopescu, J. Barhen, Solving a class of continuous global optimization problems using quantum algorithms, Physics Letters A 296 (1) (2002) 9–14.
  • (21) A. S. Bhatia, M. K. Saggi, A. Kumar, S. Jain, Matrix product state–based quantum classifier, Neural computation 31 (7) (2019) 1499–1517.
  • (22) A. S. Bhatia, M. K. Saggi, Implementing entangled states on a quantum computer, arXiv preprint: 1811.09833 (2018).
  • (23) J. Sun, W. Xu, B. Feng, A global search strategy of quantum-behaved particle swarm optimization, in: IEEE Conference on Cybernetics and Intelligent Systems, 2004., Vol. 1, IEEE, 2004, pp. 111–116.
  • (24) J. Sun, J. Liu, W. Xu, Using quantum-behaved particle swarm optimization algorithm to solve non-linear programming problems, International Journal of Computer Mathematics 84 (2) (2007) 261–272.
  • (25) J. Sun, W. Fang, V. Palade, X. Wu, W. Xu, Quantum-behaved particle swarm optimization with gaussian distributed local attractor point, Applied Mathematics and Computation 218 (7) (2011) 3763–3775.
  • (26) L. dos Santos Coelho, Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems, Expert Systems with Applications 37 (2) (2010) 1676–1683.
  • (27) Y. Li, R. Xiang, L. Jiao, R. Liu, An improved cooperative quantum-behaved particle swarm optimization, Soft Computing 16 (6) (2012) 1061–1069.
  • (28) Y. Peng, Y. Xiang, Y. Zhong, Quantum-behaved particle swarm optimization algorithm with Lévy mutated global best position, in: 2013 Fourth International Conference on Intelligent Control and Information Processing (ICICIP), IEEE, 2013, pp. 529–534.
  • (29) H. Ali, W. Shahzad, F. A. Khan, Energy-efficient clustering in mobile ad-hoc networks using multi-objective particle swarm optimization, Applied Soft Computing 12 (7) (2012) 1913–1928.
  • (30) A. S. Bhatia, R. K. Cheema, Analysing and implementing the mobility over MANETs using random waypoint model, International Journal of Computer Applications 68 (17) (2013) 32–36.
  • (31) Z. Zhisheng, Quantum-behaved particle swarm optimization algorithm for economic load dispatch of power system, Expert Systems with Applications 37 (2) (2010) 1800–1803.
  • (32) J. Sun, J. Liu, W. Xu, QPSO-based QoS multicast routing algorithm, in: Asia-Pacific Conference on Simulated Evolution and Learning, Springer, 2006, pp. 261–268.
  • (33) M. Geis, M. Middendorf, Particle swarm optimization for finding RNA secondary structures, International Journal of Intelligent Computing and Cybernetics (2011).
  • (34) A. S. Bhatia, A. Kumar, Modeling of RNA secondary structures using two-way quantum finite automata, Chaos, Solitons & Fractals 116 (2018) 332–339.
  • (35) A. Bagheri, H. M. Peyhani, M. Akbari, Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization, Expert Systems with Applications 41 (14) (2014) 6235–6250.
  • (36) E. Davoodi, M. T. Hagh, S. G. Zadeh, A hybrid improved quantum-behaved particle swarm optimization–simplex method (IQPSOS) to solve power system load flow problems, Applied Soft Computing 21 (2014) 171–179.
  • (37) S. Omkar, R. Khandelwal, T. Ananth, G. N. Naik, S. Gopalakrishnan, Quantum behaved particle swarm optimization (QPSO) for multi-objective design optimization of composite structures, Expert Systems with Applications 36 (8) (2009) 11312–11322.
  • (38) D. Fatemeh, C. Loo, G. Kanagaraj, Shuffled complex evolution based quantum particle swarm optimization algorithm for mechanical design optimization problems, Journal of Modern Manufacturing Systems and Technology 2 (1) (2019) 23–32.
  • (39) S. Prithi, S. Sumathi, LD2FA-PSO: a novel learning dynamic deterministic finite automata with PSO algorithm for secured energy efficient routing in wireless sensor network, Ad Hoc Networks 97 (2020) 102024.
  • (40) A. Ambainis, R. Freivalds, 1-way quantum finite automata: strengths, weaknesses and generalizations, in: Proceedings 39th Annual Symposium on Foundations of Computer Science (Cat. No. 98CB36280), IEEE, 1998, pp. 332–341.
  • (41) A. S. Bhatia, A. Kumar, Quantum finite automata: survey, status and research directions, arXiv preprint:1901.07992 (2019).
  • (42) A. S. Bhatia, A. Kumar, On the power of two-way multihead quantum finite automata, RAIRO-Theoretical Informatics and Applications 53 (1-2) (2019) 19–35.
  • (43) J. Kennedy, Particle swarm optimization, in: Encyclopedia of Machine Learning, Springer, 2010, pp. 760–766.
  • (44) Y. Shi, et al., Particle swarm optimization: developments, applications and resources, in: Proceedings of the 2001 congress on evolutionary computation (IEEE Cat. No. 01TH8546), Vol. 1, IEEE, 2001, pp. 81–86.
  • (45) G. Rudolph, Local convergence rates of simple evolutionary algorithms with Cauchy mutations, IEEE Transactions on Evolutionary Computation 1 (4) (1997) 249–258.
  • (46) B. Kannan, S. N. Kramer, An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design, Journal of Mechanical Design 116 (2) (1994) 405–411.
  • (47) E. Sandgren, Nonlinear integer and discrete programming in mechanical design optimization, Journal of Mechanical Design 112 (2) (1990) 223–229.