1 Introduction
Optimization problems in the real world often contain different types of constraints. A constrained optimization problem (COP) can be formulated in the following mathematical form:
(1) min f(x), x = (x_1, ..., x_n) ∈ S,
(2) subject to g_j(x) ≤ 0, j = 1, ..., q, and h_j(x) = 0, j = q+1, ..., m,
where S is a bounded domain in R^n, given by S = {x | L_i ≤ x_i ≤ U_i, i = 1, ..., n}; L_i and U_i denote the lower and upper boundaries, respectively. g_j(x) is the j-th inequality constraint, while h_j(x) is the j-th equality constraint.
There exist a variety of evolutionary algorithms (EAs) for solving COPs, which employ different constraint-handling techniques, such as the penalty function method, the feasibility rule, the repair method and multi-objective optimization michalewicz1996evolutionary ; coello2002theoretical ; mezura2011constraint . This paper focuses on the multi-objective optimization method segura2016using . Its idea is to convert a single-objective COP into an unconstrained multi-objective optimization problem (MOP). The converted MOP is often a two-objective optimization problem surry1997comoga in which one objective is the original objective function and the other is the degree function of violating the constraints zhou2003multi :
(3) min (f(x), G(x)),
where f(x) is the original objective function and G(x) is the degree of constraint violation. G(x) is defined as the sum of constraint violation degrees:
(4) G(x) = Σ_{j=1}^{q} G_j(x) + Σ_{j=q+1}^{m} G_j(x).
The first part of the formula is the sum of the degrees of violating the inequality constraints, given by
(5) G_j(x) = max{0, g_j(x)}, j = 1, ..., q.
The second part is the sum of the degrees of violating the equality constraints, given by
(6) G_j(x) = max{0, |h_j(x)| − δ}, j = q+1, ..., m,
where δ is a tolerance allowed for the equality constraints.
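As an illustration of Eqs. (4)-(6), the degree of constraint violation can be computed as follows (a minimal Python sketch; the two example constraints are hypothetical, not taken from any benchmark):

```python
# Degree of constraint violation for a COP, following Eqs. (4)-(6).
# g_list: inequality constraints g_j(x) <= 0; h_list: equality constraints h_j(x) = 0.
def violation_degree(x, g_list, h_list, delta=1e-4):
    # Inequality part: max(0, g_j(x)), Eq. (5)
    g_part = sum(max(0.0, g(x)) for g in g_list)
    # Equality part: max(0, |h_j(x)| - delta), Eq. (6)
    h_part = sum(max(0.0, abs(h(x)) - delta) for h in h_list)
    return g_part + h_part  # their sum is G(x), Eq. (4)

# Hypothetical example: g(x) = x0 + x1 - 1 <= 0 and h(x) = x0 - x1 = 0.
# At x = (0.8, 0.8) the equality holds but the inequality is violated by 0.6.
G = violation_degree([0.8, 0.8],
                     [lambda x: x[0] + x[1] - 1.0],
                     [lambda x: x[0] - x[1]])
```

A feasible point yields G(x) = 0, so minimizing the second objective drives the search toward the feasible region.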
The idea of applying multi-objective evolutionary algorithms (MOEAs) to COPs has attracted researchers' interest in the last two decades. Surry and Radcliffe surry1997comoga proposed constrained optimization by multi-objective genetic algorithms. They considered a COP from a dual perspective: as a constraint satisfaction problem and as an unconstrained optimization problem. Coello coello2000constraint introduced the concept of non-dominance into the fitness function of a genetic algorithm to handle constraints. Feasible individuals are ranked higher than infeasible ones, while infeasible individuals with a lower degree of constraint violation are ranked higher than those with a higher degree. Zhou et al. zhou2003multi converted a COP into a two-objective optimization model: the original objective function and the degree function of violating the constraints. They then designed a real-coded genetic algorithm based on Pareto strength and the Minimal Generation Gap model. Venkatraman and Yen venkatraman2005generic proposed a two-phase genetic algorithm framework for solving COPs. In the first phase, a COP is treated as a constraint satisfaction problem. In the second phase, it is treated as a bi-objective optimization problem that simultaneously optimizes the objective function and the satisfaction of the constraints; the Non-dominated Sorting Genetic Algorithm (NSGA-II) is then used. Cai and Wang cai2006multiobjective ; wang2012combining combined multi-objective optimization with differential evolution (CMODE) to solve COPs based on the two-objective model. The search is guided by an infeasible-solution archiving and replacement mechanism. Furthermore, they provided a dynamic hybrid framework wang2012dynamic , which consists of global search and local search models. More recently, Gao and Yen gao2015dual considered COPs as a bi-objective optimization problem, where the first objective is the reward function or actual cost to be optimized, while the second objective is the degree of constraint violation. Gao et al. gao2015multi proposed a reverse comparison strategy based on the multi-objective dominance concept. That strategy converted the original COPs into MOPs with one constraint, and weeds out worse solutions with smaller fitness values regardless of their constraint violations. Xu et al. xu2017new considered a new MOP composed of the objective function, the sum of the degrees of constraint violation, and the weighted sums of the normalized objective function and the normalized degrees of constraint violation.
Among MOEAs for solving COPs, CMODE cai2006multiobjective ; wang2012combining is one of the most efficient methods. This paper aims to improve its performance. The main novelty of this paper is to construct a new search operator based on principal component analysis (PCA) and replace the normal crossover used in CMODE wang2012combining . As a result, a PCA-based multi-objective optimization differential evolution algorithm (PMODE) is proposed. In order to evaluate the performance of the new algorithm, twenty-four test functions are used in a comparative experiment. Experimental results indicate that PMODE achieves an overall superior performance compared to CMODE wang2012combining .
The remainder of this paper is organized as follows. Section 2 introduces related work on differential evolution (DE), CMODE and PCA's applications in EAs. Section 3 explains the proposed work in detail. Section 4 gives experimental results and a performance comparison. Section 5 concludes this paper.
2 Background
Our work is built upon three aspects: classical DE storn1997differential , CMODE wang2012combining and applications of PCA in EAs munteanu1999improving . This section reviews them one by one.
2.1 Classical Differential Evolution
DE is a popular EA for solving continuous optimization problems storn1997differential . In DE, a population is represented by NP n-dimensional vectors:
(7) P_t = {x_1^t, ..., x_NP^t},
(8) x_i^t = (x_{i,1}^t, ..., x_{i,n}^t), i = 1, ..., NP,
where t represents the generation counter and NP is the population size. The initial individuals are chosen randomly from S. An initial individual is generated at random as follows:
(9) x_{i,j}^0 = L_j + rand(0, 1) · (U_j − L_j), j = 1, ..., n,
where rand(0, 1) is a random number uniformly distributed in [0, 1].
The DE algorithm consists of three operations: mutation, crossover and selection, which are described as follows storn1997differential ; xu2017new .
 DE Mutation:

for each individual x_i^t, a mutant vector is generated by
(10) v_i^{t+1} = x_{r1}^t + F · (x_{r2}^t − x_{r3}^t),
where the random indexes r1, r2, r3 ∈ {1, ..., NP} are mutually different integers. They are also chosen to be different from the running index i. F is a real and constant factor from [0, 2] which controls the amplification of the differential variation (x_{r2}^t − x_{r3}^t). In case v_i^{t+1} falls outside the interval [L, U], the mutation operation is repeated until v_i^{t+1} falls in [L, U].
 DE Crossover:

in order to increase population diversity, crossover is also used in DE. The trial vector u_i^{t+1} is generated by mixing the target vector x_i^t with the mutant vector v_i^{t+1}:
(11) u_{i,j}^{t+1} = v_{i,j}^{t+1} if rand_j(0, 1) ≤ CR or j = j_rand, otherwise u_{i,j}^{t+1} = x_{i,j}^t,
where rand_j(0, 1) is a uniform random number from [0, 1]. The index j_rand is randomly chosen from {1, ..., n}. CR denotes the crossover constant which has to be determined by the user. In addition, the condition "j = j_rand" is used to ensure that the trial vector gets at least one parameter from the mutant vector v_i^{t+1}.
 DE Selection:

a greedy criterion is used to decide whether the offspring generated by mutation and crossover should replace its parent. The trial vector u_i^{t+1} is compared to the target vector x_i^t, and the better one is kept for the next generation.
There exist several variants of the DE algorithm. The DE used in our study is DE/rand/1/bin storn1999system , which is illustrated below.
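The three operations above can be assembled into a compact DE/rand/1/bin sketch (pure Python; the sphere objective, bounds and parameter values are illustrative assumptions, and out-of-bound mutants are clipped here rather than resampled as in the text):

```python
import random

def de_rand_1_bin(f, bounds, NP=20, F=0.5, CR=0.9, gens=200, seed=0):
    rng = random.Random(seed)
    n = len(bounds)
    # Random initialization within [L, U], Eq. (9)
    pop = [[L + rng.random() * (U - L) for (L, U) in bounds] for _ in range(NP)]
    for _ in range(gens):
        for i in range(NP):
            # Mutation: r1, r2, r3 mutually different and different from i, Eq. (10)
            r1, r2, r3 = rng.sample([j for j in range(NP) if j != i], 3)
            v = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d]) for d in range(n)]
            # Simple box repair (the paper repeats mutation instead)
            v = [min(max(v[d], bounds[d][0]), bounds[d][1]) for d in range(n)]
            # Binomial crossover, Eq. (11)
            jrand = rng.randrange(n)
            u = [v[d] if (rng.random() <= CR or d == jrand) else pop[i][d]
                 for d in range(n)]
            # Greedy selection: keep the better of trial and target
            if f(u) <= f(pop[i]):
                pop[i] = u
    return min(pop, key=f)

sphere = lambda x: sum(xi * xi for xi in x)       # illustrative unconstrained objective
best = de_rand_1_bin(sphere, [(-5.0, 5.0)] * 3)   # converges near the origin
```

The greedy selection guarantees the best-so-far fitness never worsens, which is why DE needs no separate elitism step.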
2.2 Multi-objective Optimization Differential Evolution for COPs
Given the MOP converted from a COP,
(12) min (f(x), G(x)).
Although normal MOEAs can be applied to solve the above MOP, they are not very efficient, because the target of a COP is not a Pareto front but only a single point or several points. Therefore, problem-specific MOEAs seem more efficient for solving COPs. Among those problem-specific MOEAs, CMODE, designed by Wang and Cai wang2012combining , is one of the most efficient. The procedure of CMODE is described below.
The algorithm is explained step by step in the following. At the beginning, an initial population is chosen at random, where all initial vectors are chosen randomly from S.
At each generation, the parent population is split into two groups: one group of parent individuals that are used for DE operations and another group of individuals that are not involved in DE operations. DE operations are applied to the selected parents and generate a set of children.
Selection is based on the dominance relation. First, non-dominated individuals are identified from the children population. Then these individuals replace the dominated individuals in the parent population (if any exist). The updated individuals are merged with the parent individuals involved in the DE operations, and the next parent population is formed. The procedure repeats until the maximum number of evaluations is reached. The output is the best solution found.
The infeasible-solution replacement mechanism works as follows: provided that a children population is composed of only infeasible individuals, the "best" child, i.e., the one with the lowest degree of constraint violation, is stored in an archive. After a fixed interval of generations, some randomly selected infeasible individuals in the archive replace the same number of randomly selected individuals in the parent population.
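A simplified sketch of this mechanism (Python; the function names and data layout are our assumptions, not taken from the CMODE paper):

```python
import random

def update_archive(archive, offspring, violation):
    # If every child is infeasible (violation > 0), archive the child
    # with the lowest degree of constraint violation.
    if offspring and all(violation(c) > 0 for c in offspring):
        archive.append(min(offspring, key=violation))

def replace_from_archive(population, archive, k, rng=random):
    # Called every fixed number of generations: k randomly chosen archived
    # individuals replace k randomly chosen parents.
    k = min(k, len(archive), len(population))
    picks = rng.sample(archive, k)
    for idx, ind in zip(rng.sample(range(len(population)), k), picks):
        population[idx] = ind
    return population
```

Here individuals are placeholders (any comparable objects) and `violation` plays the role of G(x); the mechanism keeps promising infeasible material in the population so the search can approach the feasible region from outside.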
2.3 Application of Principal Component Analysis in Evolutionary Algorithms
PCA is a well-known statistical method widely used in data analysis barber2012bayesian . Its main goal is to compress high-dimensional data into a lower-dimensional space. It is an interesting idea to apply PCA to the design of EAs, but so far only a few research papers can be found on this topic. Munteanu and Lazarescu munteanu1999improving designed a mutation operator based on PCA, which acts on a projection search space generated by PCA rather than on the original space. They claimed that a PCA-mutation genetic algorithm (GA) is more successful in maintaining population diversity during search. Their experimental results show that a GA with the PCA-mutation obtained better solutions than GAs with classical mutation operators on a filter design problem.
Their PCA-mutation is described as follows. A population with N individuals is represented by an n × N matrix, where n is the space dimension and N the population size. Each column x_i is an individual:
(13) X = [x_1, x_2, ..., x_N].
Given the covariance matrix C of X, compute its eigenvectors w_1, ..., w_n and sort them in decreasing order of their corresponding eigenvalues. Form an n × n matrix
(14) W = [w_1, ..., w_n].
The population is projected into the space spanned by the eigenvectors,
(15) Y = W^T X,
mutation is applied in the projection space,
(16) Y' = mutate(Y),
and the mutated points are mapped back into the original space,
(17) X' = W Y'.
Notice that the above PCA-mutation doesn't reduce the data set to a lower-dimensional space; instead, X and Y have the same dimension. The PCA-mutation aims to conduct mutation in the projection space rather than the original space, but the dimensions of the projection space and the original space are the same.
PCA is also used to improve the efficiency of particle swarm optimization (PSO) zhao2014enhanced . The search direction in PSO is a linear combination of a particle's present status, its historical best experience and the swarm's best experience, but this strategy is inefficient when searching a complex space. A new PCA-based search mechanism (PCA-PSO) is therefore proposed in zhao2014enhanced , in which PCA is mainly used to efficiently mine population information for promising principal component directions, on which a local search strategy is then applied. Their experimental results show that PCA-PSO outperforms some PSO variants and is competitive with other state-of-the-art algorithms.
3 PCA-based Multi-objective Optimization Differential Evolution
The performance of an EA is linked to whether its search operators work efficiently on a fitness landscape. In this section we design a new PCA-projection operator for searching valley landscapes and then propose a new PCA-based multi-objective optimization differential evolution (PMODE) for COPs.
3.1 Analysis of Principal Components and the Valley Direction
Although the PCA-mutation operator proposed in munteanu1999improving was efficient for a filter design problem, it has one disadvantage: it still acts on a space with the same dimension as the original search space. Thus, as the population size increases, the calculation of eigenvalues and eigenvectors in PCA becomes more and more expensive. In this paper, we propose a simple PCA search operator in which PCA is only applied to several selected points. The research question is how to select points from a population for implementing PCA. The solution relies on the valley concept.
In the 3-dimensional space on Earth, a valley intuitively means a low area between two hills or mountains. However, this definition is rather fuzzy. What does a valley in a higher-dimensional space mean? How can the location of a valley be identified? So far there exists no clear mathematical definition of the valley. In this paper, we study the valley landscape using PCA and find that PCA provides a statistical method of identifying the valley direction.
Let's explain our idea using the well-known Rosenbrock function:
(18) f(x_1, x_2) = 100 (x_2 − x_1^2)^2 + (1 − x_1)^2.
Its minimum point is at (1, 1) with f = 0. Fig. 1(a) shows the contour graph of the Rosenbrock function. From Fig. 1(a), it is obvious that a deep valley exists on this landscape. But how can the valley be identified?
In the following we show a statistical method of calculating the valley direction. First we sample 20 points at random and select the 6 points with the smallest function values from the population. Fig. 1(b) depicts that these 6 points (labeled by squared points) are closer to the valley than the other points.
Next we identify the valley direction. Since the selected 6 points are distributed along the valley, the valley direction can be regarded as the direction along which the variance of the 6 points is maximal. This direction can be identified by PCA. Assuming that the valley is approximately a straight line, it can in fact be approximated by the first principal component found by PCA. Let's project the 6 selected points onto the first principal component. Fig. 1(c) shows that the projected points (labeled by dotted points) approximately represent the valley direction.
But it should be pointed out that if we apply PCA to the whole population and project all points onto the first principal component, we cannot obtain the valley direction. Fig. 1(d) shows that the mapped points (labeled by dotted points) don't distribute along the valley direction. The mapped points could represent any direction because the 20 points are generated at random.
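The experiment above can be reproduced numerically (a pure-Python sketch; the power-iteration helper and the sampling box are our assumptions, while the 20 sampled points and 6 selected points follow the text):

```python
import random, math

def rosenbrock(x1, x2):
    # 2-D Rosenbrock function, Eq. (18)
    return 100.0 * (x2 - x1 * x1) ** 2 + (1.0 - x1) ** 2

def first_pc(points):
    # First principal component of 2-D points via power iteration
    # on the sample covariance matrix.
    m = len(points)
    mx = sum(p[0] for p in points) / m
    my = sum(p[1] for p in points) / m
    cxx = sum((p[0] - mx) ** 2 for p in points) / m
    cyy = sum((p[1] - my) ** 2 for p in points) / m
    cxy = sum((p[0] - mx) * (p[1] - my) for p in points) / m
    w = (1.0, 0.0)
    for _ in range(100):
        w = (cxx * w[0] + cxy * w[1], cxy * w[0] + cyy * w[1])
        norm = math.hypot(w[0], w[1])
        w = (w[0] / norm, w[1] / norm)
    return w

rng = random.Random(1)
sample = [(rng.uniform(-2, 2), rng.uniform(-1, 3)) for _ in range(20)]
# Keep the 6 points with the smallest function values ...
best6 = sorted(sample, key=lambda p: rosenbrock(*p))[:6]
# ... whose first principal component lies approximately along the valley.
direction = first_pc(best6)
```

Running `first_pc` on all 20 points instead of `best6` illustrates the caveat in the text: the resulting direction is essentially arbitrary because the full sample is uniform.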
3.2 Proposed PCA Projection
Based on the discovery in the above subsection, we design a new PCA search operator. Here is our idea: given a population, we select a group of points with smaller function values from the population; apply PCA barber2012bayesian to calculate the principal components; then project the points onto the principal components; finally, reconstruct the projected points in the original search space and take these points as the children. The procedure is described in detail as follows:
 PCA-projection:

Given a population and a fitness function f,
1: Select M individuals with smaller fitness values from the population (for PMODE in the next subsection, select individuals from the best half of the population). Denote these individuals by x_1, ..., x_M.
2: Calculate the mean vector μ and covariance matrix C:
(19) μ = (1/M) Σ_{i=1}^{M} x_i,  C = (1/M) Σ_{i=1}^{M} (x_i − μ)(x_i − μ)^T.
3: Calculate the eigenvectors w_1, ..., w_n of the covariance matrix C, sorted so that the eigenvalue of w_i is larger than that of w_{i+1} for i = 1, ..., n − 1. Form a matrix W = [w_1, ..., w_k], where k ≤ n. For PMODE in the next subsection, k = 1, that is, only the first principal component is used.
4: Project x_i onto the lower-dimensional space:
(20) y_i = W^T (x_i − μ).
5: Reconstruct the projected point in the original space:
(21) x_i' = W y_i + μ.
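The five steps above can be sketched as follows for k = 1 (a minimal pure-Python illustration; the power-iteration routine for the leading eigenvector is our simplification, not the authors' implementation):

```python
import math

def pca_projection(points, fitness, M):
    # Step 1: keep the M individuals with the smallest fitness values.
    sel = sorted(points, key=fitness)[:M]
    n = len(sel[0])
    # Step 2: mean vector and covariance matrix, Eq. (19).
    mu = [sum(p[d] for p in sel) / M for d in range(n)]
    C = [[sum((p[a] - mu[a]) * (p[b] - mu[b]) for p in sel) / M
          for b in range(n)] for a in range(n)]
    # Step 3: leading eigenvector of C via power iteration (k = 1).
    w = [1.0] * n
    for _ in range(200):
        w = [sum(C[a][b] * w[b] for b in range(n)) for a in range(n)]
        norm = math.sqrt(sum(c * c for c in w)) or 1.0
        w = [c / norm for c in w]
    children = []
    for p in sel:
        # Step 4: project onto the first principal component, Eq. (20).
        y = sum(w[d] * (p[d] - mu[d]) for d in range(n))
        # Step 5: reconstruct in the original space, Eq. (21).
        children.append([mu[d] + y * w[d] for d in range(n)])
    return children
```

In practice a library routine such as `numpy.linalg.eigh` would replace the power iteration; the sketch keeps everything in the standard library. Note that points already lying on a line are reproduced exactly, since the projection loses nothing in that case.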
We call the search operator PCA-projection, rather than PCA-mutation munteanu1999improving , because there is no mutation step as in PCA-mutation munteanu1999improving .
Compared with the PCA-mutation in munteanu1999improving , the PCA-projection has three new features:

The computation of our PCA-projection is much lighter than that of the PCA-mutation in munteanu1999improving , because our PCA-projection is applied only to a few selected good points from the population; in PMODE this is a small number of individuals.

The PCA-projection has an intuitive explanation: on a valley landscape, it can project an individual to a new position along the valley direction.

It also takes advantage of compressing higher-dimensional data into a lower-dimensional space. For example, in PMODE the projected space is 1-dimensional (the first principal component). This probably makes the search faster.
3.3 PCA-based Multi-objective Optimization Differential Evolution
With the proposed PCA-projection, PMODE was developed based on the framework of CMODE described in Section 2.2. Although the structure of PMODE is similar to that of CMODE, they are two essentially different EAs. PMODE employs DE-mutation and PCA-projection but no crossover, while CMODE uses DE-mutation and DE-crossover. The pseudocode of PMODE is shown below:
Steps 1-4 are initialization steps. At the beginning, an initial population is chosen at random, where all initial vectors are chosen randomly from S.
Steps 5-16 evolve the population. At each generation, the parent population is split into two groups: one group of parent individuals that are used for DE mutation and PCA-projection, while the other group of individuals is not involved in these operations. DE mutation and PCA-projection are applied to the selected parents and then generate children. The PCA-projection, whose input matrix is formed from the selected individuals, is implemented with a very small probability. Since the probability of applying PCA-projection is very small (0.004 in our experiments), this operation doesn't add much computation. On the other hand, DE-crossover is removed from PMODE, so the search is mainly determined by DE-mutation plus PCA-projection. This makes the search operators in PMODE essentially different from those in CMODE. Selection is based on the dominance relation, which is the same as in CMODE.
4 Experimental Study
4.1 Experimental settings
In order to evaluate the performance of PMODE, 24 benchmark functions are used in our experiments. These benchmark functions were provided by the Special Session and Competition on Constrained Real-Parameter Optimization at the 2006 IEEE Congress on Evolutionary Computation (hereafter abbreviated as CEC 2006 Competition); see http://www.ntu.edu.sg/home/EPNSugan/index_files/CEC06/CEC06.htm, accessed on March 16, 2018. There are 5 types of functions, namely quadratic, polynomial, cubic, linear, and nonlinear. Table 1 describes the details of these benchmark test functions: n is the number of decision variables, ρ is the estimated ratio between the feasible region and the search space, and f(x*) is the objective function value of the best known solution.
Function  n  Type  ρ  f(x*) 
g01  13  Quadratic  0.0111%  -15.00000000 
g02  20  Nonlinear  99.9971%  -0.8036191041 
g03  10  Polynomial  0.0000%  -1.0005001000 
g04  5  Quadratic  51.1230%  -30665.5386717833 
g05  4  Cubic  0.0000%  5126.4967140071 
g06  2  Cubic  0.0066%  -6961.8138755802 
g07  10  Quadratic  0.0003%  24.3062090682 
g08  2  Nonlinear  0.8560%  -0.0958250414 
g09  7  Polynomial  0.5121%  680.6300573744 
g10  8  Linear  0.0010%  7049.2480205287 
g11  2  Quadratic  0.0000%  0.7499000000 
g12  3  Quadratic  4.7713%  -1.0000000000 
g13  5  Nonlinear  0.0000%  0.0539415140 
g14  10  Nonlinear  0.0000%  -47.7648884595 
g15  3  Quadratic  0.0000%  961.7150222900 
g16  5  Nonlinear  0.0204%  -1.9051552585 
g17  6  Nonlinear  0.0000%  8853.5338748065 
g18  9  Quadratic  0.0000%  -0.8660254038 
g19  15  Nonlinear  33.4761%  32.6555929502 
g20  24  Linear  0.0000%  0.2049794002 
g21  7  Linear  0.0000%  193.7245100697 
g22  22  Linear  0.0000%  236.4309755040 
g23  9  Linear  0.0000%  -400.055100000 
g24  2  Linear  79.6556%  -5.5080132716 
It must be mentioned that an improved solution for the test function g17 is used in the above table, which is slightly better than that in the CEC 2006 Competition. The best known solution in the competition for g17 is x* = (201.784467214524, 100, 383.071034852773, 420, −10.907658451429, 0.073148231208) with f(x*) = 8853.53967480648. The improved solution used in this paper for g17, listed in Table 1, is x* = (201.784462493550, 100, 383.071034852773, 420, −10.907665625756, 0.073148231208) with f(x*) = 8853.533874806484.
There are mainly five parameters in the design of PMODE: the population size, the scaling factor F, the PCA-projection probability, and the two parameters for the set Q. The PCA-projection probability is set to 0.004. The values of the other parameters follow the settings in wang2012combining : the population size is set to 180 and F is randomly chosen between 0.5 and 0.6; the parameters for the set Q also follow wang2012combining .
For each algorithm, 25 independent runs were implemented for each benchmark test function within a maximum number of fitness evaluations (FES). The tolerance value for the equality constraints was set as suggested by the CEC 2006 Competition. Also as suggested by the CEC 2006 Competition, the best, median, worst, mean, and standard deviation of the error value f(x) − f(x*) of the best-so-far solution at three FES checkpoints in each run are recorded in Tables 2-3. The numbers in the parentheses after the error values of the best, median, and worst solutions represent the number of unsatisfied constraints at the best, median, and worst solutions, respectively.
4.2 General Performance of the Proposed Algorithm
As shown in Tables 2-3, feasible solutions can always be found for 12 of the 24 benchmark functions, namely g01, g02, g04, g06, g07, g08, g09, g10, g12, g16, g19 and g24, at the first FES checkpoint. At the second checkpoint, feasible solutions can be found in every run for all benchmark functions apart from g20 and g22. These two functions are very difficult for PMODE to solve because the solutions are still far from the feasible region even at the final checkpoint; within the maximum FES, however, feasible solutions can be consistently found for all of the other benchmark functions. Additionally, solutions very close or equal to the best known solutions can be found for g01, g08, g10, g11, g12, g14, g16, g18, g19 and g24 in all runs, and solutions even better than the best known ones (shown as negative error values) can always be found for g03, g04, g05, g06, g07, g09, g13, g15, g17 and g23. For the remaining two benchmark functions, g02 and g21, the best known solutions can also be reached in most runs.
FES  g01  g02  g03  g04  
Best  3.7476E+00 (0)  3.9521E01 (0)  8.1970E01 (0)  8.0248E+01 (0)  
Median  7.1483E+00 (0)  4.6885E01 (0)  9.9764E01 (0)  1.3409E+02 (0)  
Worst  8.3093E+00 (0)  5.8431E01 (0)  1.0004E+00 (1)  2.1045E+02 (0)  
Mean  6.5899E+00  4.7402E01  9.6788E01  1.4122E+02  
Std  1.4271E+00  5.6480E02  6.0624E02  3.6330E+01  
Best  8.7244E02 (0)  1.4509E01 (0)  2.6009E05 (0)  5.3865E03 (0)  
Median  2.1775E01 (0)  2.4452E01 (0)  3.6454E04 (0)  1.9360E02 (0)  
Worst  6.5961E01 (0)  2.9740E01 (0)  2.8050E03 (0)  6.3210E02 (0)  
Mean  2.5840E01  2.4635E01  4.3464E04  2.3514E02  
Std  1.3474E01  3.2326E02  5.4310E04  1.5553E02  
Best  0.0000E+00 (0)  1.4432E15 (0)  2.8865E15 (0)  3.6379E12 (0)  
Median  4.6185E14 (0)  1.6653E15 (0)  2.6645E15 (0)  3.6379E12 (0)  
Worst  1.8172E12 (0)  8.7220E03 (0)  2.6645E15 (0)  3.6379E12 (0)  
Mean  1.7209E13  6.7092E04  2.7622E15  3.6379E12  
Std  3.7931E13  2.3701E03  1.1249E16  0.0000E+00  
FES  g05  g06  g07  g08  
Best  1.3619E+01 (0)  9.7204E+00 (0)  4.1846E+01 (0)  7.7709E06 (0)  
Median  7.0744E+00 (2)  3.6650E+01 (0)  6.6091E+01 (0)  2.1193E04 (0)  
Worst  9.6255E+01 (3)  7.5112E+01 (0)  1.1750E+02 (0)  1.0131E03 (0)  
Mean  5.1398E+01  3.8587E+01  7.0290E+01  2.9866E04  
Std  1.1228E+02  1.7128E+01  2.2190E+01  2.8676E04  
Best  6.5827E08 (0)  1.7270E07 (0)  1.1916E01 (0)  1.5668E14 (0)  
Median  2.8887E07 (0)  1.9594E06 (0)  1.7154E01 (0)  1.0883E08 (0)  
Worst  1.0345E06 (0)  1.3193E05 (0)  3.2815E01 (0)  6.2483E07 (0)  
Mean  3.4680E07  2.7423E06  1.8937E01  7.4641E08  
Std  2.0430E07  2.6518E06  4.7690E02  1.3977E07  
Best  1.8189E12 (0)  1.6370E11 (0)  2.3803E13 (0)  2.7755E17 (0)  
Median  1.8189E12 (0)  1.6370E11 (0)  2.2737E13 (0)  4.1633E17 (0)  
Worst  1.8189E12 (0)  1.6370E11 (0)  2.1671E13 (0)  4.1633E17 (0)  
Mean  1.8189E12  1.6370E11  2.2851E13  4.1078E17  
Std  0.0000E+00  0.0000E+00  4.6683E15  2.7755E18  
FES  g09  g10  g11  g12  
Best  2.3934E+01 (0)  4.1409E+03 (0)  2.8214E05 (0)  2.1640E05 (0)  
Median  4.9688E+01 (0)  5.9270E+03 (0)  3.3153E04 (0)  9.2278E05 (0)  
Worst  8.6439E+01 (0)  1.1352E+04 (0)  2.6914E03 (1)  3.9859E04 (0)  
Mean  5.2651E+01  6.3646E+03  3.8302E03  1.2505E04  
Std  1.7102E+01  1.7857E+03  1.2484E02  9.1642E05  
Best  1.8563E04 (0)  7.2400E+00 (0)  9.4873E11 (0)  0.0000E+00 (0)  
Median  7.3393E04 (0)  1.1578E+01 (0)  4.4319E10 (0)  0.0000E+00 (0)  
Worst  2.5274E03 (0)  2.1756E+01 (0)  3.2906E09 (0)  0.0000E+00 (0)  
Mean  8.0712E03  1.2072E+01  7.9941E10  0.0000E+00  
Std  5.5073E04  3.2711E+00  8.2203E10  0.0000E+00  
Best  2.2737E13 (0)  7.2759E12 (0)  0.0000E+00 (0)  0.0000E+00 (0)  
Median  2.2737E13 (0)  7.2759E12 (0)  0.0000E+00 (0)  0.0000E+00 (0)  
Worst  1.1368E13 (0)  7.2759E12 (0)  0.0000E+00 (0)  0.0000E+00 (0)  
Mean  2.0463E13  7.2759E12  0.0000E+00  0.0000E+00  
Std  4.6412E14  0.0000E+00  0.0000E+00  0.0000E+00 
FES  g13  g14  g15  g16  
Best  9.2923E01 (0)  2.0579E+02 (3)  1.1259E02 (0)  5.0879E02 (0)  
Median  7.3286E01 (2)  1.2254E+02 (3)  1.1329E01 (1)  9.8467E02 (0)  
Worst  8.2007E01 (3)  3.7664E+01 (3)  7.5665E01 (2)  2.1830E01 (0)  
Mean  6.6186E01  1.2076E+02  4.7561E01  1.0361E01  
Std  3.2237E01  3.7009E+01  5.5578E01  3.4454E02  
Best  7.1483E09 (0)  1.2165E02 (0)  4.1154E11 (0)  7.9541E07 (0)  
Median  3.8363E08 (0)  5.7221E02 (0)  2.0634E10 (0)  1.3600E06 (0)  
Worst  3.6771E07 (0)  3.0697E01 (0)  1.4682E09 (0)  3.2443E06 (0)  
Mean  7.2797E08  8.8248E02  3.3435E10  1.6191E06  
Std  9.0395E08  7.4330E02  3.4142E10  6.9377E07  
Best  2.4286E16 (0)  1.4210E14 (0)  1.1368E13 (0)  3.7747E15 (0)  
Median  2.2204E16 (0)  1.4210E14 (0)  1.1368E13 (0)  3.7747E15 (0)  
Worst  1.9428E16 (0)  2.1316E14 (0)  1.1368E13 (0)  3.7747E15 (0)  
Mean  2.1954E16  1.4779E14  1.1368E13  3.7747E15  
Std  1.0385E17  1.9674E15  0.0000E+00  0.0000E+00  
FES  g17  g18  g19  g20  
Best  2.1463E+02 (0)  6.7675E01 (0)  1.2932E+02 (0)  1.2534E+01 (14)  
Median  1.0605E+02 (2)  8.7304E01 (2)  3.0020E+02 (0)  1.0197E+01 (16)  
Worst  9.7101E+02 (3)  1.6762E01 (5)  4.0959E+02 (0)  9.3376E+00 (19)  
Mean  1.2074E+02  7.5218E01  2.8502E+02  1.0683E+01  
Std  1.3696E+02  1.8435E01  7.4689E+01  1.8991E+00  
Best  1.0381E03 (0)  1.5288E03 (0)  2.7929E+00 (0)  8.6285E01 (14)  
Median  3.5125E03 (0)  3.5573E03 (0)  4.9162E+00 (0)  1.7833E+00 (16)  
Worst  2.8967E+00 (0)  6.2765E03 (0)  1.0079E+01 (0)  5.4970E01 (19)  
Mean  6.4163E01  3.6941E03  5.3814E+00  8.6565E01  
Std  9.4554E01  1.3026E03  1.7217E+00  4.1305E01  
Best  1.8189E12 (0)  2.2204E16 (0)  5.7661E10 (0)  7.9138E02 (10)  
Median  1.8189E12 (0)  2.2204E16 (0)  3.2166E09 (0)  8.3353E02 (15)  
Worst  1.8189E12 (0)  2.2204E16 (0)  1.3770E09 (0)  7.3211E02 (17)  
Mean  1.8189E12  2.2204E16  3.9057E09  8.4100E02  
Std  8.2871E25  0.0000E+00  3.0780E09  2.5118E02  
FES  g21  g22  g23  g24  
Best  1.4412E+01 (1)  6.4219E+03 (4)  4.6027E+02 (1)  1.7725E03 (0)  
Median  6.0401E+02 (2)  4.1341E+03 (7)  1.6899E+02 (3)  7.4986E03 (0)  
Worst  1.6334E+02 (2)  2.0828E+03 (13)  3.1643E+02 (5)  1.6723E02 (0)  
Mean  1.7658E+02  6.6432E+03  2.1998E+02  7.8850E03  
Std  1.7532E+02  5.5802E+03  4.1690E+02  3.9429E03  
Best  1.4102E02 (0)  2.3478E+02 (6)  9.2163E+00 (0)  7.8120E09 (0)  
Median  4.4718E02 (0)  1.9485E+02 (9)  2.1680E+01 (0)  1.1290E07 (0)  
Worst  1.3106E+02 (0)  2.2276E+02 (13)  4.9589E+01 (0)  5.9857E07 (0)  
Mean  1.3327E+01  2.2896E+02  2.3894E+01  1.3876E07  
Std  3.7031E+01  1.1599E+01  1.0062E+01  1.4024E07  
Best  3.0561E10 (0)  2.3643E+02 (8)  5.6843E13 (0)  3.2862E14 (0)  
Median  2.6631E10 (0)  2.3643E+02 (11)  4.5474E13 (0)  3.2862E14 (0)  
Worst  1.3097E+02 (0)  2.3414E+02 (14)  1.1368E13 (0)  3.2862E14 (0)  
Mean  5.2391E+00  8.0649E+01  3.4560E13  3.2862E14  
Std  2.6195E+01  6.0696E+02  2.0620E13  0.0000E+00 
Table 4 shows the number of FES needed in each successful run, where, as suggested by the CEC 2006 Competition, a run is successful when the error value of the best solution reaches the target accuracy and the solution is feasible. The feasible rate, the success rate, and the success performance are also recorded in Table 4. The feasible rate represents the percentage of runs in which at least one feasible solution is found by PMODE. The success rate denotes the percentage of runs in which PMODE finds a solution that satisfies the success condition. The success performance denotes the mean number of FES over successful runs.
As shown in Table 4, feasible solutions are found with probability 100% for all benchmark functions except g20 and g22, for which no feasible solution has been found. For the success rate, PMODE reaches 100% on all benchmark functions apart from g02, g20, g21 and g22. However, the success rates of g02 and g21 are both over 90%, which means that successful runs arise in a majority of trials for these two test functions. Regarding the success performance, PMODE achieves the target error accuracy level within the FES budget for 22 test functions.
Prob.  Best  Median  Worst  Mean  Std.  Feasible Rate  Success Rate  Success Performance 
g01  134184  165520  224488  166889  19031.01  100%  100%  166889  
g02  141048  179808  216256  179421  20577.24  100%  92%  179421  
g03  44104  53376  66680  53123  5275.58  100%  100%  53123  
g04  70328  76392  84688  76745  3366.5  100%  100%  76745  
g05  23616  26560  28696  26497  1266.67  100%  100%  26497  
g06  29272  38088  42992  37602  2938.38  100%  100%  37602  
g07  117504  123496  135192  123904  3918.02  100%  100%  123904  
g08  3008  5920  9064  5970  1610.98  100%  100%  5970  
g09  51032  58264  64408  57842  3854.65  100%  100%  57842  
g10  133384  137504  148248  138412  3881.14  100%  100%  138412  
g11  2192  5888  8248  5655  1340.43  100%  100%  5655  
g12  1240  4576  7784  4643  1926.71  100%  100%  4643  
g13  22096  28048  40112  29042  4696.05  100%  100%  29042  
g14  82040  92016  100432  91817  5069.44  100%  100%  91817  
g15  10288  11960  12808  11839  585.51  100%  100%  11839  
g16  26512  30760  33176  30615  1829.15  100%  100%  30615  
g17  63976  71024  161608  92195  32845.84  100%  100%  92195  
g18  74048  82024  95560  83586  5588.53  100%  100%  83586  
g19  243360  262936  292600  264423  12521.55  100%  100%  264423  
g20            0%  0%    
g21  88040  90052  237656  101595  42137.78  100%  96%  101595  
g22            0%  0%    
g23  171800  199824  231864  199496  19517.02  100%  100%  199496  
g24  14400  24736  29408  23728  4546.14  100%  100%  23728 
4.3 Experimental comparison of PMODE and CMODE
PMODE is compared with CMODE wang2012combining on the 24 benchmark test functions. 25 independent runs were executed on each test function with the same maximum number of FES.
Table 5 reports the detailed comparative results of PMODE and CMODE on function error values and success performance. Additionally, a one-sample t-test mankiewicz2000story was implemented to verify the difference between the success performance of PMODE and the results of CMODE. The one-sample t-test was not used for the function error values because the sample standard deviation of the function error values of PMODE sometimes equals zero, and the t-test is invalid in that case. In the t-test, the null hypothesis is that the sample mean from the 25 runs of PMODE equals the population mean whose value is taken from wang2012combining . The statistic of the one-sample t-test is given as follows:
(22) t = (x̄ − μ0) / (s / √n),
where x̄ denotes the sample mean from PMODE, s denotes the sample standard deviation, n denotes the sample size, and μ0 is the mean taken from wang2012combining .
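Eq. (22) is straightforward to evaluate (a Python sketch with hypothetical numbers in place of the actual FES samples):

```python
import math, statistics

def one_sample_t(sample, mu0):
    # t = (x_bar - mu0) / (s / sqrt(n)), Eq. (22)
    n = len(sample)
    x_bar = statistics.mean(sample)
    s = statistics.stdev(sample)  # sample standard deviation (n - 1 denominator)
    return (x_bar - mu0) / (s / math.sqrt(n))

# Hypothetical success-performance sample compared against a reference mean mu0.
t = one_sample_t([100, 102, 98, 101, 99], 105)
```

In practice a library routine such as `scipy.stats.ttest_1samp` computes both the t statistic and the p-value directly; the sketch only shows the statistic itself.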
Thus, the comparison of the success performance depends not only on the values themselves but also on the statistical significance in the one-sample t-test: if the p-value exceeds the significance level, the success performances of PMODE and CMODE are considered to show no difference. As shown in Table 5, it can be observed that for the function error values, PMODE clearly wins on 15 of the 24 test functions (i.e., g03, g04, g06, g07, g08, g10, g13, g14, g15, g17, g18, g21, g23, g24), while CMODE is better on only 4 test functions (i.e., g01, g02, g09, g19). In terms of success performance, PMODE achieves the target error accuracy level with fewer FES on 12 test functions (i.e., g02, g03, g05, g07, g09, g10, g14, g15, g17, g18, g21, g23), while CMODE performs better on only 6 test functions (i.e., g01, g04, g06, g15, g19, g24). It can also be observed that, although PMODE has smaller FES than CMODE on g11, g12 and g13, the p-values of the one-sample t-test exceed the significance level there. Thus, there is no difference between the success performance of PMODE and CMODE on g11, g12 and g13 according to the one-sample t-test.
Prob.  Function Error Value  Success Performance  
PMODE  CMODE  PMODE  CMODE  pvalue  
g01  1.7209E13  0.0000E+00  166889  121077  1.1739E11 
g02  6.7092E04  2.0387E08  179421  189820  1.8520E02 
g03  2.7622E15  1.1665E09  53123  75085  7.2272E12 
g04  3.6379E12  7.6398E11  76745  72748  3.9811E06 
g05  1.8189E12  1.8190E12  26497  28873  3.1053E39 
g06  1.6370E11  3.3651E11  37602  35464  1.3063E03 
g07  2.2851E13  7.9793E11  12394  155968  1.0295E23 
g08  4.1078E17  8.1964E11  5970  5885  7.9285E01 
g09  2.0463E13  9.8198E11  57842  71122  5.1561E15 
g10  7.2759E12  6.2827E11  138412  183255  2.8388E27 
g11  0.0000E+00  0.0000E+00  5655  6023  1.8332E01 
g12  0.0000E+00  0.0000E+00  4643  5009  3.5277E01 
g13  2.1954E16  4.1897E11  29042  30689  9.2308E02 
g14  1.4779E14  8.5159E12  91817  107976  2.8797E14 
g15  1.1368E13  6.0822E11  11839  12855  7.3365E09 
g16  3.7747E15  6.5213E11  30615  29332  1.8059E03 
g17  1.8189E12  1.8189E12  92195  139746  1.7682E07 
g18  2.2204E16  1.5561E11  83586  105020  4.6431E16 
g19  3.9057E09  2.4644E10  264423  251676  3.2721E05 
g21  5.2391E+00  2.6195E+01  101595  128758  4.4012E03 
g23  3.4560E13  4.4772E11  199496  244612  2.7069E11 
g24  3.2862E14  4.6735E12  23728  21820  4.6499E02 
Number of winners  15  4  12  6   
The test problem g20 is not listed in Table 6 since no feasible solution can be found for it. From Table 6, it can be seen that both PMODE and CMODE perform well on all test functions except g20 and g22. PMODE and CMODE have the same feasible rate on all test functions, with an average feasible rate of 95.65% for both. However, PMODE wins again in success rate: although its success rate is not 100% on g02 and g21, PMODE achieves an average of 95.13%, whereas the average success rate of CMODE is 94.78%.
Prob.  Feasible Rate  Success Rate  
PMODE  CMODE  PMODE  CMODE  
g01  100%  100%  100%  100% 
g02  100%  100%  92%  100% 
g03  100%  100%  100%  100% 
g04  100%  100%  100%  100% 
g05  100%  100%  100%  100% 
g06  100%  100%  100%  100% 
g07  100%  100%  100%  100% 
g08  100%  100%  100%  100% 
g09  100%  100%  100%  100% 
g10  100%  100%  100%  100% 
g11  100%  100%  100%  100% 
g12  100%  100%  100%  100% 
g13  100%  100%  100%  100% 
g14  100%  100%  100%  100% 
g15  100%  100%  100%  100% 
g16  100%  100%  100%  100% 
g17  100%  100%  100%  100% 
g18  100%  100%  100%  100% 
g19  100%  100%  100%  100% 
g21  100%  100%  96%  80% 
g22  0%  0%  0%  0% 
g23  100%  100%  100%  100% 
g24  100%  100%  100%  100% 
Mean  95.65%  95.65%  95.13%  94.78% 
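The feasible rate and success rate in Table 6 follow the usual CEC 2006 definitions: the fraction of the 25 runs in which a feasible solution is found, and the fraction in which the target error accuracy is reached. A minimal sketch with a hypothetical run pattern (resembling the g02 row, but not taken from the paper's raw data):

```python
def rates(run_results):
    """run_results: one (found_feasible, reached_target) pair per independent run.
    Returns (feasible_rate, success_rate) as fractions of the total runs."""
    n = len(run_results)
    feasible_rate = sum(f for f, _ in run_results) / n
    success_rate = sum(s for _, s in run_results) / n
    return feasible_rate, success_rate

# Hypothetical 25 runs: all feasible, 23 reach the target accuracy
runs = [(True, True)] * 23 + [(True, False)] * 2
fr, sr = rates(runs)
print(f"{fr:.0%} {sr:.0%}")  # 100% 92%
```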
4.4 Comparison of PMODE, CMODE and all EAs in CEC 2006 Competition
We compare our experimental results with those in the CEC 2006 Competition. The competition data were accessed from the CEC 2006 Special Session website (http://www.ntu.edu.sg/home/EPNSugan/index_files/CEC06/CEC06.htm, accessed 16 March 2018). Ten EAs participated in the competition. Their characteristics were summarized by Barbosa et al. barbosa2010using as follows.

jDE2 huang2006self : a DE algorithm with self-adaptive control parameters and the feasibility rule: a feasible solution is better than an infeasible one, and infeasible solutions are ranked according to the sum of all constraint violations.

DE zielinski2006constrained : the standard DE algorithm, with the same feasibility-rule constraint-handling method as jDE2.

SaDE qin2005self : an extension of the original SaDE. The constraint-handling method is similar to the feasibility rule used by jDE2, but the constraint violations are weighted.

GDE kukkonen2006constrained : this algorithm extends DE to constrained multiobjective optimization. The constraint-handling method is similar to the feasibility rule used by jDE2.

DMSPSO liang2006dynamic : a dynamic multi-swarm PSO algorithm. The constraint-handling method is similar to that of SaDE.

MDE mezura2006modified : a DE-based approach modified to solve constrained optimization problems. Its constraint-handling method is similar to the feasibility rule used by jDE2.

PESO+ munoz2006peso+ : a PSO-based approach with topological organization; its constraint handling is similar to the feasibility rule used by jDE2.

PCX sinha2006population : it is derived from the population-based algorithm generator and uses the parent-centric recombination (PCX) operator and a stochastic remainder selection over three different constraint-handling principles.

ε-DE Takahama2006ConstrainedOB : it uses the ε constrained method and employs a gradient-based mutation/repair operator.

MPDE tasgetiren2006multi : a multi-populated DE algorithm with an adaptive penalty method to handle the constraint violations.
The evaluation criteria used in the competition were as follows: for each algorithm, 25 independent runs were implemented on each benchmark test function within a given maximum number of FES, and a fixed tolerance value was allowed for the equality constraints. The test problem g20 is not considered in the comparison experiment because no feasible solution can be found for it.
Table 7 lists the average feasible rate and success rate over the twenty-three test functions for the twelve EAs (PMODE, CMODE, and the ten EAs that participated in the CEC 2006 Competition). DMSPSO, ε-DE and SaDE always obtain feasible solutions on all twenty-three test problems, while PMODE and CMODE both achieve a feasible rate of 95.65%. DE, MDE and jDE2 have the same feasible rate as PMODE and CMODE. In terms of success rate, ε-DE achieves 95.65%, which is again the highest score. PMODE and CMODE are also competitive in success rate, with 95.13% and 94.78%, respectively.
Algorithms  Feasible Rate  Success Rate 
DE  95.65%  78.09% 
DMSPSO  100%  90.61% 
ε-DE  100%  95.65% 
GDE  92.00%  77.39% 
jDE2  95.65%  80.00% 
MDE  95.65%  87.65% 
MPDE  94.96%  87.65% 
PCX  95.65%  94.09% 
PESO+  95.48%  67.83% 
SaDE  100%  87.13% 
CMODE  95.65%  94.78% 
PMODE  95.65%  95.13% 
EAs  g01  g02  g03  g04  g05  g06  g07  g08 
Best FES  25115  96222  24861  15281  21306  5202  26578  918  
DE  1.3304  1.4017    1.0461  5.0256  1.3731  3.5290  1.1830 
DMSPSO  1.3272  1.8201  1.0289  1.6625  1.3790  5.3126  1.0000  4.4928 
ε-DE  2.3615  1.5571  3.5963  1.7156  4.5729  1.4189  2.7957  1.2407 
GDE  1.6133  1.5543  143.8877  1.0000  9.0821  1.2501  4.6654  1.6002 
jDE2  2.0062  1.5163    2.6653  20.9724  5.6686  4.8064  3.5251 
MDE  3.0011  1.0000  1.8096  2.7198  1.0000  1.0000  7.3069  1.0000 
MPDE  1.7292  3.1694  1.0000  1.3666  10.1600  2.0327  2.1597  1.6498 
PCX  2.1981  1.3292  1.4053  2.0279  4.4478  6.5015  4.4067  3.0784 
PESO+  4.0427  4.2905  18.1268  5.2271  21.2267  10.8627  13.8191  6.6710 
SaDE  1.0000  1.9107  12.0254  1.6430  3.4263  2.4118  1.0398  1.4412 
CMODE  4.8209  1.9727  3.0201  4.7606  1.3551  6.8173  5.8683  6.4106 
PMODE  6.6449  1.8646  2.1368  5.0222  1.2436  7.2283  4.6619  6.5032 
EAs  g09  g10  g11  g12  g13  g14  g15  g16 
Best FES  16152  25520  3000  1308  21732  25220  10458  8730  
DE  1.5976  4.6715  4.4600  3.9021  1.5976  2.7052  5.5429  1.3278 
DMSPSO  1.8237  1.0000  4.8750  4.1356  1.8237  1.0000  2.7634  6.1260 
ε-DE  1.4315  4.1236  5.4733  3.1529  1.4315  4.4980  8.0528  1.4875 
GDE  1.8716  3.2368  2.8200  2.4075  1.8716  9.1247  7.1605  1.5148 
jDE2  3.4001  5.7269  17.9760  4.8593  3.4001  3.8797  23.0812  3.6306 
MDE  1.0000  6.4326  1.0000  1.0000  1.0000  11.5639  1.0000  1.0000 
MPDE  1.3029  1.9055  7.7854  3.2401  1.3029  1.6937  19.1408  1.4963 
PCX  2.8806  3.4886  12.8960  6.8502  2.8806  2.3488  4.4880  3.4817 
PESO+  6.0391  110.8383  150.0333  6.1835  6.0391    43.0388  5.6174 
SaDE  1.3278  1.7307  8.3703  1.9694  1.3278  1.7843  2.5818  1.7123 
CMODE  4.4032  7.1808  2.0076  3.8295  1.4121  4.2813  1.2292  3.3599 
PMODE  3.5811  5.4236  1.8850  3.5496  1.3363  3.6406  1.1320  3.5068 
EAs  g17  g18  g19  g21  g22  g23  g24  g20 
Best FES  26364  28261  21830  38217    129550  1794  EX  
DE  50.3891  2.8151  8.1186  4.2571      1.6856  EX 
DMSPSO    1.1741  1.0000  3.6722    1.6251  10.8004  EX 
ε-DE  3.7498  2.0931  16.3239  3.5362    1.5497  1.6455  EX 
GDE  81.4890  16.9874  10.5489  15.1615    8.2081  1.7051  EX 
jDE2  426.0602  3.6963  9.1548  3.3103    2.7592  5.6834  EX 
MDE  1.0000  3.6617    2.9455    2.7821  1.0000  EX 
MPDE  27.7422  1.5585  5.4180  5.4703    1.6261  2.4204  EX 
PCX  5.1627  2.4779  5.9403  1.0000    1.2900  6.4916  EX 
PESO+    8.2431          11.1371  EX 
SaDE  474.1314  1.0000  2.3896  4.2958    1.0000  2.5775  EX 
CMODE  5.3006  3.7160  11.5289  3.3691    1.8881  12.1627  EX 
PMODE  3.4970  2.9576  12.1128  2.6583    1.5399  13.2263  EX 
Table 8 shows the success performance, i.e., the FES divided by the FES of the best algorithm among the twelve EAs, on the twenty-three test problems. MDE, SaDE and DMSPSO dominate all competing algorithms, including PMODE and CMODE, on success performance, whereas PMODE and CMODE rank eighth and ninth, respectively.
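The normalization used in Table 8 can be sketched as follows; the FES values below are hypothetical, not taken from the paper.

```python
def normalized_success_performance(fes_by_algorithm):
    """Divide each algorithm's success-performance FES by the best
    (smallest) FES among all algorithms, as in Table 8."""
    best = min(fes_by_algorithm.values())
    return {alg: fes / best for alg, fes in fes_by_algorithm.items()}

# Hypothetical FES values for one test function
fes = {"A": 30000, "B": 15000, "C": 45000}
print(normalized_success_performance(fes))  # {'A': 2.0, 'B': 1.0, 'C': 3.0}
```

The best algorithm on a function thus always receives the value 1.0000, as seen along each column of Table 8.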
Table 9 lists the rankings of the twelve EAs in terms of function error values, feasible rate, success rate and success performance. The final rank is calculated from the overall ranking on all four measures. As can be seen, ε-DE and DMSPSO win the first and second places among the twelve EAs, respectively. It is worth mentioning that PMODE, the algorithm proposed in this paper, takes the third place, while CMODE is ranked only seventh. Thus, PMODE gains a clear win against CMODE and is among the top three EAs. This means PMODE is also competitive with other types of EAs.
Algorithms  Function Error Value  Feasible Rate  Success Rate  Success Performance  Final Rank  
DE  9  4  10  6  8 
DMSPSO  4  1  5  3  2 
ε-DE  2  1  1  4  1 
GDE  12  12  11  10  11 
jDE2  10  4  9  11  10 
MDE  7  4  6  1  4 
MPDE  5  11  7  5  8 
PCX  3  4  4  7  4 
PESO+  11  10  12  12  11 
SaDE  9  1  8  1  6 
CMODE  6  4  3  9  7 
PMODE  1  4  2  8  3 
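The text does not spell out the aggregation rule behind the final rank in Table 9. One plausible reading, sketched here under that assumption, is to order algorithms by the sum of their four per-measure ranks, with tied sums sharing a rank (competition-style ranking, as with MDE and PCX, or GDE and PESO+):

```python
def final_ranks(ranks_by_algorithm):
    """ranks_by_algorithm: {name: [rank on each of the four measures]}.
    Orders algorithms by the sum of their per-measure ranks; equal sums
    receive the same rank and the following rank is skipped."""
    totals = {alg: sum(r) for alg, r in ranks_by_algorithm.items()}
    ordered = sorted(totals.values())
    # competition-style rank: 1 + number of strictly better totals
    return {alg: 1 + ordered.index(t) for alg, t in totals.items()}

# Hypothetical per-measure ranks for three algorithms
measures = {"X": [2, 1, 1, 4], "Y": [4, 1, 5, 3], "Z": [9, 4, 10, 6]}
print(final_ranks(measures))  # {'X': 1, 'Y': 2, 'Z': 3}
```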
4.5 Convergence Speed of PMODE
Fig. 14 describes the convergence speed of PMODE. The convergence speed is measured by the average convergence rate, defined as follows he2016average :
(23) $R(t) = 1 - \left( \frac{|f_t - f^*|}{|f_0 - f^*|} \right)^{1/t}$
where $R(t)$ denotes the normalized convergence speed, $t$ the number of the current generation, $f_t$ the best objective value at the $t$-th generation, and $f^*$ the objective value of the known optimal solution. In addition, $R(t)$ may take a negative value since the event $|f_t - f^*| > |f_0 - f^*|$ could happen. This occurs when the initial solution is infeasible but its objective value $f_0$ is less than $f^*$, the objective value of a feasible solution, while a feasible solution found later has a worse objective value. In this case, the convergence speed takes a negative value, as shown by g23 in Fig. 14(a).
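The average convergence rate of he2016average , $R(t) = 1 - (|f_t - f^*| / |f_0 - f^*|)^{1/t}$, can be computed directly from the recorded best-so-far errors. Below is a minimal sketch, assuming `errors[t]` stores $|f_t - f^*|$ for each generation (the values are hypothetical):

```python
def average_convergence_rate(errors):
    """Average convergence rate R(t) = 1 - (errors[t] / errors[0]) ** (1 / t),
    where errors[t] = |f_t - f*| is the error of the best solution at generation t."""
    t = len(errors) - 1
    if t == 0 or errors[0] == 0:
        raise ValueError("need at least two generations and a nonzero initial error")
    return 1.0 - (errors[t] / errors[0]) ** (1.0 / t)

# Hypothetical error trajectory: 1.0 -> 0.3 -> 0.01 over two generations
print(round(average_convergence_rate([1.0, 0.3, 0.01]), 6))  # 0.9
```

If `errors[t]` exceeds `errors[0]` (e.g., the initial best point is infeasible with an objective value below $f^*$), the returned rate is negative, matching the behaviour observed for g23.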
Using the average convergence rate, we can easily evaluate and compare the convergence speed of different algorithms. It is better than the logarithmic rate used in many references wang2012combining because the logarithmic rate itself does not provide any information about the convergence rate; only its slope does. In contrast, the average convergence rate provides a quantitative value of the convergence speed.
Fig. 14 indicates the convergence speed of PMODE on the 24 benchmark functions. To avoid stochastic fluctuations, the plotting stops at a fixed number of FES. Since the convergence speeds differ greatly, the test functions are divided into 8 groups by required FES, and each subfigure contains two to four curves corresponding to its test functions. The horizontal axis represents FES, while the vertical axis represents the average convergence rate. As shown in Figs. 14(a)-14(h), the convergence speed of all test functions follows the same pattern: it starts high, decreases, and becomes steady in the end. The average convergence rate provides a quantitative value of the convergence speed: given its value at the $t$-th generation, the error at that generation can be calculated exactly. The logarithmic rate cannot provide such a value.
For g23, g10 and g21 in Figs. 14(a), 14(d) and 14(g), the negative value of the average convergence rate means that the current error exceeds the initial one. This happens because initially an infeasible solution with a good objective value is generated, but later a feasible solution with a worse objective value is found.
In Fig. 14(h), the function g22 is an intractable problem for PMODE: the search stalls after a certain number of FES, and the function error value does not change thereafter.
5 Conclusions
In this paper, we present a PCA-based method for identifying the valley direction on a valley landscape. Based on this new method, a new search operator, called the PCA-projection, is designed, which projects an individual to a position along the valley direction. Then a new MOEA combining DE, MOEA and the PCA-projection is proposed for solving COPs. Experimental results show that the proposed PMODE not only significantly improves the solution quality compared with CMODE, a state-of-the-art MOEA for COPs, but is also very competitive with the EAs in the CEC 2006 Competition, ranking third among them.
In addition, we demonstrate that the average convergence rate is a simple but useful tool for providing a quantitative value of the convergence speed. It is observed that PMODE behaves differently on the test functions in terms of its convergence speed.
For future work, a potential extension is to apply the PCA-projection to other types of MOEAs for solving COPs, such as the multiobjective evolutionary algorithm based on decomposition zhang2007moea .
References
 (1) Z. Michalewicz, M. Schoenauer, Evolutionary algorithms for constrained parameter optimization problems, Evolutionary computation 4 (1) (1996) 1–32.
 (2) C. A. Coello Coello, Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art, Computer Methods in Applied Mechanics and Engineering 191 (11-12) (2002) 1245–1287.
 (3) E. Mezura-Montes, C. A. Coello Coello, Constraint-handling in nature-inspired numerical optimization: past, present and future, Swarm and Evolutionary Computation 1 (4) (2011) 173–194.
 (4) C. Segura, C. A. C. Coello, G. Miranda, C. León, Using multiobjective evolutionary algorithms for singleobjective constrained and unconstrained optimization, Annals of Operations Research 240 (1) (2016) 217–250.
 (5) P. D. Surry, N. J. Radcliffe, The COMOGA method: constrained optimisation by multiobjective genetic algorithms, Control and Cybernetics 26 (1997) 391–412.
 (6) Y. Zhou, Y. Li, J. He, L. Kang, Multiobjective and MGG evolutionary algorithm for constrained optimisation, in: Proceedings of 2003 IEEE Congress on Evolutionary Computation, IEEE Press, Canberra, Australia, 2003, pp. 1–5.
 (7) C. A. Coello Coello, Constraint-handling using an evolutionary multiobjective optimization technique, Civil Engineering Systems 17 (4) (2000) 319–346.
 (8) S. Venkatraman, G. G. Yen, A generic framework for constrained optimization using genetic algorithms, IEEE Transactions on Evolutionary Computation 9 (4) (2005) 424–435.
 (9) Z. Cai, Y. Wang, A multiobjective optimization-based evolutionary algorithm for constrained optimization, IEEE Transactions on Evolutionary Computation 10 (6) (2006) 658–675.
 (10) Y. Wang, Z. Cai, Combining multiobjective optimization with differential evolution to solve constrained optimization problems, IEEE Transactions on Evolutionary Computation 16 (1) (2012) 117–134.
 (11) Y. Wang, Z. Cai, A dynamic hybrid framework for constrained evolutionary optimization, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 42 (1) (2012) 203–217.
 (12) W.-F. Gao, G. G. Yen, S.-Y. Liu, A dual-population differential evolution with coevolution for constrained optimization, IEEE Transactions on Cybernetics 45 (5) (2015) 1094–1107.
 (13) L. Gao, Y. Zhou, X. Li, Q. Pan, W. Yi, Multiobjective optimization based reverse strategy with differential evolution algorithm for constrained optimization problems, Expert Systems with Applications 42 (14) (2015) 5976–5987.
 (14) T. Xu, J. He, C. Shang, W. Ying, A new multiobjective model for constrained optimisation, in: P. Angelov, A. Gegov, C. Jayne, Q. Shen (Eds.), Advances in Computational Intelligence Systems: the 16th UK Workshop on Computational Intelligence, Springer, 2017, pp. 71–85.

 (15) R. Storn, K. Price, Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces, Journal of Global Optimization 11 (4) (1997) 341–359.
 (16) C. Munteanu, V. Lazarescu, Improving mutation capabilities in a real-coded genetic algorithm, in: Workshops on Applications of Evolutionary Computation, Springer, 1999, pp. 138–149.
 (17) R. Storn, System design by constraint adaptation and differential evolution, IEEE Transactions on Evolutionary Computation 3 (1) (1999) 22–34.

 (18) D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012.

 (19) X. Zhao, W. Lin, Q. Zhang, Enhanced particle swarm optimization based on principal component analysis and line search, Applied Mathematics and Computation 229 (2014) 440–456.
 (20) R. Mankiewicz, The story of mathematics, Cassell, 2000.
 (21) H. J. Barbosa, H. S. Bernardino, A. M. Barreto, Using performance profiles to analyze the results of the 2006 CEC constrained optimization competition, in: Proceedings of 2010 IEEE Congress on Evolutionary Computation, IEEE, 2010, pp. 1–8.
 (22) V. L. Huang, A. K. Qin, P. N. Suganthan, Self-adaptive differential evolution algorithm for constrained real-parameter optimization, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 17–24.
 (23) K. Zielinski, R. Laur, Constrained singleobjective optimization using differential evolution, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 223–230.
 (24) A. K. Qin, P. N. Suganthan, Self-adaptive differential evolution algorithm for numerical optimization, in: Proceedings of 2005 IEEE Congress on Evolutionary Computation, Vol. 2, IEEE, 2005, pp. 1785–1791.
 (25) S. Kukkonen, J. Lampinen, Constrained realparameter optimization with generalized differential evolution, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 207–214.
 (26) J. J. Liang, P. N. Suganthan, Dynamic multiswarm particle swarm optimizer with a novel constrainthandling mechanism, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 9–16.
 (27) E. Mezura-Montes, J. Velázquez-Reyes, C. A. Coello Coello, Modified differential evolution for constrained optimization, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 25–32.
 (28) A. E. Munoz-Zavala, A. Hernandez-Aguirre, E. R. Villa-Diharce, S. Botello-Rionda, PESO+ for constrained optimization, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 231–238.
 (29) A. Sinha, A. Srinivasan, K. Deb, A populationbased, parent centric procedure for constrained realparameter optimization, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 239–245.
 (30) T. Takahama, S. Sakai, Constrained optimization by the ε constrained differential evolution with gradient-based mutation and feasible elites, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 1–8.
 (31) M. F. Tasgetiren, P. N. Suganthan, A multipopulated differential evolution algorithm for solving constrained optimization problem, in: Proceedings of 2006 IEEE Congress on Evolutionary Computation, IEEE, 2006, pp. 33–40.
 (32) J. He, G. Lin, Average convergence rate of evolutionary algorithms, IEEE Transactions on Evolutionary Computation 20 (2) (2016) 316–321.
 (33) Q. Zhang, H. Li, MOEA/D: a multiobjective evolutionary algorithm based on decomposition, IEEE Transactions on Evolutionary Computation 11 (6) (2007) 712–731.