1. Introduction
Evolutionary algorithms are nature-inspired models drawn from observations of organic evolution. Using selection, recombination, and mutation, this family of algorithms evolves populations to search an optimization domain with respect to the fitness of individuals within the population. Related to the fields of biology, numerical optimization, and artificial intelligence, these algorithms may also model a collective learning process, where individuals not only represent a point on the domain of the objective function, but may also represent knowledge of an environment (Bck1996EvolutionaryAI). This family of algorithms is particularly well suited to black-box optimization, where there is no knowledge of the internal structure of the objective function, since there is no need to calculate a gradient to search the landscape, which could be highly nonlinear and rugged with many local optima.
Quantum computing has steadily shown an increasing potential for disrupting limitations imposed by seemingly intractable problems (Kanamori). By leveraging properties of quantum mechanics, such as superposition, entanglement, and interference, researchers have realized theoretical speedups for processes such as order finding, factoring, and search (simons; Shor1999PolynomialTimeAF; grovers). In our current era of noisy intermediate-scale quantum (NISQ) devices (NISQ), quantum computing hardware continues to evolve, though not without growing pains: coherence times are limited and fault tolerance to external noise is imperfect. For algorithms with a provably exponential speedup over classical algorithms, hardware with many physical qubits (quantum bits), low noise, and long coherence times may be required to realize circuits of the required depth and scale. In the near term, however, there exists the possibility to investigate whether subroutines of classical algorithms that may be classically intractable can benefit from qualitative performance enhancements offered by NISQ-era quantum computing systems.
In this work we examine leveraging a quantum annealing system to find solutions to the problem of optimal selection within an evolutionary algorithm, encoded as a binary quadratic model. We examine the trade-off between selective pressure and exploration in the evolutionary search, and show qualitative gains with respect to fitness and expected runtimes in the form of average generations to convergence. We investigate these performance gains with respect to the change in the ratio of μ to λ, i.e., the size of the selected parent pool to the number of offspring, and find that the gap in performance grows as the ratio μ/λ approaches 1/2. This is not without a cost, as we also observe additional overhead in compute times, incurred by making calls across a network to query a quantum processing unit at each generation, an issue also identified in (sharabiani2021quantum). We also confirm findings reported by those authors: for fully connected graphs of input QUBOs (Quadratic Unconstrained Binary Optimization problems) constructed from randomly initialized populations, hybrid quantum-classical solvers outperform fully quantum solvers by finding lower-energy configurations for a given input.
We test our quantum-enhanced algorithms on the IOHprofiler suite for black-box optimization, specifically examining pseudo-Boolean functions (IOHprofiler). We find that for functions with perturbed fitness, quantum-enhanced selection operators achieve slightly better performance than their classical counterparts. More generally, our quantum-enhanced algorithms match or outperform their classical counterparts on a majority of test functions (10 out of 15), tested for significance with p-values < 0.05.
2. Related Works
Quantum-inspired evolutionary algorithms have been well studied over the years, starting with (542334), where classical simulation of quantum mechanical properties was applied to evolutionary search. This culminated in a large body of work with many variants of quantum-inspired algorithms, as described by Zhang in (Zhang2011).
In surveying quantum-inspired algorithms, Zhang noted three types of algorithms which combine quantum computational properties with evolutionary algorithms (Zhang2011). These include:

Evolutionary Designed Quantum Algorithms (EDQA), which leverage evolutionary algorithms to evolve new designs of quantum algorithms

Quantum Evolutionary Algorithms (QEA), where evolutionary algorithms are implemented on a quantum computer

QuantumInspired Evolutionary Algorithms (QIEA), which are algorithms where the evolutionary process is supplemented by routines inspired by quantum mechanics, but implemented using classical hardware.
Along with these, we propose an additional algorithm class. As we are currently in the NISQ era of quantum hardware, we can also examine hybrid quantum-classical algorithms, where some portions or subroutines of the algorithm's execution are performed on a quantum computer while the rest is performed classically. We call this type Quantum-Enhanced Evolutionary Algorithms, and give an example illustration of this concept in Fig. 1.
Studies into quantum-enhanced evolutionary algorithms can be examined from the standpoint of leveraging a quantum device and quantum mechanical properties for the selection, crossover, or mutation operators within the heuristic of the evolutionary search.
Of these evolutionary operators, the idea of a genetic algorithm assisted by quantum annealing was proposed by Chancellor in (ChancellorGenetic), and the authors of (king2019quantumassisted) investigated a quantum-assisted mutation operator leveraging reverse annealing runs on a quantum annealer. By performing quasi-local searches using the quantum-assisted mutation operator, the authors were able to show an improvement over forward quantum annealing in finding global optima for a set of input spin glasses. More recently, an investigation into continuous black-box optimization leveraging a quantum-assisted acquisition function has been reported in (izawa2021continuous). In (sharabiani2021quantum), the authors leverage a quantum annealing system to formulate continuous optimization problems cast within a quantum nonlinear programming framework, and show applications in the green energy space. Sharabiani et al. also identified the overhead in compute times incurred by querying a QPU for a subroutine of their optimization algorithm. To our knowledge, there has been no prior investigation into quantum-enhanced routines for selection, which our work addresses.
Turning our attention away from quantum annealing for a moment, there also exists a stream of research into applying Grover's algorithm for unstructured search to global optimization (Baritompa). While we may not have systems of the scale and fidelity available today to implement these algorithms on a practical level, this stream of research could be further investigated and realized as quantum systems come online with higher orders of available error-corrected qubits and longer coherence times.
To motivate our work, we frame the selection process as a Maximum Diversity Problem, where we wish to select a subset of parents for crossover and/or mutation which preserves a high degree of quality diversity within the parent pool, with low genotype similarity between parents, while preserving a high degree of fitness with regard to the objective function. Maximum Diversity Problems have been shown to be NP-Hard (Duarte07tabusearch; DeAndrade). Our formulation differs from classic Maximum Diversity Problems in that we do not use Euclidean distance as our distance function, but investigate other distance functions such as Hamming distance. This type of combinatorial optimization scales with respect to the input size n, and is dominated by 2^n, as there is a binary decision variable for each individual within the population. However, we propose that by leveraging a quantum processing unit, or other hybrid quantum-enhanced or quantum-inspired methods, we may be able to sample approximate solutions of comparable quality to other techniques and heuristics. This is the main motivation for leveraging a quantum computing system in this work.
3. Methods
3.1. Evolutionary Algorithms
Evolutionary Algorithms model processes observed in nature, including natural selection, reproduction, and mutation. With regard to global optimization, these processes are leveraged to evolve individual solutions with respect to an objective or fitness function. In the case of a Genetic Algorithm for pseudo-Boolean optimization, a population of individuals P := [x_1, …, x_n] is initialized, where the bit values of each x_i are generated uniformly at random. At each generation, individuals (also referred to as chromosomes in the case of genetic algorithms) are selected from the population, recombined to produce offspring, and mutated, resulting in a new population. Over time, individuals converge toward optima of the objective function with respect to their fitness values, given by a black-box objective function f. These properties make evolutionary algorithms powerful candidates for optimizing black-box functions where the domain and modality of the function are unknown, for both discrete optimization, as in the case of pseudo-Boolean optimization, and continuous optimization.
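As a concrete illustration of this generational loop, a minimal genetic algorithm for a pseudo-Boolean objective might look like the sketch below. OneMax is used as the objective; the population size, truncation selection, one-point crossover, and mutation rate are illustrative choices of ours, not the paper's exact settings.

```python
import random

def one_max(x):
    # Fitness of a bitstring: the number of ones.
    return sum(x)

def evolve(n=20, pop_size=10, generations=100, mutation_rate=0.05, seed=0):
    rng = random.Random(seed)
    # Initialize a population of random bitstrings.
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    best = max(pop, key=one_max)
    for _ in range(generations):
        # Selection: keep the fitter half of the population as parents.
        pop.sort(key=one_max, reverse=True)
        parents = pop[: pop_size // 2]
        # Recombination and mutation produce the next generation.
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            # Mutation: flip each bit with small probability.
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            children.append(child)
        pop = children
        # Track the best solution found so far.
        gen_best = max(pop, key=one_max)
        if one_max(gen_best) > one_max(best):
            best = gen_best
    return best

best = evolve()
```

With these settings the search reliably climbs close to the all-ones optimum within the generation budget.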
3.2. Selection Operators for Evolutionary Algorithms
In selecting parents from a population pool for mutation and crossover, we may choose from a number of selection operators. These operators may be deterministic or probabilistic, and include:

(μ, λ) selection: In (μ, λ) selection, μ individuals are selected based on the rank of their fitness values to create λ offspring. In this case, only the λ child offspring are included in the subsequent population.

(μ + λ) selection: In (μ + λ) selection, μ individuals are selected based on the rank of their fitness values to create λ offspring. (μ + λ) selection is elitist, meaning the μ parents are included in subsequent generations. In our experiments, we also tracked and recombined the best solution found so far with selected members of the population to create subsequent generations.
We compare these operators against our quantum-enhanced operator in order to contrast the trade-offs between exploration and exploitation in our evolutionary search.
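As a minimal sketch (our own illustration; the function and variable names are ours, and offspring generation is abstracted away), the two truncation schemes differ only in which pool the next generation is drawn from:

```python
def comma_selection(parents, offspring, mu, fitness):
    # (mu, lambda): the next generation is drawn from the offspring only,
    # so all parents are discarded (non-elitist).
    return sorted(offspring, key=fitness, reverse=True)[:mu]

def plus_selection(parents, offspring, mu, fitness):
    # (mu + lambda): parents compete with offspring, so the best
    # individuals found so far can survive indefinitely (elitist).
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]

ones = lambda x: sum(x)
parents = [[1, 1, 1, 0], [1, 1, 0, 0]]
offspring = [[1, 0, 0, 0], [0, 0, 0, 0], [1, 1, 0, 0]]
```

Under plus selection the fit parent [1, 1, 1, 0] survives, whereas comma selection discards it even though every offspring is worse; this is the exploitation/exploration trade-off discussed above.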
3.3. A QuantumEnhanced Selection Operator
Maximum diversity problems are characterized by selecting elements from a set which maximize diversity within the selected subset. Kuo et al. (Kuo1993) gave a formulation for this set of problems as a binary quadratic model, which was also shown to be NP-Hard.
When given a set of candidates within a population pool, a natural question is how to select parents which have a high degree of fitness yet are also diverse from one another, with the idea that we want to balance the trade-off between exploitation and exploration in our selection mechanism. Low population diversity may lead to more localized search and premature convergence. Similar to the feature subset selection problem in machine learning (vondollen2021quantumassisted), the problem of selecting the optimal subset of parents from a population pool can be framed as a binary quadratic model. In our formulation we start by defining a matrix Q:

(1) Q_ij = -α f(x_i) if i = j, and Q_ij = -β d(x_i, x_j) if i ≠ j

where f(x_i) is the fitness evaluated for the chromosome x_i, and d(x_i, x_j) is the pairwise distance metric between chromosomes within the population. Note that for this formulation the distance metric may be arbitrarily chosen by the practitioner; in our case we use Hamming distance, where we negate the quadratic term, as bit strings with higher values for Hamming distance are more distant in the sampled hypercube.
We introduce the terms α and β as scaling constants, which we may use to adjust the optimization domain of the binary quadratic model for more or less ruggedness in order to leverage the effect of quantum tunneling. Depending on the choice of distance metric, one may want to use a negative value for the scaling term applied to the quadratic terms of the QUBO in the case where a higher value of the distance metric represents a higher degree of similarity or correlation. We may tune the parameter α to increase or decrease the selective pressure, giving more or less weight to individuals with respect to their fitness. The term β allows us to increase or decrease the diversity of the selected population with regard to the distance between chromosomes. Overall, the action of the two terms helps to maintain selective pressure while also achieving a balance of diversity within the selected population.
The matrix Q acts as input for our resulting minimization problem, where we wish to find an optimal assignment of qubit values, represented as z, where z ∈ {0, 1}^n and each z_i maps to an index of the population of size n; we then select the μ members of the population with a value of z_i = 1:

(2) z* = argmin_{z ∈ {0,1}^n} z^T Q z
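Under our reading of Equations 1 and 2, the construction of Q and the selection step can be sketched in plain Python. The α and β values, the exhaustive solver standing in for the quantum annealer, and the exact-cardinality handling are illustrative assumptions of ours, not the paper's implementation:

```python
from itertools import combinations, product

def hamming(a, b):
    # Pairwise Hamming distance between two bitstrings.
    return sum(x != y for x, y in zip(a, b))

def build_qubo(pop, fitness, alpha=1.0, beta=0.5):
    # Q[i][i] = -alpha * f(x_i): rewards selecting fit individuals.
    # Q[i][j] = -beta * d(x_i, x_j): rewards selecting mutually distant ones.
    n = len(pop)
    Q = [[0.0] * n for _ in range(n)]
    for i in range(n):
        Q[i][i] = -alpha * fitness(pop[i])
    for i, j in combinations(range(n), 2):
        Q[i][j] = -beta * hamming(pop[i], pop[j])
    return Q

def brute_force_select(Q, mu):
    # Stand-in for the annealer: exhaustively minimize z^T Q z over
    # assignments that select exactly mu individuals (fine for tiny n).
    n = len(Q)
    def energy(z):
        return sum(Q[i][j] * z[i] * z[j] for i in range(n) for j in range(i, n))
    best = min((z for z in product([0, 1], repeat=n) if sum(z) == mu), key=energy)
    return [i for i, zi in enumerate(best) if zi == 1]

pop = [[1, 1, 1, 1], [1, 1, 1, 0], [0, 0, 0, 1], [0, 0, 0, 0]]
selected = brute_force_select(build_qubo(pop, sum), mu=2)
```

With these weights, fitness dominates and the two fittest (but similar) individuals are chosen; increasing β shifts the minimum toward more mutually distant pairs, which is exactly the selective-pressure/diversity dial described above.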
Using this selected population subset of size μ, we may then perform crossover and mutation classically, creating a new population, and increment to the next generation. For our experiments we investigate elitist and non-elitist versions of the quantum-enhanced operator, where the μ parents are included in the former case and not in the latter. We also include the heuristic of recombining the selected population with the best solution found so far in the elitist versions of both the classical and quantum-enhanced operators. We outline the pseudocode for these algorithms in Fig. 2.
4. Experiments
4.1. Benchmarking Quantum, Hybrid, and Classical Solvers
For our version of a quantum-enhanced evolutionary algorithm, we make calls to a D-Wave quantum annealer via its quantum processing unit (QPU). During the annealing regime, the system starts in a superposition over all qubit values, and by gradually reducing the amplitude of a transverse field, is driven toward a ground state. By leveraging quantum-mechanical properties such as entanglement and superposition, we may observe an effect known as quantum tunneling, where barriers in the optimization landscape are passed through rather than walked or sampled over.
The D-Wave 2000Q QPU is composed of 2000 qubits and 5600 couplers, with 128,000 Josephson junctions. Because the QPU does not have full connectivity (it uses a Chimera architecture), a minor embedding is created to model the fully connected graph on the chip. For our purposes we used D-Wave's software tools to automatically create a minor embedding on the QPU for our problem to be sampled (ocean).
Before approaching the problem of utilizing the quantum-enhanced selection operator, it is natural to ask how well a particular solver may find energy minima for our formulation of the binary quadratic model. In order to ascertain solution quality, we randomly initialized populations of solutions according to step 1 of Algorithm 1 in Fig. 2, from which we constructed Q matrices according to Equations 1 and 2. We then ran trials for each solver type, with the set of solvers consisting of:

D-Wave 2000Q - D-Wave Sampler (DwS)

D-Wave 2000Q - D-Wave Clique Sampler (DwCS)

Leap Hybrid Sampler (LHS)

Simulated Annealing (SA)

Steepest Descent (SD)
For the quantum samplers, D-Wave provides tools for embedding the QUBO on the chip. In the case of the D-Wave Sampler, a minor embedding was constructed using the embedding composite tool to map the problem onto the QPU. For the D-Wave Clique Sampler, the tool attempts to find a clique embedding on the chip with equal chain lengths. An important parameter, the chain strength, was set to the maximum absolute value of the linear terms of the initialized binary quadratic model for both quantum samplers and embedding tools (ocean).
D-Wave also provides access to a hybrid sampler, which leverages both classical and quantum calls within its subroutines. This technology is proprietary to D-Wave, and therefore we treat this sampler as a black box and assume that some component of the subroutine leverages calls to a quantum processing unit. For the classical solvers, simulated annealing and steepest descent are relatively straightforward in their implementations and may be reviewed in D-Wave's documentation (ocean).
In our tests we examined varying the values of the parameters α and β to see if there was any change in solution quality according to the distributions of minimum energies found by each sampler. Generally, we found that the hybrid sampler achieved the best performance, and the fully quantum samplers were less performant (Fig. 3, Fig. 4) when run over the same set of input QUBOs. This could be attributed to the fully connected nature of the input problem, where the embedding found for the decomposition of the fully connected graph for the quantum samplers may not be optimal with respect to the QPU architecture. Since the hybrid sampler achieved the best results over different values of α and β (Fig. 3, 4), we used this sampler for our experiments over the OneMax function and functions from IOHProfiler.
4.2. OneMax
In our initial experiments we benchmarked our quantum-enhanced genetic algorithm against genetic algorithms with (μ, λ) and (μ + λ) selection over the OneMax function.
Following (SE91), the members of our population are bit vectors of length n, x = (x_1, …, x_n), where x_i ∈ {0, 1}. We wish to find the bitstring which maximizes the function:

(3) f(x) = Σ_{i=1}^{n} x_i

The optimum is the all-ones vector, where f(x*) = n.
4.3. IOHExperimenter
For our experiments, we used the IOHProfiler software library (IOHprofiler) for black-box optimization. IOHProfiler contains a suite of pseudo-Boolean functions with which to benchmark optimization algorithms. The functions selected for our benchmarking are function IDs (FIDs) 4-18 (Table 1). Functions 4-17 are W-model-transformed variants of OneMax and LeadingOnes, using dummy variables (DV), neutrality (Neu), epistasis (Eps), and fitness perturbation (FP).
Table 1. W-model-transformed objective functions

FID  function  DV  Neu  Eps  FP 
4  OneMax  1  1  
5  OneMax  1  1  
6  OneMax  3  1  
7  OneMax  4  1  
8  OneMax  1  1  
9  OneMax  1  1  
10  OneMax  1  1  
11  LeadingOnes  1  1  
12  LeadingOnes  1  1  
13  LeadingOnes  3  1  
14  LeadingOnes  4  1  
15  LeadingOnes  1  1  
16  LeadingOnes  1  1  
17  LeadingOnes  1  1 
Function 18 from IOHProfiler is an instance of the Low Autocorrelation Binary Sequence (LABS) problem, where the fitness is determined by the reciprocal of the sequence's autocorrelation (IOHprofiler).
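As a hedged sketch, assuming the common merit-factor definition F(s) = n^2 / (2 E(s)), where E(s) is the sum of squared aperiodic autocorrelations (IOHProfiler's exact scaling may differ), the LABS fitness can be computed as:

```python
def labs_fitness(s):
    # s is a +/-1 sequence; C_k = sum_i s_i * s_{i+k} is the aperiodic
    # autocorrelation at lag k, and E(s) is the sum of the squared C_k.
    n = len(s)
    energy = sum(
        sum(s[i] * s[i + k] for i in range(n - k)) ** 2
        for k in range(1, n)
    )
    # Merit factor F = n^2 / (2 E); higher is better (lower autocorrelation).
    return n * n / (2 * energy)
```

For s = [1, 1, -1], the lags give C_1 = 0 and C_2 = -1, so E = 1 and F = 9/2.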
4.4. Experiment Parameters, Performance Metrics and Quality Diversity
In order to compare selection operators as part of a larger heuristic, we held some parameters of the genetic algorithm static across all experiments. For our mutation rate, we used a rate of 0.2, and we used a fixed chromosome length for all experiments. For the elitist versions of the classical and quantum-enhanced algorithms, we tracked the best solution found so far and recombined it with the parent pool chosen by the selection operator to create offspring. In the non-elitist versions, we only used the parent pool and recombined with members of the population drawn with uniform probability. For our population size, we chose a size of 50 for all experiments, and examined the change in the size of the selected parent pool, μ, in relation to the number of offspring, λ. For the (μ, λ) operator, we set the number of children per parent to λ/μ. We fixed the settings of μ and λ per experiment for both the (μ, λ) and (μ + λ) variants.
In our experiments, we examined the expected runtime, which we define as the average number of generations to reach the target solution per run, with a total of 20 runs for each experiment. We set a budget of 50 generations per experiment. We also recorded the best fitness values found at each generation, averaged over all runs.
We measured genotype diversity at each generation by taking the average of the pairwise Hamming distances between all members of the population, averaging these over the generations within each individual run, and finally averaging over all 20 runs.
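A minimal sketch of this metric for a single population follows; normalizing by chromosome length is our assumption, which matches diversity values reported on a 0-1 scale:

```python
from itertools import combinations

def hamming(a, b):
    # Pairwise Hamming distance between two bitstrings.
    return sum(x != y for x, y in zip(a, b))

def genotype_diversity(pop):
    # Average pairwise Hamming distance over all unordered pairs,
    # normalized by chromosome length so the result lies in [0, 1].
    n = len(pop[0])
    pairs = list(combinations(pop, 2))
    return sum(hamming(a, b) for a, b in pairs) / (len(pairs) * n)
```

A population of identical chromosomes scores 0; a maximally spread population approaches 1.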
5. Results
We plotted the results as distributions of energies per sampler on random initializations of populations with varying values of α and β in Figures 3 and 4. We plotted the gap observed in the change of the ratio μ/λ vs. average generations to convergence over OneMax in Figure 5. We plotted the average fitness and log of average genotype diversity over the OneMax function in Figures 6 and 7. We tabulated performance results for quantum-enhanced vs. classical algorithms in Tables 2, 3, and 4, highlighting the best performing in bold, ranked in order by average fitness, average generations to convergence, and average genotype diversity. For FIDs 11, 12, 13, and 18, the elitist quantum-enhanced operator outperformed the other operators in terms of fitness and average generations to convergence. For FIDs 7, 10, 14, 15, 16, and 17, the non-elitist quantum-enhanced operator outperformed the other versions with regard to fitness values. To verify the significance of these results, we ran t-tests for independence over the samples of trials of the quantum-enhanced algorithms vs. their classical versions, and found all within the significant range (p < 0.05).
6. Discussion
In examining the performance of the quantum-enhanced algorithms, we plotted the average generations to convergence and genotype diversity over the OneMax function as shown in Fig. 6 and Fig. 7. In comparing the quantum-enhanced methods to their classical counterparts, we see that the quantum-enhanced operators achieve higher velocity toward convergence in Fig. 7, indicating higher selective pressure with lower average generations to convergence. Taking a closer look at the difference in average generations between the quantum-enhanced operator and its classical counterpart in Fig. 5, we notice a gap in expected runtime in generations as the ratio μ/λ approaches 0.5. This could be attributed to higher weightings of α vs. β, where preference is given to fitness within the QUBO of the quantum-enhanced selection mechanism. We also see that there is a trade-off when comparing the average log genotype diversity of populations, as indicated in Fig. 6, where the quantum-enhanced operator shows on average a higher log diversity than its classical counterpart. Surprisingly, when testing over the IOHProfiler suite, we notice the average genotype diversity being lower for the quantum-enhanced operators than for their classical counterparts. This could also be due to the weighting of the α and β terms, and the trade-off in optimizing for fitness and lower expected runtime vs. diversity. We also see this trade-off applied to FIDs 4-18 in Tables 2-4, where the elitist versions achieve higher fitness on functions with dummy variables and neutrality, and the non-elitist versions perform well on functions with fitness perturbations. This indicates that diversity may help to overcome these perturbations, which may otherwise lead searches with higher velocity to converge into local optima, and that achieving a proper balance in the weightings of the QUBO terms when optimizing for this criterion is crucial.
Regarding the trade-off of exploration vs. exploitation for elitist vs. non-elitist algorithms, we note that the main difference is recombining the best solution found so far with the selected population in the elitist cases. In the cases where we tuned the scaling terms, we noticed a decrease in diversity in exchange for an increase in velocity. This leads us to believe that there may be potential in future work toward characterizing the objective function and incorporating a switching mechanism within the optimization routine, based upon whether exploitation or exploration may be more advantageous.
While we used a static mutation rate of 0.2 in our experiments, we noticed the interplay between selection, crossover, and mutation. As we wanted to examine only the effects of the selection operator, we kept the other operators static, but we believe that future work could examine adaptive population sizes, as well as the effect of adaptive mutation rates, which could help to counteract the diminished genotype diversity toward the end of the optimization regime observed in Fig. 6.
7. Conclusion
We conclude that our quantum-enhanced selection operator shows advantages in velocity and exploration within the population selection mechanism, although there is also a trade-off in the latency of compute times on current NISQ devices. Future work could extend this method to Evolution Strategies to search over continuous function domains, as well as to potential applications such as hyperparameter optimization and neural architecture search for machine learning. Future work could also incorporate streams of research identified in the related works section, such as Grover's search for global optimization. Finally, future work could examine quantum-enhanced surrogate modeling for both single- and multi-objective optimization.
8. Acknowledgments
The authors declare no conflicts of interest.
Table legend:
- Quantum-enhanced selection operator with elitism and recombination with the best solution found so far.
- Quantum-enhanced selection operator without elitism, recombining only members of the selected population without tracking the best solution.
- Genotype diversity (GD): the average diversity measured between genotypes in a population.
- Average generations (G) to convergence or stopping criterion, also denoted as expected runtime.
Table 2. IOH Function results: mean fitness (20 trials)

IOH FID  Mean Fitness  Mean Fitness  Mean Fitness  Mean Fitness 
4  25 (+/ 0.0)  25 (+/ 0)  25 (+/ 0)  25 (+/ 0) 
5  44.95 (+/ 0.21)  45.0 (+/ 0.0)  45 (+/ 0.0)  45.0 (+/ 0.0)
6  16 (+/ 0.0)  15.95 (+/ 0.2)  16 (+/ 0.0)  16 (+/ 0.0) 
7  43.45 (+/ 1.01)  43.05 (+/ 1.74)  46.65 (+/ 1.31)  43.45 (+/ 1.65)
8  25.25 (+/ 0.43)  24.85 (+/ 0.57)  25.4 (+/ 0.0)  25.95 (+/ 0.0) 
9  49.1 (+/ 0.83)  48.9 (+/ 0.7)  49.6 (+/ 0.21)  49.9 (+/ 0.3) 
10  34.5 (+/ 4.15)  34.5 (+/ 3.1)  48.0 (+/ 2.0)  45.5 (+/ 2.29)
11  25 (+/ 0.0)  25 (+/ 0.0)  24.3 (+/ 1.3)  23.9 (+/ 1.69) 
12  42.1 (+/ 2.9)  40.0 (+/ 5.2)  25.45 (+/ 3.2)  23.5 (+/ 3.45) 
13  16 (+/ 0.0)  16 (+/ 0.0)  16 (+/ 0.0)  16 (+/ 0.0) 
14  10.1 (+/ 4.7)  10.25 (+/ 4.14)  12.35 (+/ 5.8)  9.65 (+/ 3.2) 
15  11.25 (+/ 4.3)  11.8 (+/ 5.3)  13.3 (+/ 1.9)  12.15 (+/ 2.15)
16  17.55 (+/ 7.14)  17.4 (+/ 6.0)  24.95 (+/ 4.2)  20.85 (+/ 5.1) 
17  9.25 (+/ 3.34)  9.5 (+/ 4.15)  16.9 (+/ 5.1)  10.75 (+/ 4.26) 
18  4.01 (+/ 0.35)  3.81 (+/ 0.31)  2.79 (+/ 0.23)  3.09 (+/ 0.32) 
Table 3. IOH Function results: average generations to convergence (20 trials)

IOH FID  Average G  Average G  Average G  Average G 
4  10.2 (+/ 1.9)  9.8 (+/ 2.1)  11.8 (+/ 1.5)  11.85 (+/ 1.95) 
5  23.2 (+/ 6.6)  21.3 (+/ 2.7)  30.15 (+/ 4.4)  26.55 (+/ 4.9) 
6  12.25 (+/ 7.1)  21.3 (+/ 15.0)  9.7 (+/ 2.3)  8.85 (+/ 2.55) 
7  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0) 
8  46 (+/ 7.45)  49.55 (+/ 1.35)  44.75 (+/ 8.04)  35.4 (+/ 5.64) 
9  44.2 (+/ 9.5)  48.35 ( +/ 5.1)  44.1 (+/ 6.26)  38.55 (+/ 7.2) 
10  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0) 
11  17.85 (+/ 4.8)  20.75 (+/ 5.7)  30.7 (+/ 9.3)  43.55 (+/ 8.17) 
12  48.15 (+/ 3.7)  45.5 (+/ 5.2)  50 (+/ 0.0)  50 (+/ 0.0) 
13  9.85 (+/ 4.7)  14.3 (+/ 10.6)  21.65 (+/ 6.5)  22.5 (+/ 9.48) 
14  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0) 
15  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0) 
16  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0) 
17  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0) 
18  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0)  50 (+/ 0.0) 
Table 4. IOH Function results: average genotype diversity (20 trials)

IOH FID  Average GD  Average GD  Average GD  Average GD 
4  0.45 (+/ 0.02)  0.68 (+/ 0.02)  0.62 (+/ 0.0)  0.68 (+/ 0.01) 
5  0.4 (+/ 0.01)  0.68 (+/ 0.02)  0.57 (+/ 0.01)  0.69 (+/ 0.02) 
6  0.45 (+/ 0.04)  0.68 (+/ 0.02)  0.63 (+/ 0.01)  0.69 (+/ 0.02) 
7  0.35 (+/ 0.01)  0.67 (+/ 0.02)  0.56 (+/ 0.01)  0.68 (+/ 0.02) 
8  0.37 (+/ 0.01)  0.67 (+/ 0.01)  0.57 (+/ 0.01)  0.67 (+/ 0.01) 
9  0.36 (+/ 0.02)  0.68 (+/ 0.01)  0.55 (+/ 0.01)  0.68 (+/ 0.02) 
10  0.36 (+/ 0.02)  0.68 (+/ 0.02)  0.59 (+/ 0.0)  0.68 (+/ 0.02) 
11  0.45 (+/ 0.02)  0.68 (+/ 0.01)  0.58 (+/ 0.01)  0.69 (+/ 0.02) 
12  0.40 (+/ 0.01)  0.68 (+/ 0.02)  0.59 (+/ 0.0)  0.68 (+/ 0.02) 
13  0.48 (+/ 0.04)  0.69 (+/ 0.01)  0.61 (+/ 0.01)  0.68 (+/ 0.02) 
14  0.35 (+/ 0.01)  0.68 (+/ 0.02)  0.61 (+/ 0.01)  0.68 (+/ 0.01) 
15  0.36 (+/ 0.01)  0.68 (+/ 0.02)  0.59 (+/ 0.0)  0.68 (+/ 0.02) 
16  0.36 (+/ 0.01)  0.69 (+/ 0.02)  0.59 (+/ 0.01)  0.67 (+/ 0.02)
17  0.34 (+/ 0.01)  0.68 (+/ 0.02)  0.61 (+/ 0.01)  0.68 (+/ 0.02) 
18  0.34 (+/ 0.01)  0.69 (+/ 0.01)  0.64 (+/ 0.0)  0.68 (+/ 0.02) 