Safe Crossover of Neural Networks Through Neuron Alignment

03/23/2020
by Thomas Uriot, et al.

One of the main and largely unexplored challenges in evolving the weights of neural networks using genetic algorithms is to find a sensible crossover operation between parent networks. Indeed, naive crossover leads to functionally damaged offspring that do not retain information from the parents. This is because neural networks are invariant to permutations of neurons, giving rise to multiple ways of representing the same solution. This is often referred to as the competing conventions problem. In this paper, we propose a two-step safe crossover (SC) operator. First, the neurons of the parents are functionally aligned by computing how well they correlate, and only then are the parents recombined. We compare two ways of measuring relationships between neurons: Pairwise Correlation (PwC) and Canonical Correlation Analysis (CCA). We test our safe crossover operators (SC-PwC and SC-CCA) on MNIST and CIFAR-10 by performing arithmetic crossover on the weights of feed-forward neural network pairs. We show that it effectively transmits information from parents to offspring and significantly improves upon naive crossover. Our method is computationally fast, can serve as a way to explore the fitness landscape more efficiently, and makes safe crossover a potentially promising operator in future neuroevolution research and applications.
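To make the two-step procedure concrete, the sketch below shows one plausible implementation in Python for a single hidden layer: the hidden neurons of parent B are matched to those of parent A by maximising pairwise activation correlations (here solved as an assignment problem with SciPy's linear_sum_assignment), and the aligned weights are then blended by arithmetic crossover. The function names, the Hungarian-style matching step, and the interpolation coefficient t are illustrative assumptions, not taken from the paper's code; the CCA-based variant is not shown.

```python
# Hypothetical sketch of the two-step safe crossover described in the abstract:
# (1) align parent B's hidden neurons to parent A's using pairwise correlations
# of their activations, (2) apply arithmetic crossover on the aligned weights.
import numpy as np
from scipy.optimize import linear_sum_assignment


def align_hidden_neurons(acts_a, acts_b):
    """Return a permutation of parent B's hidden neurons matching parent A's.

    acts_a, acts_b: (n_samples, n_hidden) activation matrices obtained by
    running both parents on the same batch of inputs.
    """
    # Standardise activations so the dot product approximates Pearson correlation.
    a = (acts_a - acts_a.mean(0)) / (acts_a.std(0) + 1e-8)
    b = (acts_b - acts_b.mean(0)) / (acts_b.std(0) + 1e-8)
    corr = a.T @ b / acts_a.shape[0]          # (n_hidden, n_hidden) correlation matrix
    # Maximise total absolute correlation via the assignment problem.
    _, col = linear_sum_assignment(-np.abs(corr))
    return col                                # col[i] = B-neuron matched to A-neuron i


def safe_arithmetic_crossover(w_in_a, w_out_a, w_in_b, w_out_b, perm, t=0.5):
    """Arithmetic crossover of one hidden layer after aligning parent B.

    w_in_*:  (n_hidden, n_inputs)  incoming weights of each parent
    w_out_*: (n_outputs, n_hidden) outgoing weights of each parent
    perm:    permutation returned by align_hidden_neurons
    t:       interpolation coefficient of the arithmetic crossover
    """
    # Permute the rows of B's incoming weights and the columns of B's outgoing
    # weights so functionally similar neurons are averaged together.
    w_in_b_aligned = w_in_b[perm, :]
    w_out_b_aligned = w_out_b[:, perm]
    child_in = t * w_in_a + (1 - t) * w_in_b_aligned
    child_out = t * w_out_a + (1 - t) * w_out_b_aligned
    return child_in, child_out
```

In this reading, the activation matrices would be collected by forwarding the same mini-batch through both parents, and biases could be permuted and interpolated in the same way as the incoming weight rows; deeper networks would repeat the alignment layer by layer.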
