Neural Architecture Search using Progressive Evolution

03/03/2022
by Nilotpal Sinha, et al.

Vanilla neural architecture search (NAS) using evolutionary algorithms (EA) evaluates each architecture by training it from scratch, which is extremely time-consuming. This cost can be reduced by using a supernet, whose weight-sharing nature allows it to estimate the fitness of every architecture in the search space. However, the estimated fitness is very noisy because the operations in the supernet co-adapt. In this work, we propose a method called pEvoNAS, wherein the whole neural architecture search space is progressively reduced to smaller regions containing good architectures. This is achieved by using a trained supernet to evaluate architectures during a genetic-algorithm search that identifies search space regions with good architectures. Upon reaching the final reduced search space, the supernet is then used to search for the best architecture in that space using evolution. The search is further enhanced by weight inheritance, whereby the supernet for the smaller search space inherits its weights from the previously trained supernet for the larger search space. Experimentally, pEvoNAS gives better results on CIFAR-10 and CIFAR-100 while using significantly fewer computational resources than previous EA-based methods. The code for our paper can be found at https://github.com/nightstorm0909/pEvoNAS
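The progressive-reduction idea in the abstract can be illustrated with a minimal toy sketch: a genetic algorithm searches a discrete per-edge operation space, the top-ranked architectures define a smaller region, and the search repeats inside that region. Everything here is a hypothetical stand-in, not the authors' implementation: `fitness` is a synthetic score substituting for supernet-based accuracy estimation, and the operation list, edge count, and reduction rule are illustrative choices only.

```python
import random

# Toy per-edge operation choices; a stand-in for a cell-based NAS search space.
OPS = ["skip", "conv3x3", "conv5x5", "maxpool"]
NUM_EDGES = 4
TARGET = (1, 2, 1, 3)  # hypothetical "best" architecture for the toy fitness

def fitness(arch):
    # Synthetic proxy for supernet-estimated accuracy: counts matching edges.
    return sum(a == t for a, t in zip(arch, TARGET))

def random_arch(space):
    return tuple(random.choice(space[e]) for e in range(NUM_EDGES))

def mutate(arch, space):
    # Re-sample the operation on one randomly chosen edge.
    e = random.randrange(NUM_EDGES)
    a = list(arch)
    a[e] = random.choice(space[e])
    return tuple(a)

def ga(space, pop_size=20, generations=10):
    # Simple elitist genetic algorithm over the current search space.
    pop = [random_arch(space) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(random.choice(elite), space)
                       for _ in range(pop_size - len(elite))]
    pop.sort(key=fitness, reverse=True)
    return pop

def reduce_space(space, top_archs):
    # Shrink each edge to the operations appearing among the top architectures.
    return [sorted({a[e] for a in top_archs}) for e in range(NUM_EDGES)]

def pevonas_sketch(num_stages=3):
    space = [list(range(len(OPS))) for _ in range(NUM_EDGES)]
    for _ in range(num_stages):
        pop = ga(space)
        space = reduce_space(space, pop[:5])
        # In the actual method, a supernet for this smaller space would be
        # trained here, inheriting weights from the previous supernet.
    return ga(space)[0]  # final evolutionary search in the reduced space

random.seed(0)
best = pevonas_sketch()
```

The reduction rule (keeping only operations used by the top five architectures) is one plausible way to carve out a "good" region; the paper's actual region-selection criterion may differ.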

Related research

- Novelty Driven Evolutionary Neural Architecture Search (04/01/2022): Evolutionary algorithms (EA) based neural architecture search (NAS) invo...

- Evolving Search Space for Neural Architecture Search (11/22/2020): The automation of neural architecture design has been a coveted alternat...

- Genetic Neural Architecture Search for automatic assessment of human sperm images (09/20/2019): Male infertility is a disease which affects approximately 7 morphology a...

- Can weight sharing outperform random architecture search? An investigation with TuNAS (08/13/2020): Efficient Neural Architecture Search methods based on weight sharing hav...

- EfficientNetV2: Smaller Models and Faster Training (04/01/2021): This paper introduces EfficientNetV2, a new family of convolutional netw...

- Augmenting Novelty Search with a Surrogate Model to Engineer Meta-Diversity in Ensembles of Classifiers (01/30/2022): Using Neuroevolution combined with Novelty Search to promote behavioural...

- Neural Architecture Search using Covariance Matrix Adaptation Evolution Strategy (07/15/2021): Evolution-based neural architecture search requires high computational r...
