Novelty Driven Evolutionary Neural Architecture Search

04/01/2022
by Nilotpal Sinha, et al.

Evolutionary algorithm (EA)-based neural architecture search (NAS) involves evaluating each architecture by training it from scratch, which is extremely time-consuming. This cost can be reduced by using a supernet to estimate the fitness of an architecture, thanks to weight sharing among all architectures in the search space. However, the estimated fitness is very noisy due to the co-adaptation of the operations in the supernet, which causes NAS methods to get trapped in local optima. In this paper, we propose a method called NEvoNAS, wherein the NAS problem is posed as a multi-objective problem with two objectives: (i) maximize architecture novelty, and (ii) maximize architecture fitness/accuracy. Novelty search is used to maintain a diverse set of solutions at each generation, which helps avoid local optima, while the architecture fitness is estimated using the supernet. NSGA-II is used to find the Pareto-optimal front for the NAS problem, and the best architecture on that front is returned as the searched architecture. Experimentally, NEvoNAS gives better results on two different search spaces while using significantly fewer computational resources than previous EA-based methods. The code for our paper can be found at https://github.com/nightstorm0909/NEvoNAS.
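To make the two-objective formulation concrete, the sketch below (not the authors' implementation) scores a toy population of architecture encodings by (i) novelty, computed here as the mean distance to the k nearest neighbours in an archive of previously seen encodings, and (ii) fitness, stubbed out with a random placeholder where the paper uses a weight-sharing supernet estimate. A plain non-dominated filter stands in for the NSGA-II selection used in the paper; the encoding size, the archive-based novelty measure, and all function names are illustrative assumptions.

```python
import numpy as np

def novelty(arch, archive, k=5):
    """Mean Euclidean distance from `arch` to its k nearest archive entries."""
    if len(archive) == 0:
        return float("inf")  # nothing seen yet, treat as maximally novel
    dists = np.linalg.norm(np.asarray(archive) - arch, axis=1)
    return float(np.sort(dists)[: min(k, len(dists))].mean())

def supernet_fitness(arch, rng):
    """Hypothetical stand-in for the weight-sharing supernet accuracy estimate."""
    return float(rng.random())

def pareto_front(scores):
    """Indices of non-dominated points when both objectives are maximised."""
    front = []
    for i, s in enumerate(scores):
        dominated = any(
            t[0] >= s[0] and t[1] >= s[1] and (t[0] > s[0] or t[1] > s[1])
            for j, t in enumerate(scores) if j != i
        )
        if not dominated:
            front.append(i)
    return front

rng = np.random.default_rng(0)
population = rng.integers(0, 8, size=(20, 14))  # toy encodings: 14 edges, 8 candidate ops
archive, scores = [], []
for arch in population:
    scores.append((novelty(arch, archive), supernet_fitness(arch, rng)))
    archive.append(arch)

front = pareto_front(scores)
best = max(front, key=lambda i: scores[i][1])  # highest estimated accuracy on the front
print("Pareto-optimal indices:", front)
print("Selected architecture index:", best)
```

Returning the highest-fitness architecture on the Pareto front mirrors the final selection step described in the abstract; in the actual method the fitness would come from evaluating the candidate with weights inherited from the trained supernet rather than a random placeholder.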

