EPE-NAS: Efficient Performance Estimation Without Training for Neural Architecture Search

02/16/2021
by   Vasco Lopes, et al.

Neural Architecture Search (NAS) has shown excellent results in designing architectures for computer vision problems. NAS alleviates the need for human-defined settings by automating architecture design and engineering. However, NAS methods tend to be slow because they require large amounts of GPU computation. This bottleneck stems mainly from the performance estimation strategy, which evaluates the generated architectures, typically by training them, in order to update the sampling method. In this paper, we propose EPE-NAS, an efficient performance estimation strategy that mitigates the cost of evaluating networks by scoring untrained networks in a way that correlates with their trained performance. We do this by examining the intra- and inter-class correlations of an untrained network. We show that EPE-NAS produces a robust correlation and that, when combined with a simple random sampling strategy, it can search for competitive networks without any training, in a matter of seconds on a single GPU. Moreover, EPE-NAS is agnostic to the search method, since it focuses on evaluating untrained networks, making it easy to integrate into almost any NAS method.
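The idea of scoring an untrained network through class-wise correlation structure can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's exact method: it takes a matrix of per-input gradients (one flattened Jacobian row per example, here simulated with random data) and rewards coherent correlation structure within each class. The function name `epe_nas_score` and the log-magnitude aggregation are illustrative choices.

```python
import numpy as np

def epe_nas_score(jacobian, labels, eps=1e-5):
    """Hedged sketch of a training-free score from per-input Jacobians.

    jacobian: (N, D) array, one flattened input-gradient row per example.
    labels:   (N,) integer class labels.
    Returns a scalar; the intent is that higher values correlate with
    the network's eventual trained accuracy.
    """
    score = 0.0
    for c in np.unique(labels):
        J_c = jacobian[labels == c]  # gradients for one class only
        if len(J_c) < 2:
            continue
        # Pairwise correlation of input gradients within the class:
        # similar gradients suggest the network treats the class coherently.
        corr = np.corrcoef(J_c)
        # Sum of log-magnitudes aggregates the correlation structure
        # into a single number (eps avoids log(0)).
        score += float(np.sum(np.log(np.abs(corr) + eps)))
    return score

# Toy usage: random "Jacobians" stand in for a real untrained network's
# gradients; in practice these would come from backpropagating each input.
rng = np.random.default_rng(0)
J = rng.standard_normal((8, 16))
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(epe_nas_score(J, y))
```

In a real pipeline, a score like this would be computed once per candidate architecture on a single mini-batch, replacing the expensive train-then-evaluate loop that the paper identifies as the main bottleneck.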


Related research

- 12/11/2020 · Differential Evolution for Neural Architecture Search
- 06/08/2020 · Neural Architecture Search without Training
- 02/21/2020 · DSNAS: Direct Neural Architecture Search without Parameter Retraining
- 09/23/2022 · NasHD: Efficient ViT Architecture Performance Ranking using Hyperdimensional Computing
- 09/14/2020 · RelativeNAS: Relative Neural Architecture Search via Slow-Fast Learning
- 12/17/2019 · Generative Teaching Networks: Accelerating Neural Architecture Search by Learning to Generate Synthetic Training Data
- 07/21/2022 · Efficient Search of Multiple Neural Architectures with Different Complexities via Importance Sampling
