Weak NAS Predictors Are All You Need

02/21/2021
by Junru Wu, et al.

Neural Architecture Search (NAS) finds the best network architecture by exploring the architecture-to-performance manifold. It often trains and evaluates a large number of architectures, incurring tremendous computation costs. Recent predictor-based NAS approaches attempt to address this problem with two key steps: sampling some architecture-performance pairs and fitting a proxy accuracy predictor. Given limited samples, however, these predictors are far from accurate enough to locate top architectures. In this paper, we shift the paradigm from finding a single complicated predictor that covers the whole architecture space to a set of weaker predictors that progressively move toward the high-performance sub-space. This relies on a key property of the proposed weak predictors: their probability of sampling better architectures keeps increasing. We thus only sample a few well-performing architectures guided by the previously learned predictor and estimate a new, better weak predictor. Through this coarse-to-fine iteration, the ranking of the sampling space is refined gradually, which eventually helps find the optimal architectures. Experiments demonstrate that our method requires fewer samples to find top-performance architectures on NAS-Bench-101 and NAS-Bench-201, and it achieves state-of-the-art ImageNet performance in the NASNet search space. The code is available at https://github.com/VITA-Group/WeakNAS
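The sketch below illustrates the coarse-to-fine loop described in the abstract; it is not the authors' implementation (see the linked WeakNAS repository for that). It assumes `search_space` is a list of fixed-length numeric architecture encodings, `evaluate(arch)` is a hypothetical oracle returning validation accuracy, and a gradient-boosted tree is used as one possible choice of weak predictor; the iteration counts and sub-space size are illustrative.

```python
# Minimal sketch of an iterative weak-predictor search, under the assumptions above.
import random
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor


def weak_predictor_search(search_space, evaluate,
                          iterations=5, samples_per_iter=20, top_m=200):
    """Each round fits a weak predictor on the architectures measured so far,
    then draws the next batch only from the predictor's top-ranked sub-space,
    so sampling progressively concentrates on the high-performance region."""
    evaluated = {}  # architecture index -> measured accuracy
    indices = list(range(len(search_space)))

    # Round 0: uniform random seed samples.
    for i in random.sample(indices, samples_per_iter):
        evaluated[i] = evaluate(search_space[i])

    for _ in range(iterations):
        seen = list(evaluated)
        X = np.array([search_space[i] for i in seen])
        y = np.array([evaluated[i] for i in seen])
        predictor = GradientBoostingRegressor().fit(X, y)  # weak proxy predictor

        # Rank unevaluated architectures and restrict sampling to the predicted
        # top-m sub-space (the coarse-to-fine refinement of the ranking).
        unseen = [i for i in indices if i not in evaluated]
        scores = predictor.predict(np.array([search_space[i] for i in unseen]))
        ranked = [i for _, i in sorted(zip(scores, unseen), reverse=True)]
        pool = ranked[:top_m]
        for i in random.sample(pool, min(samples_per_iter, len(pool))):
            evaluated[i] = evaluate(search_space[i])

    best = max(evaluated, key=evaluated.get)
    return search_space[best], evaluated[best]
```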


Related research

07/09/2020 · Neural Architecture Search with GBDT
Neural architecture search (NAS) with an accuracy predictor that predict...

01/30/2022 · Neural Architecture Ranker
Architecture ranking has recently been advocated to design an efficient ...

03/28/2020 · NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search
Neural architecture search (NAS) is a promising method for automatically...

08/06/2021 · AceNAS: Learning to Rank Ace Neural Architectures with Weak Supervision of Weight Sharing
Architecture performance predictors have been widely used in neural arch...

07/03/2022 · Architecture Augmentation for Performance Predictor Based on Graph Isomorphism
Neural Architecture Search (NAS) can automatically design architectures ...

10/02/2022 · Siamese-NAS: Using Trained Samples Efficiently to Find Lightweight Neural Architecture by Prior Knowledge
In the past decade, many architectures of convolution neural networks we...

11/30/2022 · GENNAPE: Towards Generalized Neural Architecture Performance Estimators
Predicting neural architecture performance is a challenging task and is ...
