Zero-Cost Proxies for Lightweight NAS

01/20/2021
by Mohamed S. Abdelfattah, et al.

Neural Architecture Search (NAS) is quickly becoming the standard methodology for designing neural network models. However, NAS is typically compute-intensive because multiple models need to be evaluated before choosing the best one. To reduce the computational power and time needed, a proxy task is often used to evaluate each model instead of full training. In this paper, we evaluate conventional reduced-training proxies and quantify how well they preserve model rankings during search compared with the rankings produced by final trained accuracy. We propose a series of zero-cost proxies, based on recent pruning literature, that use just a single minibatch of training data to compute a model's score. Our zero-cost proxies use 3 orders of magnitude less computation but can match and even outperform conventional proxies. For example, Spearman's rank correlation coefficient between final validation accuracy and our best zero-cost proxy on NAS-Bench-201 is 0.82, compared to 0.61 for EcoNAS (a recently proposed reduced-training proxy). Finally, we use these zero-cost proxies to enhance existing NAS search algorithms such as random search, reinforcement learning, evolutionary search, and predictor-based search. For all search methodologies and across three different NAS datasets, we significantly improve sample efficiency, and thereby decrease computation, by using our zero-cost proxies. For example, on NAS-Bench-101 we achieved the same accuracy 4× faster than the best previous result. Our code is made public at: https://github.com/mohsaied/zero-cost-nas.
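
The core idea, scoring an architecture from a single minibatch and checking how well those scores rank candidates against final trained accuracy, can be illustrated with a short sketch. The grad_norm proxy below (the sum of per-parameter gradient norms after one forward/backward pass) is one of the pruning-inspired saliency metrics in this line of work; the toy candidate models, the random minibatch, and the placeholder accuracies are purely hypothetical, and this is a minimal sketch rather than the authors' released implementation.

```python
# Minimal sketch of a zero-cost proxy: score a model with the "grad_norm"
# saliency computed from a single minibatch (one forward/backward pass).
import torch
import torch.nn as nn
from scipy.stats import spearmanr

def grad_norm_score(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor) -> float:
    """Sum of per-parameter gradient L2 norms after one forward/backward pass."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(inputs), targets)
    loss.backward()
    return sum(p.grad.norm().item() for p in model.parameters() if p.grad is not None)

if __name__ == "__main__":
    # Hypothetical "architectures": tiny MLPs of different widths stand in for
    # NAS candidates; in practice these would come from a search space such as
    # NAS-Bench-201.
    candidates = [
        nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, w), nn.ReLU(), nn.Linear(w, 10))
        for w in (16, 64, 256)
    ]
    # One minibatch of (random) data is all the proxy needs.
    x, y = torch.randn(64, 3, 32, 32), torch.randint(0, 10, (64,))
    scores = [grad_norm_score(m, x, y) for m in candidates]

    # Placeholder final accuracies; Spearman's rho measures how well the
    # zero-cost scores preserve the accuracy ranking.
    trained_acc = [0.71, 0.83, 0.88]
    rho, _ = spearmanr(scores, trained_acc)
    print(f"proxy scores: {scores}")
    print(f"Spearman rho vs. final accuracy: {rho:.2f}")
```

In a search setting, scores like these can be used to warm up or re-rank candidates (e.g., seeding random or evolutionary search with the top-scoring architectures) before any training is done, which is where the reported sample-efficiency gains come from.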

Related research

Accelerating Neural Architecture Search via Proxy Data (06/09/2021)
Despite the increasing interest in neural architecture search (NAS), the...

Data Proxy Generation for Fast and Efficient Neural Architecture Search (11/21/2019)
Due to the recent advances on Neural Architecture Search (NAS), it gains...

EZNAS: Evolving Zero Cost Proxies For Neural Architecture Scoring (09/15/2022)
Neural Architecture Search (NAS) has significantly improved productivity...

NAS-Bench-Suite-Zero: Accelerating Research on Zero Cost Proxies (10/06/2022)
Zero-cost proxies (ZC proxies) are a recent architecture performance pre...

NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search (07/20/2020)
In this paper, we propose an efficient NAS algorithm for generating task...

Model Architecture Adaption for Bayesian Neural Networks (02/09/2022)
Bayesian Neural Networks (BNNs) offer a mathematically grounded framewor...

Less is More: Proxy Datasets in NAS approaches (03/14/2022)
Neural Architecture Search (NAS) defines the design of Neural Networks a...
