Zero-Cost Proxies for Lightweight NAS

by Mohamed S. Abdelfattah, et al.

Neural Architecture Search (NAS) is quickly becoming the standard methodology for designing neural network models. However, NAS is typically compute-intensive because multiple models need to be evaluated before choosing the best one. To reduce the computational power and time needed, a proxy task is often used for evaluating each model instead of full training. In this paper, we evaluate conventional reduced-training proxies and quantify how well they preserve ranking between multiple models during search when compared with the rankings produced by final trained accuracy. We propose a series of zero-cost proxies, based on recent pruning literature, that use just a single minibatch of training data to compute a model's score. Our zero-cost proxies use three orders of magnitude less computation but can match and even outperform conventional proxies. For example, Spearman's rank correlation coefficient between final validation accuracy and our best zero-cost proxy on NAS-Bench-201 is 0.82, compared to 0.61 for EcoNAS (a recently proposed reduced-training proxy). Finally, we use these zero-cost proxies to enhance existing NAS search algorithms such as random search, reinforcement learning, evolutionary search, and predictor-based search. For all search methodologies and across three different NAS datasets, we are able to significantly improve sample efficiency, and thereby decrease computation, by using our zero-cost proxies. For example, on NAS-Bench-101, we achieved the same accuracy 4× quicker than the best previous result. Our code is made public at:
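The ranking comparison at the heart of the abstract is Spearman's rank correlation between per-model proxy scores and final trained accuracy. As a minimal illustration (not the paper's code; the proxy scores and accuracies below are hypothetical), the no-ties closed form of Spearman's rho can be computed as:

```python
def ranks(values):
    # Assign rank 1..n by ascending sorted order (assumes no tied values).
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    # Spearman's rho, no-ties closed form: 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)).
    n = len(xs)
    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical zero-cost proxy scores and final validation accuracies
# for four candidate architectures.
proxy_scores = [3.1, 2.4, 5.0, 1.2]
final_accs   = [0.71, 0.68, 0.74, 0.70]
print(spearman(proxy_scores, final_accs))  # rho = 1 - 6*2/60 = 0.8
```

A rho near 1 means the proxy orders architectures almost the same way final training accuracy does, which is what makes a one-minibatch score usable in place of full training during search.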

