Multi-Predict: Few Shot Predictors For Efficient Neural Architecture Search

06/04/2023
by Yash Akhauri et al.

Many hardware-aware neural architecture search (NAS) methods have been developed to optimize the topology of neural networks (NN) with the joint objectives of higher accuracy and lower latency. Recently, both accuracy and latency predictors have been used in NAS with great success, achieving high sample efficiency and accurate modeling of hardware (HW) device latency respectively. However, a new accuracy predictor needs to be trained for every new NAS search space or NN task, and a new latency predictor needs to be additionally trained for every new HW device. In this paper, we explore methods to enable multi-task, multi-search-space, and multi-HW adaptation of accuracy and latency predictors to reduce the cost of NAS. We introduce a novel search-space independent NN encoding based on zero-cost proxies that achieves sample-efficient prediction on multiple tasks and NAS search spaces, improving the end-to-end sample efficiency of latency and accuracy predictors by over an order of magnitude in multiple scenarios. For example, our NN encoding enables multi-search-space transfer of latency predictors from NASBench-201 to FBNet (and vice-versa) in under 85 HW measurements, a 400× improvement in sample efficiency compared to a recent meta-learning approach. Our method also improves the total sample efficiency of accuracy predictors by over an order of magnitude. Finally, we demonstrate the effectiveness of our method for multi-search-space and multi-task accuracy prediction on 28 NAS search spaces and tasks.
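The core idea is that zero-cost proxy scores form a fixed-length, search-space independent encoding of an architecture, so a single predictor can be pretrained on one search space and adapted to another (or to a new HW device) with only a handful of measurements. Below is a minimal sketch of that transfer setup, assuming a small PyTorch MLP regressor and an illustrative set of proxies; the specific proxies, network sizes, and two-stage training loop are assumptions for illustration, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

# Illustrative set of zero-cost proxies used as a search-space independent
# encoding. The proxy choice and predictor architecture are assumptions, not
# the configuration used in the paper.
ZC_PROXIES = ["synflow", "snip", "grad_norm", "jacov", "fisher", "params", "flops"]

class ZCEncodingPredictor(nn.Module):
    """MLP mapping a vector of zero-cost proxy scores to a scalar target
    (accuracy or latency). Because the input is a fixed-length proxy vector
    rather than a search-space specific topology encoding, the same predictor
    can be reused across NAS search spaces."""
    def __init__(self, n_proxies: int = len(ZC_PROXIES), hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_proxies, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, zc_scores: torch.Tensor) -> torch.Tensor:
        return self.net(zc_scores).squeeze(-1)

def fit(model, x, y, epochs=200, lr=1e-3):
    """Plain MSE regression, reused both for source-space pretraining and
    for few-shot adaptation to a new search space or HW device."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy stand-ins for proxy encodings and latency labels on two search spaces.
    x_src, y_src = torch.randn(1000, len(ZC_PROXIES)), torch.randn(1000)
    x_tgt, y_tgt = torch.randn(85, len(ZC_PROXIES)), torch.randn(85)

    predictor = ZCEncodingPredictor()
    fit(predictor, x_src, y_src)                      # pretrain on the source search space
    fit(predictor, x_tgt, y_tgt, epochs=50, lr=1e-4)  # few-shot adaptation (~85 target samples)
```

The design point this sketch illustrates is that transfer cost is dominated by the number of target-space measurements, not by retraining an encoding from scratch, which is what enables adaptation with a few dozen HW measurements rather than thousands.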


Related research

Neural Architecture Search with GBDT (07/09/2020)
BRP-NAS: Prediction-based NAS using GCNs (07/16/2020)
Generalized Latency Performance Estimation for Once-For-All Neural Architecture Search (01/04/2021)
Inference Latency Prediction at the Edge (10/06/2022)
An Evaluation of Zero-Cost Proxies – from Neural Architecture Performance to Model Robustness (07/18/2023)
HELP: Hardware-Adaptive Efficient Latency Predictor for NAS via Meta-Learning (06/16/2021)
AIO-P: Expanding Neural Performance Predictors Beyond Image Classification (11/30/2022)
