AceNAS: Learning to Rank Ace Neural Architectures with Weak Supervision of Weight Sharing

08/06/2021
by   Yuge Zhang, et al.

Architecture performance predictors have been widely used in neural architecture search (NAS). Although they are simple and effective, the optimization objectives used in prior work (e.g., precise accuracy estimation or a perfect ranking of all architectures in the space) do not capture the ranking nature of NAS. In addition, a large number of ground-truth architecture-accuracy pairs is usually required to build a reliable predictor, making the process computationally expensive. To overcome these limitations, in this paper we look at NAS from a novel point of view and introduce Learning to Rank (LTR) methods to select the best (ace) architectures from a search space. Specifically, we propose to use Normalized Discounted Cumulative Gain (NDCG) as the target metric and LambdaRank as the training algorithm. We also propose to leverage weak supervision from weight sharing: the architecture representation is pretrained on weak labels obtained from the super-net, and the ranking model is then finetuned on a small number of architectures trained from scratch. Extensive experiments on NAS benchmarks and large-scale search spaces demonstrate that our approach outperforms the state of the art at a significantly reduced search cost.
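The abstract names NDCG as the target metric and LambdaRank as the training algorithm. The sketch below is not the authors' code; it is a minimal PyTorch illustration of an NDCG@k metric and a LambdaRank-style pairwise loss, where each pair's logistic loss is weighted by the NDCG change a rank swap would cause. The function names, the toy data, and the use of (binned) accuracies as relevance labels are all illustrative assumptions.

```python
import torch

def ndcg_at_k(scores: torch.Tensor, labels: torch.Tensor, k: int = 10) -> float:
    """NDCG@k for one ranking task: `scores` are predicted, `labels` are
    ground-truth relevance (e.g., architecture accuracies binned to gains)."""
    k = min(k, scores.numel())
    discounts = torch.log2(torch.arange(2, k + 2, dtype=torch.float))
    # DCG of the predicted ordering.
    order = torch.argsort(scores, descending=True)[:k]
    dcg = ((2.0 ** labels[order] - 1) / discounts).sum()
    # Ideal DCG: items sorted by their true labels.
    ideal = torch.sort(labels, descending=True).values[:k]
    idcg = ((2.0 ** ideal - 1) / discounts).sum()
    return (dcg / idcg).item() if idcg > 0 else 0.0

def lambdarank_loss(scores: torch.Tensor, labels: torch.Tensor,
                    sigma: float = 1.0) -> torch.Tensor:
    """Pairwise logistic loss where each pair is weighted by the |delta NDCG|
    incurred if the two items swapped ranks (the LambdaRank weighting)."""
    n = scores.numel()
    # 1-based ranks induced by the current predicted scores (no gradient).
    order = torch.argsort(scores, descending=True)
    ranks = torch.empty(n, dtype=torch.float)
    ranks[order] = torch.arange(1, n + 1, dtype=torch.float)
    gains = 2.0 ** labels - 1
    ideal = torch.sort(labels, descending=True).values
    idcg = ((2.0 ** ideal - 1)
            / torch.log2(torch.arange(2, n + 2, dtype=torch.float))).sum()
    disc = 1.0 / torch.log2(1.0 + ranks)
    delta_ndcg = ((gains.unsqueeze(1) - gains.unsqueeze(0))
                  * (disc.unsqueeze(1) - disc.unsqueeze(0))).abs() / idcg.clamp(min=1e-10)
    # Keep only pairs (i, j) where item i is strictly more relevant than item j.
    pair_mask = (labels.unsqueeze(1) > labels.unsqueeze(0)).float()
    s_diff = scores.unsqueeze(1) - scores.unsqueeze(0)  # s_i - s_j
    pair_loss = torch.log1p(torch.exp(-sigma * s_diff))
    return (pair_mask * delta_ndcg * pair_loss).sum() / pair_mask.sum().clamp(min=1.0)

# Toy usage: 6 candidate architectures. Per the paper's recipe, labels could
# come from the super-net during pretraining, then from a few ground-truth
# (trained-from-scratch) accuracies during finetuning.
scores = torch.tensor([0.3, 0.9, 0.1, 0.7, 0.5, 0.2], requires_grad=True)
labels = torch.tensor([1.0, 3.0, 0.0, 2.0, 2.0, 1.0])
lambdarank_loss(scores, labels).backward()
print("NDCG@3:", ndcg_at_k(scores.detach(), labels, k=3))
```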

Related research

04/12/2021 · Landmark Regularization: Ranking Guided Super-Net Training in Neural Architecture Search
Weight sharing has become a de facto standard in neural architecture sea...

07/03/2022 · Architecture Augmentation for Performance Predictor Based on Graph Isomorphism
Neural Architecture Search (NAS) can automatically design architectures ...

02/21/2021 · Weak NAS Predictors Are All You Need
Neural Architecture Search (NAS) finds the best network architecture by ...

09/15/2021 · RankNAS: Efficient Neural Architecture Search by Pairwise Ranking
This paper addresses the efficiency challenge of Neural Architecture Sea...

03/18/2023 · Weight-sharing Supernet for Searching Specialized Acoustic Event Classification Networks Across Device Constraints
Acoustic Event Classification (AEC) has been widely used in devices such...

10/04/2021 · An Analysis of Super-Net Heuristics in Weight-Sharing NAS
Weight sharing promises to make neural architecture search (NAS) tractab...

07/16/2022 · CLOSE: Curriculum Learning On the Sharing Extent Towards Better One-shot NAS
One-shot Neural Architecture Search (NAS) has been widely used to discov...
