An Analysis of Super-Net Heuristics in Weight-Sharing NAS

10/04/2021
by Kaicheng Yu, et al.

Weight sharing promises to make neural architecture search (NAS) tractable even on commodity hardware. Existing methods in this space rely on a diverse set of heuristics to design and train the shared-weight backbone network, a.k.a. the super-net. Since these heuristics vary substantially across methods and have not been carefully studied, it is unclear to what extent they affect super-net training and, in turn, the weight-sharing NAS algorithms built on it. In this paper, we disentangle super-net training from the search algorithm, isolate 14 frequently used training heuristics, and evaluate them over three benchmark search spaces. Our analysis uncovers that several commonly used heuristics degrade the correlation between super-net and stand-alone performance, whereas simple but often overlooked factors, such as proper hyper-parameter settings, are key to achieving strong performance. Equipped with this knowledge, we show that, when the super-net is properly trained, simple random search achieves performance competitive with complex state-of-the-art NAS algorithms.
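
Because the paper's central diagnostic is the rank correlation between super-net (shared-weight) accuracy and stand-alone accuracy, and its closing claim is that plain random search suffices over a well-trained super-net, a short sketch may make both ideas concrete. This is a minimal illustration under assumptions, not the paper's code: supernet_accuracy and standalone_accuracy are hypothetical callables standing in for evaluating an architecture with inherited shared weights and with full stand-alone training, respectively.

import random
from scipy.stats import kendalltau

def random_search(search_space, supernet_accuracy, n_samples=100):
    # Sample architectures uniformly and keep the one that scores highest
    # under the shared super-net weights (a cheap proxy for true accuracy).
    # supernet_accuracy is a hypothetical user-supplied callable.
    candidates = random.sample(search_space, n_samples)
    return max(candidates, key=supernet_accuracy)

def supernet_rank_correlation(archs, supernet_accuracy, standalone_accuracy):
    # Kendall tau between the proxy (super-net) and ground-truth
    # (stand-alone) accuracies; a value near +1 means the super-net ranks
    # architectures the same way stand-alone training would.
    proxy = [supernet_accuracy(a) for a in archs]
    truth = [standalone_accuracy(a) for a in archs]
    tau, _ = kendalltau(proxy, truth)
    return tau

The higher this correlation, the more trustworthy the cheap proxy ranking is; the paper's analysis evaluates each training heuristic by how it affects exactly this kind of correlation, which is also why uniform random sampling, as above, is enough once the super-net is trained well.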

