Understanding and Improving One-shot Neural Architecture Optimization

09/24/2019
by Renqian Luo, et al.

The ability to accurately rank candidate architectures is key to the performance of neural architecture search (NAS). One-shot NAS was proposed to cut this expense, but it performs worse than conventional NAS and is not adequately stable. We find that the ranking correlation between architectures under one-shot training and the same architectures under stand-alone training is poor, which misleads the search away from better architectures. We conjecture that this is due to the gap between one-shot training and stand-alone complete training. In this work, we empirically investigate the main factors that cause this gap and the resulting weak ranking correlation. We then propose NAO-V2 to alleviate the gap by: (1) increasing the average number of updates each individual architecture receives to an adequate level; (2) encouraging more updates for large, complex architectures than for small, simple ones by sampling architectures in proportion to their model sizes; and (3) making the one-shot training of the supernet independent at each iteration. Comprehensive experiments verify that the proposed method is effective and robust: the search is more stable, and all of the top discovered architectures perform well compared to baseline methods. The final discovered architecture improves significantly over baselines, achieving a test error rate of 2.60% under the mobile setting. Code and model checkpoints are publicly available at https://github.com/renqianluo/NAO_pytorch.
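To make the diagnosis and idea (2) concrete, here is a minimal Python sketch. It is an illustration rather than the authors' implementation: param_counts, train_step, and the accuracy lists are hypothetical stand-ins, and Kendall's tau (via scipy.stats.kendalltau) is used here as a standard rank-correlation measure for comparing the two training regimes.

    import random
    from scipy.stats import kendalltau

    # Diagnosis: rank correlation between the accuracies the same
    # candidate architectures reach under one-shot training and under
    # stand-alone training (both lists are hypothetical inputs).
    def ranking_correlation(one_shot_accs, stand_alone_accs):
        tau, _ = kendalltau(one_shot_accs, stand_alone_accs)
        return tau  # near 1.0 means the one-shot ranking is trustworthy

    # Idea (2): draw sub-architectures with probability proportional to
    # model size, so large, complex sub-models receive more updates.
    def sample_arch_by_size(candidate_archs, param_counts):
        total = sum(param_counts)
        weights = [count / total for count in param_counts]
        return random.choices(candidate_archs, weights=weights, k=1)[0]

    # Supernet training loop sketch: each step trains one sub-architecture
    # drawn with size-proportional probability (train_step is a
    # hypothetical forward/backward update on that sub-model).
    def train_supernet(supernet, candidate_archs, param_counts, loader, train_step):
        for batch in loader:
            arch = sample_arch_by_size(candidate_archs, param_counts)
            train_step(supernet, arch, batch)

Under uniform sampling, large sub-models effectively receive fewer updates per parameter than small ones, so their one-shot accuracies understate their stand-alone quality; size-proportional sampling is one way to rebalance this.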


Related research:

06/27/2022 · Prior-Guided One-shot Neural Architecture Search
Neural architecture search methods seek optimal candidates with efficien...

08/07/2020 · A Surgery of the Neural Architecture Evaluators
Neural architecture search (NAS) recently received extensive attention d...

06/13/2022 · Improve Ranking Correlation of Super-net through Training Scheme from One-shot NAS to Few-shot NAS
The algorithms of one-shot neural architecture search (NAS) have been wid...

07/16/2022 · CLOSE: Curriculum Learning On the Sharing Extent Towards Better One-shot NAS
One-shot Neural Architecture Search (NAS) has been widely used to discov...

01/24/2023 · RD-NAS: Enhancing One-shot Supernet Ranking Ability via Ranking Distillation from Zero-cost Proxies
Neural architecture search (NAS) has made tremendous progress in the aut...

08/22/2021 · Pi-NAS: Improving Neural Architecture Search by Reducing Supernet Training Consistency Shift
Recently proposed neural architecture search (NAS) methods co-train bill...

07/21/2022 · Efficient Search of Multiple Neural Architectures with Different Complexities via Importance Sampling
Neural architecture search (NAS) aims to automate architecture design pr...
