SNAS: Stochastic Neural Architecture Search

12/24/2018
by Sirui Xie, et al.

We propose Stochastic Neural Architecture Search (SNAS), an economical end-to-end solution to Neural Architecture Search (NAS) that trains neural operation parameters and architecture distribution parameters in the same round of back-propagation, while maintaining the completeness and differentiability of the NAS pipeline. In this work, NAS is reformulated as an optimization problem on the parameters of a joint distribution over the search space in a cell. To leverage the gradient information in a generic differentiable loss for architecture search, a novel search gradient is proposed. We prove that this search gradient optimizes the same objective as reinforcement-learning-based NAS, but assigns credit to structural decisions more efficiently. This credit assignment is further augmented with a locally decomposable reward to enforce a resource-efficiency constraint. In experiments on CIFAR-10, SNAS takes fewer epochs than non-differentiable evolution-based and reinforcement-learning-based NAS to find a cell architecture with state-of-the-art accuracy, and the discovered cell also transfers to ImageNet. We further show that child networks of SNAS maintain their validation accuracy from the search phase, whereas attention-based NAS requires parameter retraining to compete, suggesting a path toward efficient NAS on large datasets.
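The search gradient described above depends on the architecture distribution being sampled in a differentiable way. A minimal sketch of one common mechanism for this, a Gumbel-softmax (concrete) relaxation of a per-edge categorical operation choice, is shown below. This is an illustration under stated assumptions, not the authors' implementation; the operation names and parameter values are hypothetical.

```python
import numpy as np

def gumbel_softmax(logits, temperature=1.0, rng=None):
    """Sample a relaxed one-hot vector over candidate operations.

    Gumbel noise makes the sample stochastic, while the softmax keeps
    the result differentiable w.r.t. the logits (the architecture
    distribution parameters), so operation parameters and architecture
    parameters can share one round of back-propagation.
    """
    if rng is None:
        rng = np.random.default_rng()
    # Standard Gumbel(0, 1) noise; small epsilons guard against log(0).
    u = rng.uniform(size=logits.shape)
    gumbel = -np.log(-np.log(u + 1e-20) + 1e-20)
    y = (logits + gumbel) / temperature
    e = np.exp(y - y.max())          # numerically stable softmax
    return e / e.sum()

# One edge in a cell: logits over 4 candidate operations
# (e.g. identity, 3x3 conv, 5x5 conv, max-pool) -- names illustrative.
logits = np.array([0.5, 2.0, 0.1, -1.0])
weights = gumbel_softmax(logits, temperature=0.5)
print(weights)  # non-negative weights over operations, summing to 1
```

As the temperature is annealed toward zero, the relaxed samples approach discrete one-hot operation choices, which is how a continuous search can recover a discrete child architecture.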


