DrNAS: Dirichlet Neural Architecture Search

06/18/2020
by Xiangning Chen, et al.

This paper proposes a novel differentiable architecture search method that formulates the search as a distribution learning problem. We treat the continuously relaxed architecture mixing weights as random variables modeled by a Dirichlet distribution. With recently developed pathwise derivatives, the Dirichlet parameters can be optimized with a gradient-based optimizer in an end-to-end manner. This formulation improves generalization and induces stochasticity that naturally encourages exploration of the search space. Furthermore, to alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme that enables searching directly on large-scale tasks, eliminating the gap between the search and evaluation phases. Extensive experiments demonstrate the effectiveness of our method. Specifically, we obtain a test error of 2.46% on CIFAR-10 and 23.7% on ImageNet under the mobile setting. On NAS-Bench-201, we also achieve state-of-the-art results on all three datasets and provide insights for the effective design of neural architecture search algorithms.
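To make the Dirichlet formulation concrete, the sketch below learns the concentration parameters of a Dirichlet distribution over candidate operations on a single edge, using PyTorch's reparameterized (pathwise) sampler so gradients flow back into the concentration parameters. This is a minimal illustrative sketch, not the authors' implementation: the operation set, loss, tensor shapes, and training loop are all assumptions made for demonstration.

```python
import torch
from torch.distributions import Dirichlet

torch.manual_seed(0)

# Candidate operations on a single edge (illustrative choices,
# not the paper's exact search space).
ops = torch.nn.ModuleList([
    torch.nn.Identity(),
    torch.nn.Conv2d(8, 8, kernel_size=3, padding=1),
    torch.nn.AvgPool2d(kernel_size=3, stride=1, padding=1),
])

# Unconstrained parameters; softplus keeps the concentration positive.
beta = torch.zeros(len(ops), requires_grad=True)
opt = torch.optim.Adam([beta] + list(ops.parameters()), lr=1e-3)

x = torch.randn(2, 8, 16, 16)       # dummy input batch
target = torch.randn(2, 8, 16, 16)  # dummy regression target

for step in range(100):
    alpha = torch.nn.functional.softplus(beta) + 1e-4
    # rsample() draws mixing weights with pathwise derivatives, so the
    # loss gradient propagates into the Dirichlet concentration alpha.
    w = Dirichlet(alpha).rsample()
    out = sum(wi * op(x) for wi, op in zip(w, ops))
    loss = torch.nn.functional.mse_loss(out, target)
    opt.zero_grad()
    loss.backward()
    opt.step()

# A discrete architecture would keep the operation with the largest
# expected mixing weight, alpha_i / alpha.sum().
```

Because the mixing weights are sampled rather than computed by a deterministic softmax as in DARTS, every forward pass explores a slightly different weighting of the candidate operations, which is the source of the exploration behavior the abstract describes.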


Related research

11/25/2021 · BaLeNAS: Differentiable Architecture Search via the Bayesian Learning Rule
Differentiable Architecture Search (DARTS) has received massive attentio...

10/26/2022 · PredNAS: A Universal and Sample Efficient Neural Architecture Search Framework
In this paper, we present a general and effective framework for Neural A...

06/12/2023 · Small Temperature is All You Need for Differentiable Architecture Search
Differentiable architecture search (DARTS) yields highly efficient gradi...

06/12/2023 · Robustifying DARTS by Eliminating Information Bypass Leakage via Explicit Sparse Regularization
Differentiable architecture search (DARTS) is a promising end to end NAS...

05/18/2019 · Multinomial Distribution Learning for Effective Neural Architecture Search
Architectures obtained by Neural Architecture Search (NAS) have achieved...

11/23/2020 · ROME: Robustifying Memory-Efficient NAS via Topology Disentanglement and Gradients Accumulation
Single-path based differentiable neural architecture search has great st...

07/12/2019 · Deep Model Compression via Filter Auto-sampling
The recent WSNet [1] is a new model compression method through sampling ...
