BaLeNAS: Differentiable Architecture Search via the Bayesian Learning Rule

11/25/2021
by   Miao Zhang, et al.

Differentiable Architecture Search (DARTS) has received massive attention in recent years, mainly because it significantly reduces the computational cost through weight sharing and continuous relaxation. However, recent works find that existing differentiable NAS techniques struggle to outperform naive baselines, yielding architectures that deteriorate as the search proceeds. Rather than directly optimizing the architecture parameters, this paper formulates neural architecture search as a distribution learning problem by relaxing the architecture weights into Gaussian distributions. By leveraging natural-gradient variational inference (NGVI), the architecture distribution can be easily optimized on top of existing codebases without incurring additional memory or computational cost. We demonstrate how differentiable NAS benefits from Bayesian principles, which enhance exploration and improve stability. Experimental results on the NAS-Bench-201 and NAS-Bench-1shot1 benchmark datasets confirm the significant improvements the proposed framework can make. In addition, instead of simply applying argmax to the learned parameters, we further leverage recently proposed training-free proxies in NAS to select the optimal architecture from a group of architectures drawn from the optimized distribution, achieving state-of-the-art results on the NAS-Bench-201 and NAS-Bench-1shot1 benchmarks. Our best architecture in the DARTS search space also obtains competitive test errors of 2.37%, 15.72%, and 24.2% on CIFAR-10, CIFAR-100, and ImageNet, respectively.
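To make the core idea concrete, below is a minimal, hypothetical sketch (not the authors' code) of searching over a Gaussian distribution on the architecture parameters and updating it with a natural-gradient variational inference step in the style of Vadam/the Bayesian learning rule, followed by proxy-based selection instead of argmax. The toy "supernet" loss, the hyperparameters, and the `proxy_score` placeholder are illustrative assumptions; the exact BaLeNAS update, schedule, and proxies follow the paper.

```python
# Sketch only: diagonal Gaussian q(alpha) = N(mu, sigma^2) over DARTS architecture
# parameters, updated with an NGVI (Vadam-style) rule. All specifics are assumptions.
import torch

torch.manual_seed(0)

num_edges, num_ops = 6, 5      # toy cell: 6 edges, 5 candidate operations per edge
N = 1000                       # assumed effective dataset size (scales the prior)
lam = 1e-3                     # prior precision (weight-decay-like term)
lr, beta2 = 0.1, 0.9           # step size and second-moment decay (assumed values)

mu = torch.zeros(num_edges, num_ops, requires_grad=True)  # Gaussian mean of alpha
s = torch.ones(num_edges, num_ops)                        # second-moment / precision proxy

def toy_supernet_loss(alpha):
    # Stand-in for the bi-level DARTS objective: mixes random "op outputs"
    # with softmax(alpha) and returns a scalar loss.
    op_outputs = torch.randn(num_edges, num_ops)
    return ((torch.softmax(alpha, dim=-1) * op_outputs).sum()) ** 2

for step in range(100):
    # Reparameterized sample: alpha ~ N(mu, 1 / (N * (s + lam)))
    sigma = 1.0 / torch.sqrt(N * (s + lam))
    alpha = mu + sigma * torch.randn_like(mu)
    loss = toy_supernet_loss(alpha)
    loss.backward()

    with torch.no_grad():
        g = mu.grad
        # NGVI-style updates of the Gaussian's natural parameters
        # (Vadam-like; the exact BaLeNAS rule may differ):
        s.mul_(beta2).add_((1 - beta2) * g * g)
        mu.add_(-lr * (g + lam * mu / N) / (torch.sqrt(s) + lam / N))
        mu.grad = None

def proxy_score(alpha):
    # Placeholder for a training-free proxy (e.g., a zero-cost NAS score);
    # a real implementation would score the discretized architecture.
    return torch.randn(()).item()

# Instead of argmax(mu), draw several architectures from q(alpha), keep the one
# preferred by the training-free proxy, then discretize (one op per edge).
sigma = 1.0 / torch.sqrt(N * (s + lam))
candidates = [mu + sigma * torch.randn_like(mu) for _ in range(8)]
best = max(candidates, key=proxy_score)
print(best.argmax(dim=-1))
```

The sampling step is what adds exploration relative to plain DARTS, and the variance shrinks as the precision proxy grows, so the search gradually concentrates on promising operations.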


Related research:

- Differentiable Architecture Search Without Training Nor Labels: A Pruning Perspective (06/22/2021)
- D-DARTS: Distributed Differentiable Architecture Search (08/20/2021)
- DrNAS: Dirichlet Neural Architecture Search (06/18/2020)
- Improving Differentiable Architecture Search with a Generative Model (11/30/2021)
- Differentiable Architecture Search with Random Features (08/18/2022)
- On Constrained Optimization in Differentiable Neural Architecture Search (06/22/2021)
- α DARTS Once More: Enhancing Differentiable Architecture Search by Masked Image Modeling (11/18/2022)
