β-DARTS: Beta-Decay Regularization for Differentiable Architecture Search

03/03/2022
by Peng Ye, et al.

Neural Architecture Search (NAS) has attracted increasing attention in recent years because of its capability to design deep neural networks automatically. Among NAS methods, differentiable approaches such as DARTS have gained popularity for their search efficiency. However, they suffer from two main issues: weak robustness to performance collapse and poor generalization ability of the searched architectures. To address these problems, a simple but efficient regularization method, termed Beta-Decay, is proposed to regularize the DARTS-based NAS searching process. Specifically, Beta-Decay regularization imposes constraints that keep the value and variance of the activated architecture parameters from becoming too large. Furthermore, we provide an in-depth theoretical analysis of how and why it works. Experimental results on NAS-Bench-201 show that the proposed method helps stabilize the searching process and makes the searched network more transferable across different datasets. In addition, our search scheme shows an outstanding property of being less dependent on training time and data. Comprehensive experiments on a variety of search spaces and datasets validate the effectiveness of the proposed method.
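
As a rough illustration of the idea, the sketch below (PyTorch-style, not taken from the paper) regularizes the softmax-activated architecture parameters so that no single activated weight, and hence their variance, can grow too large. The function name beta_decay_penalty and the quadratic penalty are illustrative assumptions rather than the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def beta_decay_penalty(alpha: torch.Tensor, weight: float = 1.0) -> torch.Tensor:
        # alpha: raw (unactivated) architecture parameters of a DARTS supernet,
        # shape (num_edges, num_ops). The penalty acts on the softmax-activated
        # parameters beta = softmax(alpha): a uniform beta minimizes it, while a
        # peaked beta (large maximum value and large variance) is penalized.
        beta = F.softmax(alpha, dim=-1)
        return weight * (beta ** 2).sum(dim=-1).mean()

    # Sketch of use in the architecture-parameter update step (names illustrative):
    #   arch_loss = val_loss + beta_decay_penalty(alpha, weight=reg_strength)
    #   arch_loss.backward()
    #   arch_optimizer.step()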

Related research:

01/16/2023 - β-DARTS++: Bi-level Regularization for Proxy-robust Differentiable Architecture Search
Neural Architecture Search has attracted increasing attention in recent ...

06/12/2021 - Zero-Cost Proxies Meet Differentiable Architecture Search
Differentiable neural architecture search (NAS) has attracted significan...

09/05/2021 - NAS-OoD: Neural Architecture Search for Out-of-Distribution Generalization
Recent advances on Out-of-Distribution (OoD) generalization reveal the r...

09/02/2021 - NASI: Label- and Data-agnostic Neural Architecture Search at Initialization
Recent years have witnessed a surging interest in Neural Architecture Se...

10/14/2022 - Λ-DARTS: Mitigating Performance Collapse by Harmonizing Operation Selection among Cells
Differentiable neural architecture search (DARTS) is a popular method fo...

05/06/2019 - Differentiable Architecture Search with Ensemble Gumbel-Softmax
For network architecture search (NAS), it is crucial but challenging to ...

05/21/2021 - A General Method For Automatic Discovery of Powerful Interactions In Click-Through Rate Prediction
Modeling powerful interactions is a critical challenge in Click-through ...
