β-DARTS++: Bi-level Regularization for Proxy-robust Differentiable Architecture Search

01/16/2023
by Peng Ye, et al.

Neural Architecture Search (NAS) has attracted increasing attention in recent years. Among existing approaches, differentiable NAS methods such as DARTS have gained popularity for their search efficiency. However, they still suffer from three main issues: weak stability due to performance collapse, poor generalization of the searched architectures, and inferior robustness to different kinds of proxies. To address the stability and generalization problems, a simple-but-effective regularization method, termed Beta-Decay, is proposed to regularize the DARTS-based search process (i.e., β-DARTS). Specifically, Beta-Decay regularization imposes constraints that keep the value and variance of the activated architecture parameters from growing too large, thereby ensuring fair competition among architecture parameters and making the supernet less sensitive to the impact of the input on the operation set. In-depth theoretical analyses of how and why it works are provided. Comprehensive experiments validate that Beta-Decay regularization helps stabilize the search process and makes the searched networks more transferable across different datasets. To address the robustness problem, we first benchmark different NAS methods under a wide range of proxy data, proxy channels, proxy layers, and proxy epochs, since the robustness of NAS under different kinds of proxies has not been explored before. From this benchmark we draw several findings, among them that β-DARTS achieves the best result among all compared NAS methods under almost all proxies. We further introduce flooding regularization into the weight optimization of β-DARTS (i.e., bi-level regularization), and verify both experimentally and theoretically its effectiveness for improving the proxy robustness of differentiable NAS.
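The two regularizers admit a compact sketch. The following minimal, first-order PyTorch sketch is not the authors' released implementation: beta_decay_loss assumes the smooth-max (logsumexp) form of Beta-Decay over each edge's operation logits, flooding implements the |L - b| + b trick of Ishida et al. (2020), and names such as search_step, lambda_beta, and flood_level are illustrative assumptions.

import torch
import torch.nn.functional as F


def beta_decay_loss(alpha):
    # Beta-Decay term, assuming the smooth-max (logsumexp) instantiation:
    # shrinking logsumexp(alpha) keeps beta = softmax(alpha) small in value
    # and variance, so no operation dominates by logit magnitude alone.
    # alpha: (num_edges, num_ops) architecture parameters.
    return torch.logsumexp(alpha, dim=-1).mean()


def flooding(loss, b):
    # Flooding (Ishida et al., 2020): |L - b| + b. Below the flood level b
    # the gradient flips sign, so the training loss oscillates around b
    # instead of being driven toward zero on a small proxy setting.
    return (loss - b).abs() + b


def search_step(model, alpha, w_opt, a_opt, train_batch, val_batch,
                lambda_beta=0.5, flood_level=0.05):
    # One alternating (first-order) DARTS step with bi-level regularization.
    # Upper level: update architecture parameters on validation data,
    # with the Beta-Decay penalty added to the validation loss.
    x_val, y_val = val_batch
    a_opt.zero_grad()
    val_loss = F.cross_entropy(model(x_val), y_val)
    (val_loss + lambda_beta * beta_decay_loss(alpha)).backward()
    a_opt.step()

    # Lower level: update supernet weights on training data,
    # with flooding applied to the weight-training loss.
    x_tr, y_tr = train_batch
    w_opt.zero_grad()
    train_loss = F.cross_entropy(model(x_tr), y_tr)
    flooding(train_loss, flood_level).backward()
    w_opt.step()

Here model(x) is assumed to be a DARTS-style supernet whose mixed operations read alpha; w_opt and a_opt optimize the supernet weights and the architecture parameters, respectively.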

