Robustifying DARTS by Eliminating Information Bypass Leakage via Explicit Sparse Regularization

06/12/2023
by Jiuling Zhang, et al.

Differentiable architecture search (DARTS) is a promising end-to-end NAS method that directly optimizes the architecture parameters by ordinary gradient descent. However, DARTS is brittle: it suffers catastrophic failures caused by the skip connection in the search space. Recent studies also cast doubt on the basic hypotheses underlying DARTS, arguing that they inherently lead to a performance discrepancy between the continuously relaxed supernet in the training phase and the discretized finalnet in the evaluation phase. We show that both the robustness problem and this skepticism can be explained by information bypass leakage during the training of the supernet. This observation highlights the vital role of the sparsity of the architecture parameters during training, which has not been well developed in prior work. We therefore propose a novel sparse-regularized approximation and an efficient mixed-sparsity training scheme that robustify DARTS by eliminating the information bypass leakage. Extensive experiments on multiple search spaces demonstrate the effectiveness of our method.
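To make the mechanism concrete, below is a minimal sketch of a DARTS-style mixed edge with an explicit sparsity penalty on the architecture parameters. This is an illustration, not the paper's implementation: PyTorch is assumed, the three candidate operations are a toy set, and the entropy-style regularizer stands in for whatever sparse-regularized approximation the authors actually use.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MixedOp(nn.Module):
        """One supernet edge: a softmax-weighted mixture of candidate ops."""

        def __init__(self, channels):
            super().__init__()
            # Toy candidate set; real DARTS spaces contain more operations.
            self.ops = nn.ModuleList([
                nn.Identity(),                                # skip connection
                nn.Conv2d(channels, channels, 3, padding=1),  # 3x3 convolution
                nn.AvgPool2d(3, stride=1, padding=1),         # 3x3 average pool
            ])
            # Architecture parameters: one logit per candidate operation.
            self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))

        def forward(self, x):
            # Continuous relaxation: all candidates contribute to the output.
            # With near-uniform weights, the Identity branch lets information
            # bypass the learned operations -- the leakage the paper targets.
            w = F.softmax(self.alpha, dim=0)
            return sum(wi * op(x) for wi, op in zip(w, self.ops))

        def sparsity_penalty(self):
            # Entropy of the op distribution: driving it down pushes the
            # softmax toward a one-hot choice, so the trained supernet edge
            # behaves like the discretized finalnet edge.
            w = F.softmax(self.alpha, dim=0)
            return -(w * (w + 1e-12).log()).sum()

In training, one would add a weighted sum of these penalties to the task loss, e.g. loss = task_loss + lam * sum(op.sparsity_penalty() for op in mixed_ops); gradually increasing lam is one plausible reading of a mixed-sparsity schedule, though the paper's exact scheme may differ.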


