DARTS-: Robustly Stepping out of Performance Collapse Without Indicators

09/02/2020
by Xiangxiang Chu, et al.

Despite the rapid development of differentiable architecture search (DARTS), it suffers from a long-standing instability in its searching performance, which severely limits its application. Existing robustifying methods draw clues from the outcome instead of identifying the causing factor. Various indicators, such as Hessian eigenvalues, have been proposed as signals of performance collapse, and the search is stopped once an indicator reaches a preset threshold. However, these methods tend to reject good architectures when the thresholds are set inappropriately, let alone that the search itself is intrinsically noisy. In this paper, we take a more subtle and direct approach to resolving the collapse. We first demonstrate that skip connections with a learnable architectural coefficient can easily recover from a disadvantageous state and become dominant. We conjecture that skip connections profit too much from this privilege, hence causing the collapse of the derived model. Therefore, we propose to factor out this benefit with an auxiliary skip connection, ensuring a fairer competition among all operations. Extensive experiments on various datasets verify that our approach can substantially improve the robustness of DARTS.
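To make the auxiliary-skip idea concrete, below is a minimal PyTorch-style sketch of a DARTS mixed operation augmented with an extra, non-learnable skip path scaled by a factor beta that is decayed over the search. The class name MixedOpWithAuxSkip, the beta_at schedule, and the initialization constants are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOpWithAuxSkip(nn.Module):
    """Sketch of a mixed operation with an auxiliary skip connection.

    The weighted sum over candidate operations follows standard DARTS;
    the added beta * x term is the auxiliary skip path, which carries
    the gradient-stabilizing benefit so the learnable skip candidate
    competes on equal footing with the other operations.
    """

    def __init__(self, candidate_ops):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)
        # One architecture parameter per candidate operation (assumed init).
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(candidate_ops)))

    def forward(self, x, beta):
        weights = F.softmax(self.alpha, dim=-1)
        mixed = sum(w * op(x) for w, op in zip(weights, self.ops))
        # beta is a plain scalar, not a learned parameter; it is decayed
        # toward 0 during the search so the extra path vanishes and the
        # derived architecture contains only the searched operations.
        return mixed + beta * x

def beta_at(epoch, total_epochs):
    # Hypothetical linear decay schedule from 1 to 0 over the search.
    return 1.0 - epoch / total_epochs

Because beta reaches 0 by the end of the search, the auxiliary path affects only the optimization dynamics during search and leaves the final discretized cell unchanged.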
