Stabilizing DARTS with Amended Gradient Estimation on Architectural Parameters

10/25/2019
by   Kaifeng Bi, et al.

Differentiable neural architecture search has become a popular methodology for exploring architectures in deep learning. Despite its great advantage in search efficiency, it often suffers from weak stability, which prevents it from being applied to large search spaces or flexibly adjusted to different scenarios. This paper investigates DARTS, currently the most popular differentiable search algorithm, and identifies an important source of instability: its approximation of the gradients of the architectural parameters. Under this approximation, the optimization can converge to a point that yields dramatically inaccurate results in the re-training process. Based on this analysis, we propose an amending term for computing architectural gradients that exploits a direct property of the optimality of network parameter optimization. Our approach mathematically guarantees that the gradient estimation follows a roughly correct direction, which leads the search stage to converge on reasonable architectures. In practice, our algorithm is easily implemented and efficiently added to DARTS-based approaches. Experiments on CIFAR and ImageNet demonstrate that our approach enjoys accuracy gains and, more importantly, enables DARTS-based approaches to explore much larger search spaces than have been studied before.
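To see why the gradient approximation matters, consider a toy bilevel problem in the same shape as DARTS: an inner problem optimizing weights w given architecture parameter a, and an outer problem optimizing a at the inner optimum. The sketch below is a hypothetical illustration (not the paper's actual formulation): it compares the naive outer gradient, which treats w as a constant, with the gradient obtained by differentiating through the inner optimality condition, and checks the latter against a finite difference.

```python
# Toy bilevel problem illustrating how a DARTS-style approximation of
# the architectural gradient can point in the wrong direction.
# Hypothetical quadratic losses chosen so the inner argmin is analytic.
#
# Inner problem:  w*(a) = argmin_w L_train(w, a)
# Outer problem:  minimize over a:  L_val(w*(a), a)

def train_loss(w, a):
    return 0.5 * (w - a) ** 2           # minimized at w = a

def val_loss(w, a):
    return 0.5 * w ** 2 + 0.5 * (a - 1.0) ** 2

def inner_opt(a):
    return a                            # analytic argmin of train_loss

a = 2.0
w = inner_opt(a)

# Naive gradient: treat w as a constant (ignores how w* depends on a).
naive = a - 1.0                         # ∂L_val/∂a at fixed w

# Amended gradient: use the optimality condition of the inner problem,
# ∂L_train/∂w (w*(a), a) = 0.  Implicit differentiation gives
# dw*/da = -(∂²L_train/∂w∂a) / (∂²L_train/∂w²) = 1 for this toy case.
total = (a - 1.0) + w * 1.0             # chain rule through w*(a)

# Finite-difference check of the true outer gradient.
eps = 1e-5
fd = (val_loss(inner_opt(a + eps), a + eps)
      - val_loss(inner_opt(a - eps), a - eps)) / (2 * eps)

print(naive, total, fd)
```

Here the naive estimate (1.0) differs from the true outer gradient (3.0), so following it can drive the search toward a point that does not minimize the validation loss of the retrained model, which is the instability the abstract describes.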


Related research:

- UNAS: Differentiable Architecture Search Meets Reinforcement Learning (12/16/2019)
- LISSNAS: Locality-based Iterative Search Space Shrinkage for Neural Architecture Search (07/06/2023)
- ZARTS: On Zero-order Optimization for Neural Architecture Search (10/10/2021)
- Partially-Connected Differentiable Architecture Search for Deepfake and Spoofing Detection (04/07/2021)
- Distilling Optimal Neural Networks: Rapid Search in Diverse Spaces (12/16/2020)
- Assessing Architectural Similarity in Populations of Deep Neural Networks (04/19/2019)
- DARTS-: Robustly Stepping out of Performance Collapse Without Indicators (09/02/2020)
