ROME: Robustifying Memory-Efficient NAS via Topology Disentanglement and Gradient Accumulation

11/23/2020
by   Xiaoxing Wang, et al.

Single-path differentiable neural architecture search is attractive for its low computational cost and memory-friendly nature. However, we find, surprisingly, that it suffers from severe search instability, a problem that has been largely ignored and poses a potential weakness for wider application. In this paper, we delve into this performance-collapse issue and propose a new algorithm called RObustifying Memory-Efficient NAS (ROME). Specifically, 1) to keep the topology consistent between the search and evaluation stages, we introduce separate parameters that disentangle the topology from the operations of the architecture, so that connections and operations can be sampled independently without interference; 2) to reduce sampling unfairness and variance, we enforce fair sampling for weight updates and apply a gradient-accumulation mechanism to the architecture parameters. Extensive experiments demonstrate that the proposed method is strong and robust, achieving state-of-the-art results on most standard benchmarks.
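To make the two ideas in the abstract concrete, here is a minimal, hypothetical sketch of (1) disentangled architecture parameters, where topology (which edge is active) and operation choice are sampled independently from separate parameter sets, and (2) gradient accumulation over several sampled architectures before a single architecture-parameter update. All names, sizes, and the stand-in loss gradient are illustrative assumptions, not the paper's implementation; the fair-sampling rule for network weights is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cell sizes: 4 candidate input edges, 3 candidate operations.
NUM_EDGES, NUM_OPS = 4, 3

# Disentangled architecture parameters: one set governs topology (edges),
# a separate set governs the operation placed on each edge.
alpha_topo = np.zeros(NUM_EDGES)
alpha_op = np.zeros((NUM_EDGES, NUM_OPS))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sample_architecture():
    """Sample an edge and an operation independently (no interference)."""
    edge = rng.choice(NUM_EDGES, p=softmax(alpha_topo))
    op = rng.choice(NUM_OPS, p=softmax(alpha_op[edge]))
    return edge, op

def fake_loss_grad(edge, op):
    """Stand-in for the real validation-loss gradient w.r.t. the alphas."""
    g_topo = np.zeros(NUM_EDGES)
    g_topo[edge] = 1.0
    g_op = np.zeros((NUM_EDGES, NUM_OPS))
    g_op[edge, op] = 1.0
    return g_topo, g_op

# Accumulate gradients over K sampled architectures, then apply one averaged
# update, which lowers the variance of single-sample updates.
K, lr = 8, 0.1
acc_topo = np.zeros_like(alpha_topo)
acc_op = np.zeros_like(alpha_op)
for _ in range(K):
    edge, op = sample_architecture()
    g_topo, g_op = fake_loss_grad(edge, op)
    acc_topo += g_topo
    acc_op += g_op
alpha_topo -= lr * acc_topo / K
alpha_op -= lr * acc_op / K
```

Because the topology and operation parameters live in separate tensors, the same sampled topology can be reused verbatim at evaluation time, which is the consistency the abstract refers to.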


