Differentiable Architecture Search Without Training Nor Labels: A Pruning Perspective

06/22/2021
by Miao Zhang, et al.

By leveraging weight sharing and continuous relaxation so that gradient descent can alternately optimize the supernet weights and the architecture parameters through a bi-level optimization paradigm, Differentiable ARchiTecture Search (DARTS) has become the mainstream method in Neural Architecture Search (NAS) thanks to its simplicity and efficiency. However, more recent works have found that the performance of the searched architecture barely improves as the optimization proceeds in DARTS. In addition, several concurrent works show that NAS can find more competitive architectures without labels. These observations reveal that the supervision signal in DARTS may be a poor indicator for architecture optimization, prompting a foundational question: instead of using the supervision signal to perform bi-level optimization, can we find high-quality architectures without any training or labels? We provide an affirmative answer by casting NAS as a network-pruning-at-initialization problem. Leveraging recent techniques on network pruning at initialization, we design a FreeFlow proxy to score the importance of candidate operations in NAS without any training or labels, and accordingly propose a novel framework called training- and label-free neural architecture search (FreeNAS). We show that, without any training or labels, FreeNAS with the proposed FreeFlow proxy outperforms most NAS baselines. More importantly, our framework is extremely efficient, completing the architecture search within only 3.6 seconds and 79 seconds on a single GPU for the NAS-Bench-201 and DARTS search spaces, respectively. We hope our work inspires more attempts to solve NAS from the perspective of pruning at initialization.

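To make the pruning-at-initialization framing concrete, the sketch below scores a few candidate operations with a data- and label-free saliency in the spirit of SynFlow-style pruning scores (|theta * d(sum of outputs)/d(theta)| on an all-ones input). It is only a minimal illustration: the operation set, the helper name synflow_like_score, and the choice of saliency are assumptions for demonstration, and the paper's FreeFlow proxy may be formulated differently.

```python
import torch
import torch.nn as nn

# Hypothetical candidate operations on one edge of a NAS cell (illustrative only).
CANDIDATE_OPS = {
    "conv_3x3": lambda c: nn.Conv2d(c, c, kernel_size=3, padding=1, bias=False),
    "conv_1x1": lambda c: nn.Conv2d(c, c, kernel_size=1, bias=False),
    "skip_connect": lambda c: nn.Identity(),
}

def synflow_like_score(op: nn.Module, channels: int = 16, size: int = 32) -> float:
    """Score one candidate operation at initialization, using no data and no labels."""
    # Take absolute weights so positive and negative contributions cannot cancel.
    for p in op.parameters():
        p.data.abs_()
    x = torch.ones(1, channels, size, size, requires_grad=True)  # all-ones surrogate input
    op(x).sum().backward()  # label-free objective: sum of all outputs
    # Saliency: sum over parameters of |theta * d(objective)/d(theta)|.
    return sum((p.grad * p.data).abs().sum().item()
               for p in op.parameters() if p.grad is not None)

if __name__ == "__main__":
    channels = 16
    scores = {name: synflow_like_score(build(channels), channels)
              for name, build in CANDIDATE_OPS.items()}
    # Keeping the highest-scoring operation on an edge mirrors pruning all other
    # candidates at initialization. Parameter-free ops (e.g. skip) score zero
    # under this particular sketch and would need separate handling.
    print(scores, "-> keep:", max(scores, key=scores.get))
```

Because the score is computed on an all-ones input at randomly initialized weights, it needs no training steps and no labels, which is why such proxies can rank candidate operations in seconds rather than GPU-hours.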
Related research

11/25/2021  BaLeNAS: Differentiable Architecture Search via the Bayesian Learning Rule
04/21/2021  Making Differentiable Architecture Search less local
10/10/2021  ZARTS: On Zero-order Optimization for Neural Architecture Search
03/26/2020  Are Labels Necessary for Neural Architecture Search?
06/02/2022  Pruning-as-Search: Efficient Neural Architecture Search via Channel Pruning and Structural Reparameterization
01/16/2023  β-DARTS++: Bi-level Regularization for Proxy-robust Differentiable Architecture Search
10/23/2021  Towards a Robust Differentiable Architecture Search under Label Noise
