ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding

10/13/2020
by Yibo Yang, et al.

Neural architecture search (NAS) aims to produce the optimal sparse architecture from a high-dimensional space spanned by all candidate connections. Current gradient-based NAS methods commonly ignore the sparsity constraint during the search phase and project the optimized solution onto a sparse one only as a post-processing step. As a result, the dense super-net used for search is inefficient to train, and there is a gap between it and the projected architecture used for evaluation. In this paper, we formulate neural architecture search as a sparse coding problem. We perform the differentiable search on a compressed, lower-dimensional space that has the same validation loss as the original sparse solution space, and recover an architecture by solving the sparse coding problem. The differentiable search and the architecture recovery are optimized alternately. By doing so, our network satisfies the sparsity constraint at every update and is efficient to train. To also eliminate the depth and width gap between the network used in search and the target-net used in evaluation, we further propose a method to search and evaluate in one stage under the target-net settings. When training finishes, the architecture variables are absorbed into the network weights, so we obtain the searched architecture and the optimized parameters in a single run. In experiments, our two-stage method requires only 0.05 GPU-days for search on CIFAR-10, and our one-stage method achieves state-of-the-art performance on both CIFAR-10 and ImageNet at the cost of only the evaluation time.
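The architecture-recovery step described above amounts to a standard l1-regularized sparse coding (lasso) problem, which the method's name suggests is solved with ISTA, the Iterative Shrinkage-Thresholding Algorithm. Below is a minimal, self-contained NumPy sketch of generic ISTA on a toy problem; the dictionary D, observation x, and all parameter values are illustrative assumptions, not the authors' actual search setup.

```python
import numpy as np

def ista(D, x, lam=0.1, n_iter=500):
    """Generic ISTA for the lasso problem
        min_z 0.5 * ||x - D z||_2^2 + lam * ||z||_1.
    D : (m, n) dictionary; x : (m,) observation.
    Returns a sparse code z of length n.
    """
    # Step size 1/L, where L is the Lipschitz constant of the gradient
    # of the smooth term, i.e. the largest eigenvalue of D^T D.
    L = np.linalg.norm(D, ord=2) ** 2
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        # Gradient step on the least-squares term.
        g = z - (D.T @ (D @ z - x)) / L
        # Soft-thresholding (proximal step for the l1 term) enforces sparsity.
        z = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)
    return z

# Toy usage: recover a 3-sparse code from a random 20x50 dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
z_true = np.zeros(50)
z_true[[3, 17, 41]] = [1.5, -2.0, 0.8]
x = D @ z_true
z_hat = ista(D, x, lam=0.05)
print(np.nonzero(np.abs(z_hat) > 1e-2)[0])  # indices of the recovered support
```

In the paper's alternating scheme, a step like this would recover a sparse architecture from the compressed search variables before the next round of differentiable optimization; the toy dictionary here simply stands in for that compression.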


Related research:

02/11/2023 - Improving Differentiable Architecture Search via Self-Distillation
01/27/2021 - Towards Improving the Consistency, Efficiency, and Flexibility of Differentiable Neural Architecture Search
10/25/2022 - NAS-PRNet: Neural Architecture Search generated Phase Retrieval Net for Off-axis Quantitative Phase Imaging
05/13/2019 - BayesNAS: A Bayesian Approach for Neural Architecture Search
07/31/2020 - Neural Architecture Search as Sparse Supernet
04/29/2021 - Generalization Guarantees for Neural Architecture Search with Train-Validation Split
08/25/2021 - iDARTS: Improving DARTS by Node Normalization and Decorrelation Discretization
