TND-NAS: Towards Non-differentiable Objectives in Progressive Differentiable NAS Framework

11/06/2021
by   Bo Lyu, et al.

Differentiable architecture search has gradually become the mainstream research topic in the field of Neural Architecture Search (NAS) for its capability to improve efficiency compared with early NAS methods (EA-based, RL-based). Recent differentiable NAS work also aims to further improve search efficiency, reduce GPU-memory consumption, and address the "depth gap" issue. However, these methods are no longer capable of tackling non-differentiable objectives, let alone multiple objectives, e.g., performance, robustness, efficiency, and other metrics. We propose TND-NAS, an end-to-end architecture search framework towards non-differentiable objectives, which combines the high efficiency of the differentiable NAS framework with the compatibility with non-differentiable metrics of Multi-objective NAS (MNAS). Under the differentiable NAS framework, with the continuous relaxation of the search space, TND-NAS optimizes the architecture parameters (α) in discrete space, resorting to a search policy of progressively shrinking the supernetwork by α. Our representative experiment takes two objectives (parameters, accuracy) as an example, and we achieve a series of high-performance compact architectures on CIFAR-10 (1.09M/3.3%, 9.57M/2.54%). Favorably, under real-world scenarios (resource-constrained, platform-specialized), Pareto-optimal solutions can be conveniently reached by TND-NAS.
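The abstract describes two mechanics: sampling discrete architectures from the relaxed (softmax) distribution over α, and progressively shrinking the supernetwork by dropping low-α candidate operations. A minimal sketch of that loop, with all names and the toy search space assumed (not taken from the paper):

```python
import math
import random

# Hypothetical illustration of the abstract's two mechanics (names assumed):
# architecture parameters alpha are kept per edge, discrete architectures are
# sampled from softmax(alpha) -- so non-differentiable objectives can score
# them -- and the supernetwork is progressively shrunk by pruning the
# lowest-alpha operation on each edge.

CANDIDATE_OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect"]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sample_architecture(alphas):
    """Sample one discrete operation index per edge from softmax(alpha)."""
    arch = []
    for edge_alpha in alphas:
        probs = softmax(edge_alpha)
        arch.append(random.choices(range(len(edge_alpha)), weights=probs)[0])
    return arch

def shrink(alphas, ops_per_edge):
    """Progressive shrinking: drop the lowest-alpha candidate on each edge."""
    for edge_alpha, ops in zip(alphas, ops_per_edge):
        if len(ops) > 1:
            worst = min(range(len(edge_alpha)), key=lambda i: edge_alpha[i])
            del edge_alpha[worst]
            del ops[worst]
    return alphas, ops_per_edge

# Toy search state: 3 edges, 4 candidate operations each, uniform alpha.
alphas = [[0.0] * len(CANDIDATE_OPS) for _ in range(3)]
ops = [list(CANDIDATE_OPS) for _ in range(3)]

arch = sample_architecture(alphas)       # one discrete op index per edge
alphas, ops = shrink(alphas, ops)        # each edge now has 3 candidates left
```

In the real method, each sampled architecture would be scored on the non-differentiable objectives (e.g., parameter count and accuracy) and α updated from those scores; the sketch above only shows the sampling and shrinking steps.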

