Progressive DARTS: Bridging the Optimization Gap for NAS in the Wild

12/23/2019
by Xin Chen, et al.

With the rapid development of neural architecture search (NAS), researchers have found powerful network architectures for a wide range of vision tasks. However, it remains unclear whether a searched architecture can transfer across different types of tasks as manually designed ones do. This paper puts forward this problem, referred to as NAS in the wild, which explores the possibility of finding the optimal architecture on a proxy dataset and then deploying it to mostly unseen scenarios. We instantiate this setting with a currently popular algorithm, differentiable architecture search (DARTS), which often suffers from unsatisfactory performance when transferred across different tasks. We argue that the accuracy drop originates from the formulation, which uses a super-network for search but a sub-network for re-training. The differing properties of these two stages result in a significant optimization gap, and consequently the architectural parameters "over-fit" the super-network. To alleviate this gap, we present a progressive method that gradually increases the network depth during the search stage, leading to the Progressive DARTS (P-DARTS) algorithm. With a reduced search cost (7 hours on a single GPU), P-DARTS achieves improved performance on both the proxy dataset (CIFAR10) and several target problems (ImageNet classification, COCO detection, and three ReID benchmarks). Our code is available at <https://github.com/chenxin061/pdarts>.
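To make the progressive idea concrete, below is a minimal, self-contained sketch of the kind of schedule the abstract describes: the search super-network is made deeper stage by stage, while the set of candidate operations on each edge is pruned so that the search remains tractable. The stage depths, the numbers of retained operations, and the `fake_search` stand-in are illustrative assumptions for this sketch only, not the released P-DARTS configuration.

```python
# Sketch of a progressive search schedule: deeper super-network per stage,
# fewer candidate operations per edge. Stage settings and helpers below are
# illustrative assumptions, not the official P-DARTS code.

import random
from typing import Dict, List

CANDIDATE_OPS: List[str] = [
    "none", "max_pool_3x3", "avg_pool_3x3", "skip_connect",
    "sep_conv_3x3", "sep_conv_5x5", "dil_conv_3x3", "dil_conv_5x5",
]

STAGE_DEPTHS = [5, 11, 17]   # cells in the search super-network per stage (assumed)
STAGE_KEEP = [8, 5, 3]       # candidate ops retained per edge after each stage (assumed)


def fake_search(depth: int, ops: List[str]) -> Dict[str, float]:
    """Stand-in for the bi-level DARTS optimization: returns one architecture
    weight per candidate op (random here; real code would train the super-net)."""
    random.seed(depth)
    return {op: random.random() for op in ops}


def progressive_search() -> List[str]:
    ops = CANDIDATE_OPS
    for depth, keep in zip(STAGE_DEPTHS, STAGE_KEEP):
        weights = fake_search(depth, ops)
        # Prune: keep only the top-`keep` ops so the next, deeper stage
        # fits in memory despite having more cells.
        ops = sorted(ops, key=weights.get, reverse=True)[:keep]
        print(f"depth={depth:2d} -> kept ops: {ops}")
    return ops


if __name__ == "__main__":
    progressive_search()
```

The design point this sketch highlights is the trade-off between depth and width of the search space: each deeper stage would otherwise cost more memory, so the candidate operation set is narrowed using the architecture weights learned in the previous, shallower stage.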

Related research

04/29/2019  Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation
Recently, differentiable search methods have made major progress in redu...

06/09/2021  Accelerating Neural Architecture Search via Proxy Data
Despite the increasing interest in neural architecture search (NAS), the...

12/02/2018  ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware
Neural architecture search (NAS) has a great impact by automatically des...

03/17/2022  Progressive Subsampling for Oversampled Data – Application to Quantitative MRI
We present PROSUB: PROgressive SUBsampling, a deep learning based, autom...

08/26/2019  Once for All: Train One Network and Specialize it for Efficient Deployment
Efficient deployment of deep learning models requires specialized neural...

07/01/2019  Single-Path Mobile AutoML: Efficient ConvNet Design and NAS Hyperparameter Optimization
Can we reduce the search cost of Neural Architecture Search (NAS) from d...

03/07/2021  Auto-tuning of Deep Neural Networks by Conflicting Layer Removal
Designing neural network architectures is a challenging task and knowing...
