Dynamic Distribution Pruning for Efficient Network Architecture Search

05/28/2019
by Xiawu Zheng et al.

Network architectures obtained by Neural Architecture Search (NAS) have shown state-of-the-art performance in various computer vision tasks. Despite this exciting progress, the computational cost of the forward-backward propagation and of the search process itself makes it difficult to apply NAS in practice. In particular, most previous methods require thousands of GPU days for the search to converge. In this paper, we propose a dynamic distribution pruning method for extremely efficient NAS, which samples architectures from a joint categorical distribution. The search space is dynamically pruned every few epochs to update this distribution, and the optimal neural architecture is obtained once only one structure remains. We conduct experiments on two datasets widely used in NAS. On CIFAR-10, the optimal structure obtained by our method achieves a state-of-the-art test error of 1.9%, while the search process is more than 1,000 times faster (only 1.5 GPU hours on a Tesla V100) than state-of-the-art NAS algorithms. On ImageNet, our model achieves 75.2% top-1 accuracy under the MobileNet setting, with a time cost of only 2 GPU days, twice as fast as the fastest existing NAS algorithm. The code is available at https://github.com/tanglang96/DDPNAS.
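The pruning loop described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it keeps a set of surviving candidate operations for each architectural decision, samples architectures from the remaining candidates, scores them, and prunes the worst survivor of every unresolved decision every few epochs until a single structure remains. It is a simplified variant in which sampling stays uniform over survivors and pruning is driven by empirical rewards; all names (N_EDGES, N_OPS, evaluate, etc.) and the toy reward are illustrative assumptions.

```python
# Minimal sketch of dynamic distribution pruning for NAS (illustrative only).
import numpy as np

N_EDGES = 4          # independent architectural decisions (e.g. edges in a cell)
N_OPS = 5            # candidate operations per decision
PRUNE_EVERY = 3      # epochs between pruning steps
SAMPLES_PER_EPOCH = 8

rng = np.random.default_rng(0)
alive = [list(range(N_OPS)) for _ in range(N_EDGES)]   # surviving ops per edge
scores = np.zeros((N_EDGES, N_OPS))                    # accumulated reward per op
counts = np.ones((N_EDGES, N_OPS))                     # sample counts (avoid /0)

def evaluate(arch):
    """Stand-in for training/validating a sampled architecture.
    This toy reward secretly prefers op 0 on every edge."""
    return sum(1.0 if op == 0 else 0.5 * rng.random() for op in arch) / len(arch)

epoch = 0
while any(len(ops) > 1 for ops in alive):
    epoch += 1
    for _ in range(SAMPLES_PER_EPOCH):
        # Sample one op per edge from the current (pruned) candidate set.
        arch = [ops[rng.integers(len(ops))] for ops in alive]
        reward = evaluate(arch)
        for e, op in enumerate(arch):
            scores[e, op] += reward
            counts[e, op] += 1
    if epoch % PRUNE_EVERY == 0:
        # Prune the lowest-scoring survivor of every unresolved decision.
        for e, ops in enumerate(alive):
            if len(ops) > 1:
                worst = min(ops, key=lambda op: scores[e, op] / counts[e, op])
                ops.remove(worst)

print("final architecture:", [ops[0] for ops in alive])
```

Because one candidate is removed per unresolved decision at each pruning step, the loop terminates after at most (N_OPS - 1) * PRUNE_EVERY epochs, which is the source of the method's efficiency: the sampling budget shrinks as the search space collapses.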


