Multinomial Distribution Learning for Effective Neural Architecture Search

05/18/2019
by Xiawu Zheng, et al.

Architectures obtained by Neural Architecture Search (NAS) have achieved highly competitive performance in various computer vision tasks. However, the prohibitive computational demand of forward-backward propagation through deep neural networks and search algorithms makes it difficult to apply NAS in practice. In this paper, we propose Multinomial Distribution Learning for extremely efficient NAS, which treats the search space as a joint multinomial distribution: the operation between two nodes is sampled from this distribution, and the optimal network structure is obtained by taking the operation with the highest probability on each edge. NAS is thereby transformed into a multinomial distribution learning problem, i.e., the distribution is optimized to yield a high expected performance. In addition, we propose and validate the hypothesis that the performance ranking of architectures is consistent across training epochs, which further accelerates the learning process. Experiments on CIFAR-10 and ImageNet demonstrate the effectiveness of our method. On CIFAR-10, the structure searched by our method achieves 2.4% test error while being 6.0× faster (only 4 GPU hours on a GTX 1080Ti) than state-of-the-art NAS algorithms. On ImageNet, our model achieves 75.2% top-1 accuracy under the MobileNet setting (MobileNet V1/V2) while being 1.2× faster in measured GPU latency. Test code is available at https://github.com/tanglang96/MDENAS
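
The following is a minimal sketch of the sample-and-update loop the abstract describes: a probability vector is kept for every edge of the cell, an architecture is sampled from the joint multinomial distribution, evaluated, and the distribution is shifted toward operations that perform well. The edge/operation counts, the evaluate_architecture stub, and the ranking-style update rule are illustrative assumptions, not the exact procedure from the paper or its repository.

import numpy as np

# Hypothetical search-space dimensions: one categorical choice per edge (assumed values).
NUM_EDGES = 14   # edges in a searched cell
NUM_OPS = 8      # candidate operations per edge
EPOCHS = 100

# Joint multinomial distribution: one probability vector per edge, initialized uniformly.
probs = np.full((NUM_EDGES, NUM_OPS), 1.0 / NUM_OPS)

def evaluate_architecture(arch):
    """Placeholder: train the sampled architecture briefly and return a
    validation score. Stubbed out with a random value in this sketch."""
    return np.random.rand()

baseline = 0.0  # running baseline used by the illustrative update rule

for epoch in range(EPOCHS):
    # Sample one operation per edge from the current distribution.
    arch = np.array([np.random.choice(NUM_OPS, p=probs[e]) for e in range(NUM_EDGES)])
    score = evaluate_architecture(arch)

    # Illustrative update: move probability mass toward the sampled operations
    # when the score beats the running baseline, away from them otherwise.
    lr = 0.01
    direction = 1.0 if score > baseline else -1.0
    for e, op in enumerate(arch):
        probs[e, op] += lr * direction
        probs[e] = np.clip(probs[e], 1e-3, None)
        probs[e] /= probs[e].sum()
    baseline = 0.9 * baseline + 0.1 * score

# Final architecture: the most probable operation on every edge.
best_arch = probs.argmax(axis=1)
print("Selected operation index per edge:", best_arch)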

