AtomNAS: Fine-Grained End-to-End Neural Architecture Search

12/20/2019
by   Jieru Mei, et al.

The design of the search space is a critical problem for neural architecture search (NAS) algorithms. We propose a fine-grained search space composed of atomic blocks, a minimal search unit that is much smaller than the building blocks used in recent NAS algorithms. This search space allows the channel numbers and kernel sizes of convolutions to be selected directly. In addition, we propose a resource-aware architecture search algorithm that dynamically selects atomic blocks during training; the search is further accelerated by a dynamic network shrinkage technique. Instead of the conventional search-and-retrain two-stage paradigm, our method searches for and trains the target architecture simultaneously, in an end-to-end manner. It achieves state-of-the-art performance under several FLOPs configurations on ImageNet with negligible search cost. The entire codebase is available at: https://github.com/meijieru/AtomNAS
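The abstract's core mechanism, selecting atomic blocks dynamically during training and shrinking the network as blocks die off, can be illustrated with a minimal sketch. The snippet below is an illustrative toy, not the authors' implementation: it assumes each atomic block carries a learnable scaling coefficient (called `alpha` here, a hypothetical name), that an L1 penalty is applied to those coefficients via a proximal soft-thresholding step, and that blocks whose coefficient collapses to zero are pruned.

```python
# Illustrative sketch of atomic-block selection via L1-regularized
# scaling coefficients. All names (soft_threshold, shrink_network,
# alphas, keep_eps) are hypothetical, not from the AtomNAS codebase.

def soft_threshold(alpha, lam, lr):
    """Proximal step for an L1 penalty: shrink |alpha| toward zero
    by lam * lr, snapping small values exactly to zero."""
    step = lam * lr
    if alpha > step:
        return alpha - step
    if alpha < -step:
        return alpha + step
    return 0.0

def shrink_network(alphas, lam=0.1, lr=0.5, keep_eps=1e-3):
    """One shrinkage pass: apply the proximal step to every block's
    coefficient, then keep only the blocks that remain nonzero."""
    updated = [soft_threshold(a, lam, lr) for a in alphas]
    kept = [i for i, a in enumerate(updated) if abs(a) > keep_eps]
    return updated, kept

# Four atomic blocks; two have near-zero coefficients and get pruned.
alphas = [0.9, 0.04, -0.6, 0.01]
updated, kept = shrink_network(alphas)
```

In this toy run the surviving index set is `[0, 2]`: the two near-zero blocks are removed, which is the spirit of dynamic network shrinkage — the supernet physically loses capacity during training instead of being pruned in a separate retraining stage.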

Related research:

- 06/17/2020, Fine-Grained Stochastic Architecture Search ("State-of-the-art deep networks are often too large to deploy on mobile d...")
- 04/16/2023, Canvas: End-to-End Kernel Architecture Search in Neural Networks ("The demands for higher performance and accuracy in neural networks (NNs)...")
- 11/18/2019, Fine-Grained Neural Architecture Search ("We present an elegant framework of fine-grained neural architecture sear...")
- 03/27/2019, Network Slimming by Slimmable Networks: Towards One-Shot Architecture Search for Channel Numbers ("We study how to set channel numbers in a neural network to achieve bette...")
- 04/04/2020, Neural Architecture Search for Lightweight Non-Local Networks ("Non-Local (NL) blocks have been widely studied in various vision tasks. ...")
- 12/02/2019, GroSS: Group-Size Series Decomposition for Whole Search-Space Training ("We present Group-size Series (GroSS) decomposition, a mathematical formu...")
- 02/03/2021, Learning Diverse-Structured Networks for Adversarial Robustness ("In adversarial training (AT), the main focus has been the objective and ...")
