FP-NAS: Fast Probabilistic Neural Architecture Search

11/22/2020
by Zhicheng Yan, et al.

Differential Neural Architecture Search (NAS) requires all layer choices to be held in memory simultaneously; this limits the size of both the search space and the final architecture. In contrast, Probabilistic NAS, such as PARSEC, learns a distribution over high-performing architectures and uses only as much memory as is needed to train a single model. Nevertheless, it needs to sample many architectures, making it computationally expensive for searching in an extensive space. To solve these problems, we propose a sampling method adaptive to the distribution entropy, drawing more samples to encourage exploration at the beginning and reducing the number of samples as learning proceeds. Furthermore, to search fast in the multi-variate space, we propose a coarse-to-fine strategy that uses a factorized distribution at the beginning, which reduces the number of architecture parameters by over an order of magnitude. We call this method Fast Probabilistic NAS (FP-NAS). Compared with PARSEC, it can sample 64% fewer architectures and search 2.1x faster. Compared with FBNetV2, FP-NAS is 1.9x - 3.6x faster, and the searched models outperform FBNetV2 models on ImageNet. FP-NAS allows us to expand the giant FBNetV2 space to be wider (i.e. larger channel choices) and deeper (i.e. more blocks), while adding Split-Attention blocks and enabling the search over the number of splits. When searching a model of size 0.4G FLOPS, FP-NAS is 132x faster than EfficientNet, and the searched FP-NAS-L0 model outperforms EfficientNet-B0 by 0.6%. Without using any architecture surrogate or scaling tricks, we directly search large models up to 1.0G FLOPS. Our FP-NAS-L2 model with simple distillation outperforms BigNAS-XL with advanced inplace distillation by 0.7%.
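To make the entropy-adaptive sampling idea concrete, here is a minimal sketch in PyTorch. It assumes the per-step sample budget is simply scaled by the mean per-layer entropy of a factorized (per-layer) architecture distribution; the function names, the scaling rule, and the constants below are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of entropy-adaptive architecture sampling over a factorized
# (per-layer) distribution. Assumptions: PyTorch; the sample budget is scaled by
# the mean per-layer entropy (a simple illustrative rule); `lam`, `min_samples`,
# and `max_samples` are hypothetical constants.
import torch
import torch.nn.functional as F


def entropy(logits):
    """Shannon entropy of a categorical distribution given unnormalized logits."""
    p = F.softmax(logits, dim=-1)
    return -(p * torch.log(p + 1e-12)).sum(dim=-1)


def num_samples(arch_logits, lam=8.0, min_samples=4, max_samples=64):
    """More samples while the distribution is flat (early search), fewer once it sharpens."""
    mean_h = torch.stack([entropy(l) for l in arch_logits]).mean().item()
    return max(min_samples, min(max_samples, int(lam * mean_h)))


def sample_architectures(arch_logits):
    """Draw architectures by sampling one op index per layer from the factorized distribution."""
    n = num_samples(arch_logits)
    probs = [F.softmax(l, dim=-1) for l in arch_logits]
    return [[torch.multinomial(p, 1).item() for p in probs] for _ in range(n)]


# Toy usage: 12 searchable layers with 9 op choices each; uniform logits give high
# entropy, so many architectures are drawn at the start of the search.
arch_logits = [torch.zeros(9) for _ in range(12)]
print(len(sample_architectures(arch_logits)))
```

In an actual search loop, the architecture logits would be updated from the performance of these sampled architectures, so the entropy falls and the sample count shrinks over time; the coarse-to-fine step of refining the factorized distribution is not covered by this sketch.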
