FBNetV2: Differentiable Neural Architecture Search for Spatial and Channel Dimensions

04/12/2020
by Alvin Wan, et al.

Differentiable Neural Architecture Search (DNAS) has demonstrated great success in designing state-of-the-art, efficient neural networks. However, DARTS-based DNAS's search space is small when compared to other search methods', since all candidate network layers must be explicitly instantiated in memory. To address this bottleneck, we propose a memory and computationally efficient DNAS variant: DMaskingNAS. This algorithm expands the search space by up to 10^14× over conventional DNAS, supporting searches over spatial and channel dimensions that are otherwise prohibitively expensive: input resolution and number of filters. We propose a masking mechanism for feature map reuse, so that memory and computational costs stay nearly constant as the search space expands. Furthermore, we employ effective shape propagation to maximize per-FLOP or per-parameter accuracy. The searched FBNetV2s yield state-of-the-art performance when compared with all previous architectures. With up to 421× less search cost, DMaskingNAS finds models with 0.9% higher accuracy and 15% fewer FLOPs than MobileNetV3-Small, and with similar accuracy but 20% fewer FLOPs than EfficientNet-B0. Furthermore, our FBNetV2 outperforms MobileNetV3 by 2.6% in accuracy, with equivalent model size. FBNetV2 models are open-sourced at https://github.com/facebookresearch/mobile-vision.
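The channel-search side of this masking mechanism can be sketched in a few lines of PyTorch: a block is instantiated once at its maximum width, and candidate filter counts are approximated by a Gumbel-Softmax-weighted sum of binary channel masks applied to the single shared feature map, so memory does not grow with the number of candidates. This is a minimal illustrative sketch under those assumptions; the class `ChannelMasking`, its parameters, and the candidate widths are hypothetical names, not the open-sourced FBNetV2 code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelMasking(nn.Module):
    """Illustrative sketch of channel masking for DNAS (not the official
    FBNetV2 implementation): one block runs at max width, and candidate
    channel counts are simulated by a weighted sum of binary masks."""

    def __init__(self, max_channels, candidate_channels):
        super().__init__()
        self.candidates = candidate_channels            # e.g. [8, 16, 24, 32]
        # One architecture logit per candidate channel count.
        self.alpha = nn.Parameter(torch.zeros(len(candidate_channels)))
        # masks[i] keeps the first candidate_channels[i] channels, zeros the rest.
        masks = torch.zeros(len(candidate_channels), max_channels)
        for i, c in enumerate(candidate_channels):
            masks[i, :c] = 1.0
        self.register_buffer("masks", masks)

    def forward(self, x, temperature=1.0):
        # x: shared feature map of shape (N, max_channels, H, W), computed once.
        weights = F.gumbel_softmax(self.alpha, tau=temperature)   # (num_candidates,)
        # The weighted sum of binary masks collapses into a single soft mask,
        # so cost stays nearly constant as more channel candidates are searched.
        mask = (weights[:, None] * self.masks).sum(dim=0)          # (max_channels,)
        return x * mask.view(1, -1, 1, 1)


if __name__ == "__main__":
    # Usage: search over the number of output filters of one conv block.
    conv = nn.Conv2d(3, 32, kernel_size=3, padding=1)              # built at max width
    search = ChannelMasking(max_channels=32, candidate_channels=[8, 16, 24, 32])
    x = torch.randn(2, 3, 224, 224)
    y = search(conv(x))                                            # (2, 32, 224, 224)
    print(y.shape)
```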

Related research

06/13/2023 · Flexible Channel Dimensions for Differentiable Architecture Search
Finding optimal channel dimensions (i.e., the number of filters in DNN l...

12/31/2022 · Pseudo-Inverted Bottleneck Convolution for DARTS Search Space
Differentiable Architecture Search (DARTS) has attracted considerable at...

03/21/2021 · MoViNets: Mobile Video Networks for Efficient Video Recognition
We present Mobile Video Networks (MoViNets), a family of computation and...

04/29/2019 · Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation
Recently, differentiable search methods have made major progress in redu...

07/05/2020 · Rethinking Bottleneck Structure for Efficient Mobile Network Design
The inverted residual block is dominating architecture design for mobile...

09/13/2021 · RADARS: Memory Efficient Reinforcement Learning Aided Differentiable Neural Architecture Search
Differentiable neural architecture search (DNAS) is known for its capaci...

04/04/2020 · Neural Architecture Search for Lightweight Non-Local Networks
Non-Local (NL) blocks have been widely studied in various vision tasks. ...
