TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search

08/12/2020
by   Yibo Hu, et al.

With the flourishing of differentiable neural architecture search (NAS), automatically searching for latency-constrained architectures offers a new way to reduce human labor and expertise. However, the searched architectures are usually suboptimal in accuracy and may exhibit large jitter around the target latency. In this paper, we rethink three freedoms of differentiable NAS, i.e., operation-level, depth-level and width-level, and propose a novel method, named Three-Freedom NAS (TF-NAS), to achieve both good classification accuracy and a precise latency constraint. At the operation level, we present a bi-sampling search algorithm to moderate operation collapse. At the depth level, we introduce a sink-connecting search space that ensures mutual exclusion between skip and the other candidate operations and eliminates architecture redundancy. At the width level, we propose an elasticity-scaling strategy that meets the latency constraint precisely in a progressively fine-grained manner. Experiments on ImageNet demonstrate the effectiveness of TF-NAS. In particular, our searched TF-NAS-A obtains 76.9% top-1 accuracy, achieving state-of-the-art results with less latency. The total search time is only 1.8 days on a single Titan RTX GPU. Code is available at https://github.com/AberHu/TF-NAS.
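To make the latency-constrained differentiable search concrete, below is a minimal PyTorch-style sketch of a single searchable layer: a Gumbel-softmax relaxation over candidate operations plus an expected-latency term taken from a lookup table. This is not the authors' implementation; the candidate operations, the values in LAT_LUT, and hyper-parameters such as tau, target_lat and lambda_lat are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical candidate operations and a per-op latency lookup table (ms).
# Real search spaces use inverted-residual blocks and measured latencies.
CANDIDATE_OPS = {
    "conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=1, bias=False),
    "conv5x5": lambda c: nn.Conv2d(c, c, 5, padding=2, bias=False),
    "skip":    lambda c: nn.Identity(),
}
LAT_LUT = {"conv3x3": 0.40, "conv5x5": 0.95, "skip": 0.01}


class MixedOp(nn.Module):
    """One searchable layer: a Gumbel-softmax weighted sum of candidate ops."""

    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([f(channels) for f in CANDIDATE_OPS.values()])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(CANDIDATE_OPS)))
        self.register_buffer("lat", torch.tensor([LAT_LUT[k] for k in CANDIDATE_OPS]))

    def forward(self, x, tau=1.0):
        # Differentiable, near one-hot relaxation as tau is annealed towards 0.
        w = F.gumbel_softmax(self.alpha, tau=tau, hard=False)
        out = sum(wi * op(x) for wi, op in zip(w, self.ops))
        expected_lat = (w * self.lat).sum()  # expected latency of this layer
        return out, expected_lat


def total_loss(logits, target, expected_lat, target_lat=15.0, lambda_lat=0.1):
    """Cross-entropy plus a penalty for deviating from the target latency (ms)."""
    ce = F.cross_entropy(logits, target)
    return ce + lambda_lat * torch.abs(expected_lat - target_lat)


# Toy usage: in a full supernet the per-layer expected latencies are summed
# before being compared against the target.
layer = MixedOp(channels=16)
x = torch.randn(2, 16, 32, 32)
out, lat = layer(x, tau=5.0)
```

The bi-sampling search, the sink-connecting search space and the elasticity-scaling step described in the abstract operate on top of this basic relaxation and are omitted from the sketch for brevity.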


Related research

07/07/2020 - GOLD-NAS: Gradual, One-Level, Differentiable
There has been a large literature of neural architecture search, but mos...

06/23/2019 - Densely Connected Search Space for More Flexible Neural Architecture Search
In recent years, neural architecture search (NAS) has dramatically advan...

01/27/2022 - DropNAS: Grouped Operation Dropout for Differentiable Architecture Search
Neural architecture search (NAS) has shown encouraging results in automa...

05/21/2020 - AOWS: Adaptive and optimal network width search with latency constraints
Neural architecture search (NAS) approaches aim at automatically finding...

10/25/2022 - NAS-PRNet: Neural Architecture Search generated Phase Retrieval Net for Off-axis Quantitative Phase Imaging
Single neural networks have achieved simultaneous phase retrieval with a...

03/30/2021 - Differentiable Network Adaption with Elastic Search Space
In this paper we propose a novel network adaption method called Differen...

07/01/2019 - Single-Path Mobile AutoML: Efficient ConvNet Design and NAS Hyperparameter Optimization
Can we reduce the search cost of Neural Architecture Search (NAS) from d...
