Densely Connected Search Space for More Flexible Neural Architecture Search

06/23/2019
by   Jiemin Fang, et al.

In recent years, neural architecture search (NAS) has dramatically advanced the development of neural network design. While most previous works are computationally intensive, differentiable NAS methods reduce the search cost by constructing a super network in a continuous space that covers all candidate architectures. However, few of them can search for the network width (the number of filters/channels), because it is intractable to integrate architectures with different widths into one super network under the conventional differentiable NAS paradigm. In this paper, we propose a novel differentiable NAS method which can search for the width and the spatial resolution of each block simultaneously. We achieve this by constructing a densely connected search space and name our method DenseNAS. Blocks with different width and spatial resolution combinations are densely connected to each other. The best path in the super network is selected by optimizing the transition probabilities between blocks. As a result, the overall depth distribution of the network is optimized globally in a graceful manner. In the experiments, DenseNAS obtains an architecture with 75.9% top-1 accuracy on ImageNet and a latency as low as 24.3 ms on a single TITAN-XP. The total search time is merely 23 hours on 4 GPUs.
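The abstract describes selecting the best path through a DAG of densely connected candidate blocks by optimizing transition probabilities between them. The sketch below illustrates the final selection step only, not the paper's training procedure: given hypothetical learned transition logits (which DenseNAS would optimize jointly with the super-network weights), it normalizes each block's outgoing logits with a softmax and runs Viterbi-style dynamic programming to find the highest-probability path. All names and numbers here are illustrative assumptions, not the authors' code.

```python
import math

def best_path(num_blocks, transition_logits, start=0, end=None):
    """Viterbi-style selection of the highest-probability block path.

    transition_logits: dict mapping (i, j) -> logit for an edge from
    block i to a later block j (i < j). Outgoing logits of each block
    are softmax-normalized into log-probabilities, then dynamic
    programming finds the path from `start` to `end` that maximizes
    the sum of log transition probabilities.
    """
    if end is None:
        end = num_blocks - 1
    # Normalize each block's outgoing logits into log-probabilities.
    log_prob = {}
    for i in range(num_blocks):
        succ = [j for j in range(i + 1, num_blocks) if (i, j) in transition_logits]
        if not succ:
            continue
        z = math.log(sum(math.exp(transition_logits[(i, j)]) for j in succ))
        for j in succ:
            log_prob[(i, j)] = transition_logits[(i, j)] - z
    # Dynamic programming over the DAG of blocks (topological order).
    best = {start: (0.0, [start])}
    for i in range(start, end):
        if i not in best:
            continue
        score_i, path_i = best[i]
        for j in range(i + 1, num_blocks):
            if (i, j) not in log_prob:
                continue
            cand = score_i + log_prob[(i, j)]
            if j not in best or cand > best[j][0]:
                best[j] = (cand, path_i + [j])
    return best[end][1]

# Toy example: 4 candidate blocks; block 0 may also skip directly to block 2.
logits = {(0, 1): 2.0, (0, 2): 0.5, (1, 2): 1.0, (1, 3): 0.2, (2, 3): 1.5}
print(best_path(4, logits))  # → [0, 1, 2, 3]
```

Because skip edges compete in the same softmax as direct edges, pruning a block (and thereby shortening the network) is just another path choice, which is how a dense search space lets the depth distribution be optimized globally.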


Related research

- 08/12/2020: TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search
  "With the flourish of differentiable neural architecture search (NAS), au..."

- 03/31/2021: NetAdaptV2: Efficient Neural Architecture Search with Fast Super-Network Training and Architecture Optimization
  "Neural architecture search (NAS) typically consists of three main steps:..."

- 03/30/2021: Differentiable Network Adaption with Elastic Search Space
  "In this paper we propose a novel network adaption method called Differen..."

- 01/15/2022: UDC: Unified DNAS for Compressible TinyML Models
  "Emerging Internet-of-things (IoT) applications are driving deployment of..."

- 04/06/2021: Searching Efficient Model-guided Deep Network for Image Denoising
  "Neural architecture search (NAS) has recently reshaped our understanding..."

- 06/04/2021: Event Classification with Multi-step Machine Learning
  "The usefulness and value of Multi-step Machine Learning (ML), where a ta..."

- 01/17/2021: Trilevel Neural Architecture Search for Efficient Single Image Super-Resolution
  "This paper proposes a trilevel neural architecture search (NAS) method f..."
