AutoShrink: A Topology-aware NAS for Discovering Efficient Neural Architecture

11/21/2019
by Tunhou Zhang et al.

Resource constraints are a major concern when deploying Deep Neural Networks (DNNs) on mobile and edge devices. Existing works commonly adopt a cell-based search approach, which limits the flexibility of network patterns in the learned cell structures. Moreover, due to the topology-agnostic nature of existing works, both cell-based and node-based, the search process is time-consuming and the discovered architectures may be sub-optimal. To address these problems, we propose AutoShrink, a topology-aware Neural Architecture Search (NAS) method for discovering efficient building blocks of neural architectures. Our method is node-based and can therefore learn flexible network patterns in cell structures within a topological search space. Directed Acyclic Graphs (DAGs) are used to abstract DNN architectures, and the cell structure is progressively optimized through edge shrinking. Because the search space intrinsically shrinks as edges are progressively removed, AutoShrink explores a more flexible search space in even less search time. We evaluate AutoShrink on image classification and language tasks by crafting the ShrinkCNN and ShrinkRNN models. ShrinkCNN achieves up to 48% parameter reduction and 34% Multiply-Accumulates (MACs) reduction on ImageNet-1K with accuracy comparable to state-of-the-art (SOTA) models. Both ShrinkCNN and ShrinkRNN are crafted within 1.5 GPU hours, which is 7.2x and 6.7x faster than the crafting time of SOTA CNN and RNN models, respectively.
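To make the idea concrete, here is a minimal sketch (not the authors' implementation) of topology search by progressive edge shrinking on a DAG, as the abstract describes: a cell starts as a dense DAG over ordered nodes, and the least important edge is repeatedly removed while the input stays connected to the output. The edge-scoring function here is a hypothetical stand-in for a real evaluation signal such as validation performance.

```python
# Illustrative sketch of DAG edge shrinking; the scoring rule is an
# assumption for demonstration, not AutoShrink's actual criterion.
from itertools import combinations

def full_dag(n):
    """All forward edges over n topologically ordered nodes (0 = input, n-1 = output)."""
    return {(u, v) for u, v in combinations(range(n), 2)}

def has_path(edges, n):
    """Check that the input node can still reach the output node."""
    reach = {0}
    for u, v in sorted(edges):  # forward edges visited in topological order
        if u in reach:
            reach.add(v)
    return (n - 1) in reach

def shrink(edges, n, score, keep):
    """Greedily drop the least important edge until `keep` edges remain,
    never disconnecting input from output. As edges are removed, the
    remaining search space shrinks, which is what speeds up the search."""
    edges = set(edges)
    while len(edges) > keep:
        for e in sorted(edges, key=lambda e: (score(e), e)):  # least important first
            if has_path(edges - {e}, n):  # preserve input-output connectivity
                edges.remove(e)
                break
        else:
            break  # no edge can be removed without disconnecting the cell
    return edges

# Usage: shrink a 4-node cell from 6 candidate edges down to 3, scoring
# edges (arbitrarily, for this sketch) by how far forward they skip.
cell = shrink(full_dag(4), 4, score=lambda e: e[1] - e[0], keep=3)
```

In the paper the retained topology then serves as the building block (cell) that is stacked into the full ShrinkCNN or ShrinkRNN model; the sketch only shows the shrinking loop itself.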


Related research

02/23/2022 - Mixed-Block Neural Architecture Search for Medical Image Segmentation
Deep Neural Networks (DNNs) have the potential for making various clinic...

12/24/2020 - Memory-Efficient Hierarchical Neural Architecture Search for Image Restoration
Recently, much attention has been spent on neural architecture search (N...

06/07/2019 - AutoGrow: Automatic Layer Growing in Deep Convolutional Networks
We propose AutoGrow to automate depth discovery in Deep Neural Networks ...

06/19/2019 - SwiftNet: Using Graph Propagation as Meta-knowledge to Search Highly Representative Neural Architectures
Designing neural architectures for edge devices is subject to constraint...

10/02/2020 - DOTS: Decoupling Operation and Topology in Differentiable Architecture Search
Differentiable Architecture Search (DARTS) has attracted extensive atten...

11/30/2020 - ScaleNAS: One-Shot Learning of Scale-Aware Representations for Visual Recognition
Scale variance among different sizes of body parts and objects is a chal...

06/03/2019 - Discovering Neural Wirings
The success of neural networks has driven a shift in focus from feature ...
