Searching by Generating: Flexible and Efficient One-Shot NAS with Architecture Generator

03/12/2021
by Sian-Yao Huang, et al.

In one-shot NAS, sub-networks must be searched from the supernet to meet different hardware constraints. However, the search cost is high, and N separate searches are needed for N different constraints. In this work, we propose a novel search strategy, called the architecture generator, that searches for sub-networks by generating them, making the search process much more efficient and flexible. With the trained architecture generator, given target hardware constraints as the input, N good architectures can be generated for N constraints in just one forward pass, without re-searching or supernet retraining. Moreover, we propose a novel single-path supernet, called the unified supernet, to further improve search efficiency and reduce the GPU memory consumption of the architecture generator. With the architecture generator and the unified supernet, we propose a flexible and efficient one-shot NAS framework, called Searching by Generating NAS (SGNAS). With the pre-trained supernet, the search time of SGNAS for N different hardware constraints is only 5 GPU hours, which is 4N times faster than previous SOTA single-path methods. After training from scratch, the top-1 accuracy of SGNAS on ImageNet is 77.1%, which is comparable with the SOTAs. The code is available at https://github.com/eric8607242/SGNAS.
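
The central mechanism, a generator that maps a hardware constraint to an architecture in a single forward pass, lends itself to a short sketch. The PyTorch code below is a hypothetical illustration, not the authors' implementation: the class name ConstraintToArch, the layer sizes, and the constraint encoding are all assumptions. It shows a small network mapping a normalized constraint (e.g., a FLOPs budget) to a near-one-hot operation choice for each layer of a single-path supernet, kept differentiable via Gumbel-softmax so it can be trained end-to-end.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConstraintToArch(nn.Module):  # hypothetical name, not from the paper
    """Map a scalar hardware constraint to per-layer operation choices."""

    def __init__(self, num_layers: int = 20, num_ops: int = 6, hidden: int = 64):
        super().__init__()
        self.num_layers, self.num_ops = num_layers, num_ops
        self.body = nn.Sequential(
            nn.Linear(1, hidden),  # constraint embedding (assumed encoding)
            nn.ReLU(inplace=True),
            nn.Linear(hidden, num_layers * num_ops),
        )

    def forward(self, constraint: torch.Tensor) -> torch.Tensor:
        # constraint: shape (batch, 1), e.g., target FLOPs normalized to [0, 1]
        logits = self.body(constraint).view(-1, self.num_layers, self.num_ops)
        # Gumbel-softmax with hard=True yields near-one-hot operation choices
        # while staying differentiable, so the generator can be trained jointly
        # with a loss combining supernet accuracy and constraint violation.
        return F.gumbel_softmax(logits, tau=1.0, hard=True, dim=-1)

# N constraints -> N architectures in one batched forward pass.
gen = ConstraintToArch()
budgets = torch.tensor([[0.3], [0.5], [0.8]])  # three hypothetical FLOPs budgets
archs = gen(budgets)                           # shape (3, 20, 6), one-hot per layer
```

Under these assumptions, re-targeting a new deployment budget costs one forward pass of the generator rather than a fresh search, which is why the reported search time stays fixed regardless of how many constraints N are served.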


