Searching for A Robust Neural Architecture in Four GPU Hours

10/10/2019
by Xuanyi Dong, et al.

Conventional neural architecture search (NAS) approaches are based on reinforcement learning or evolutionary strategies, which take more than 3000 GPU hours to find a good model on CIFAR-10. We propose an efficient NAS approach that learns to search by gradient descent. Our approach represents the search space as a directed acyclic graph (DAG). This DAG contains billions of sub-graphs, each of which indicates a kind of neural architecture. To avoid traversing all possible sub-graphs, we develop a differentiable sampler over the DAG. This sampler is learnable and is optimized with the validation loss after training the sampled architecture. In this way, our approach, named Gradient-based search using Differentiable Architecture Sampler (GDAS), can be trained end-to-end by gradient descent. In experiments, one search procedure finishes in four GPU hours on CIFAR-10, and the discovered model obtains a test error of 2.82% with only 2.5M parameters, which is on par with the state of the art. Code is publicly available on GitHub: https://github.com/D-X-Y/NAS-Projects.
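The key idea in the abstract, sampling a single sub-graph per step while keeping the sampler differentiable, can be illustrated with a Gumbel-Softmax trick. The sketch below is not the authors' implementation (see the linked repository for that); it is a minimal PyTorch illustration of one DAG edge, where the module name MixedEdge and the fixed temperature tau are hypothetical choices for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedEdge(nn.Module):
    """One DAG edge: samples a single candidate op per forward pass."""
    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        # Learnable architecture logits, one per candidate operation.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(ops)))

    def forward(self, x, tau=10.0):
        # Hard one-hot sample; gradients reach alpha via the
        # straight-through estimator inside gumbel_softmax.
        h = F.gumbel_softmax(self.alpha, tau=tau, hard=True)
        idx = int(h.argmax())
        # Only the sampled op is executed, so each step costs one
        # sub-graph, yet multiplying by h[idx] keeps the sampler
        # in the autograd graph.
        return h[idx] * self.ops[idx](x)

# Hypothetical usage: candidate ops on one edge of the cell.
ops = [nn.Conv2d(16, 16, 3, padding=1),
       nn.MaxPool2d(3, stride=1, padding=1),
       nn.Identity()]
edge = MixedEdge(ops)
x = torch.randn(2, 16, 8, 8)
out = edge(x)          # one sampled architecture is evaluated
out.mean().backward()  # a surrogate loss; gradients flow to edge.alpha
```

In the same spirit as the paper, the architecture logits would be updated with the validation loss of the sampled architecture, alternating with ordinary weight updates on the training set.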

Related Research

SNAS: Stochastic Neural Architecture Search (12/24/2018)
We propose Stochastic Neural Architecture Search (SNAS), an economical e...

Multinomial Distribution Learning for Effective Neural Architecture Search (05/18/2019)
Architectures obtained by Neural Architecture Search (NAS) have achieved...

Multi-objective Neural Architecture Search with Almost No Training (11/27/2020)
In the recent past, neural architecture search (NAS) has attracted incre...

NASiam: Efficient Representation Learning using Neural Architecture Search for Siamese Networks (01/31/2023)
Siamese networks are one of the most trending methods to achieve self-su...

Searching by Generating: Flexible and Efficient One-Shot NAS with Architecture Generator (03/12/2021)
In one-shot NAS, sub-networks need to be searched from the supernet to m...

Traditional and accelerated gradient descent for neural architecture search (06/26/2020)
In this paper, we introduce two algorithms for neural architecture searc...

Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours (04/05/2019)
Can we automatically design a Convolutional Network (ConvNet) with the h...
