EAT-NAS: Elastic Architecture Transfer for Accelerating Large-scale Neural Architecture Search

01/17/2019
by   Jiemin Fang, et al.

Neural architecture search (NAS) methods have been proposed to release human experts from tedious architecture engineering. However, most current methods are constrained to small-scale search owing to the high cost of computational resources. Meanwhile, directly applying architectures searched on small datasets to large-scale tasks often bears no performance guarantee. This limitation impedes the wide use of NAS on large-scale tasks. To overcome this obstacle, we propose an elastic architecture transfer mechanism for accelerating large-scale neural architecture search (EAT-NAS). In our implementation, architectures are first searched on a small dataset, e.g., CIFAR-10, with the width and depth of the architectures taken into consideration as well, and the best one is chosen as the basic architecture. The whole architecture is then transferred with elasticity: with the help of the basic architecture, we accelerate the search process on a large-scale dataset, e.g., the whole ImageNet dataset. What we propose is not only a NAS method but also a mechanism for architecture-level transfer. In our experiments, we obtain two final models, EATNet-A and EATNet-B, which achieve competitive accuracies, 73.8% and 73.7% on ImageNet, and also surpass models searched from scratch on ImageNet under the same settings. In terms of computational cost, EAT-NAS takes less than 5 days on 8 TITAN X GPUs, significantly less than the consumption of state-of-the-art large-scale NAS methods.
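The two-stage mechanism described above lends itself to a short sketch. The following is a minimal, hypothetical Python illustration, assuming an evolutionary search and a toy block encoding with elastic width/depth choices; the names (random_arch, mutate, evolve) and the stand-in fitness functions are illustrative assumptions, not the authors' implementation.

```python
import copy
import random

# Illustrative search space: each block picks an op plus elastic
# width/depth settings (assumed encoding, not the paper's).
OPS = ["conv3x3", "conv5x5", "mbconv3", "mbconv5"]
WIDTHS = [16, 32, 64]
DEPTHS = [1, 2, 3]

def random_arch(num_blocks=5):
    # An architecture is a list of block descriptors.
    return [{"op": random.choice(OPS),
             "width": random.choice(WIDTHS),
             "depth": random.choice(DEPTHS)} for _ in range(num_blocks)]

def mutate(arch):
    # Perturb one block's op, width, or depth: the "elastic" moves that
    # let a transferred architecture adapt to the larger dataset.
    child = copy.deepcopy(arch)
    block = random.choice(child)
    key = random.choice(["op", "width", "depth"])
    block[key] = random.choice({"op": OPS, "width": WIDTHS, "depth": DEPTHS}[key])
    return child

def evolve(population, fitness, generations=10):
    # Plain evolutionary loop: keep the fitter half, refill by mutation.
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: len(ranked) // 2]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(len(ranked) - len(parents))]
    return max(population, key=fitness)

# Toy stand-ins for training-and-evaluating on each dataset; a real
# system would return validation accuracy (possibly penalized by size).
def fitness_small(arch):   # e.g., a CIFAR-10 proxy score
    return -abs(sum(b["width"] * b["depth"] for b in arch) - 200)

def fitness_large(arch):   # e.g., an ImageNet proxy score
    return -abs(sum(b["width"] * b["depth"] for b in arch) - 400)

# Stage 1: search from scratch on the small dataset; the winner becomes
# the "basic architecture".
basic = evolve([random_arch() for _ in range(20)], fitness_small)

# Stage 2: instead of restarting from random models, seed the
# large-scale search with elastic perturbations of the basic
# architecture, then continue the search on the large dataset.
seeded = [basic] + [mutate(basic) for _ in range(19)]
best = evolve(seeded, fitness_large)
print(best)
```

The point of the seeding step is that the large-scale search starts from a population already near a good region of the search space rather than from random models, which is where the reported acceleration comes from.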

Related research

06/23/2020 · NASTransfer: Analyzing Architecture Transferability in Large Scale Neural Architecture Search
Neural Architecture Search (NAS) is an open and challenging problem in m...

02/25/2019 · NAS-Bench-101: Towards Reproducible Neural Architecture Search
Recent advances in neural architecture search (NAS) demand tremendous co...

04/23/2020 · Depth-Wise Neural Architecture Search
Modern convolutional networks such as ResNet and NASNet have achieved st...

06/17/2022 · FreeREA: Training-Free Evolution-based Architecture Search
In the last decade, most research in Machine Learning contributed to the...

10/26/2022 · PredNAS: A Universal and Sample Efficient Neural Architecture Search Framework
In this paper, we present a general and effective framework for Neural A...

12/17/2019 · Generative Teaching Networks: Accelerating Neural Architecture Search by Learning to Generate Synthetic Training Data
This paper investigates the intriguing question of whether we can create...

07/14/2022 · PASHA: Efficient HPO with Progressive Resource Allocation
Hyperparameter optimization (HPO) and neural architecture search (NAS) a...
