Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets

07/02/2021
by Hayeon Lee et al.

Despite the success of recent Neural Architecture Search (NAS) methods, which have been shown to produce networks that largely outperform human-designed ones on various tasks, conventional NAS methods mostly tackle architecture optimization for a single task (dataset), and the resulting architectures do not generalize well across tasks (datasets). Moreover, because such task-specific methods search for a neural architecture from scratch for every given task, they incur a large computational cost, which is problematic when the time and monetary budget are limited. In this paper, we propose an efficient NAS framework that is trained once on a database of datasets and pretrained networks and can then rapidly search for a neural architecture for a novel dataset. The proposed MetaD2A (Meta Dataset-to-Architecture) model stochastically generates graphs (architectures) from a given set (dataset) via a cross-modal latent space learned with amortized meta-learning. We further propose a meta-performance predictor that estimates and selects the best architecture without direct training on the target dataset. Experiments show that our model, meta-learned on subsets of ImageNet-1K and architectures from the NAS-Bench-201 search space, successfully generalizes to multiple unseen datasets, including CIFAR-10 and CIFAR-100, with an average search time of 33 GPU seconds. Even in the MobileNetV3 search space, MetaD2A is 5.5K times faster than NSGANetV2, a transferable NAS method, with comparable performance. We believe MetaD2A opens a new research direction for rapid NAS, as well as ways to utilize the knowledge accumulated in rich databases of datasets and architectures over the past years. Code is available at https://github.com/HayeonLee/MetaD2A.
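To make the set-to-graph generation concrete, here is a minimal PyTorch sketch in the spirit of MetaD2A: a permutation-invariant (deep-sets style) encoder pools instance features into a latent dataset code, and a decoder stochastically emits one operation per edge of a NAS-Bench-201-style cell (6 edges, 5 candidate operations). The layer sizes, the deep-sets encoder, and the class name SetToGraphGenerator are illustrative assumptions, not the authors' exact architecture.

```python
# Hedged sketch of dataset-to-architecture generation (not the official code).
import torch
import torch.nn as nn

class SetToGraphGenerator(nn.Module):
    def __init__(self, in_dim=512, hid=128, num_edges=6, num_ops=5):
        super().__init__()
        # Permutation-invariant set encoder: embed each instance, mean-pool.
        self.phi = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU(),
                                 nn.Linear(hid, hid))
        self.to_mu = nn.Linear(hid, hid)
        self.to_logvar = nn.Linear(hid, hid)
        # Decoder: latent dataset code -> operation logits for each edge
        # of a NAS-Bench-201-style cell (6 edges, 5 candidate ops).
        self.decoder = nn.Linear(hid, num_edges * num_ops)
        self.num_edges, self.num_ops = num_edges, num_ops

    def forward(self, x):                       # x: (n_instances, in_dim)
        h = self.phi(x).mean(dim=0)              # pooled dataset embedding
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        logits = self.decoder(z).view(self.num_edges, self.num_ops)
        # Stochastically pick one operation per edge -> one architecture.
        return torch.distributions.Categorical(logits=logits).sample()

gen = SetToGraphGenerator()
dataset_feats = torch.randn(100, 512)   # stand-in per-instance features
print(gen(dataset_feats))               # e.g. tensor([3, 0, 4, 1, 2, 2])
```

Because the latent code is sampled, repeated calls yield different candidate architectures for the same dataset, which is what makes downstream ranking useful.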

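The meta-performance predictor can be sketched in the same spirit: it scores a (dataset, architecture) pair so that candidates sampled from the generator can be ranked without training on the target dataset. The one-hot edge encoding and the MLP scorer below are assumptions for illustration, not the paper's exact design.

```python
# Hedged sketch of a meta-performance predictor (illustrative, unofficial).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaPredictor(nn.Module):
    def __init__(self, in_dim=512, hid=128, num_edges=6, num_ops=5):
        super().__init__()
        # Dataset encoder (same deep-sets pooling as the generator sketch).
        self.phi = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU(),
                                 nn.Linear(hid, hid))
        # Architecture encoder: one-hot per-edge operations -> embedding.
        self.arch_enc = nn.Linear(num_edges * num_ops, hid)
        self.score = nn.Sequential(nn.Linear(2 * hid, hid), nn.ReLU(),
                                   nn.Linear(hid, 1))
        self.num_ops = num_ops

    def forward(self, x, ops):              # x: (n, in_dim); ops: (num_edges,)
        d = self.phi(x).mean(dim=0)          # dataset embedding
        a = self.arch_enc(F.one_hot(ops, self.num_ops).float().flatten())
        return self.score(torch.cat([d, a]))  # higher = better predicted perf.

# Rank sampled candidates and keep the best-scoring one, with no training
# on the target dataset itself.
pred = MetaPredictor()
x = torch.randn(100, 512)
candidates = [torch.randint(0, 5, (6,)) for _ in range(10)]
best = max(candidates, key=lambda ops: pred(x, ops).item())
```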
Related research

10/12/2021 - Across-Task Neural Architecture Search via Meta Learning
Adequate labeled data and expensive compute resources are the prerequisi...

03/03/2020 - BATS: Binary ArchitecTure Search
This paper proposes Binary ArchitecTure Search (BATS), a framework that ...

03/02/2021 - Task-Adaptive Neural Network Retrieval with Meta-Contrastive Learning
Most conventional Neural Architecture Search (NAS) approaches are limite...

11/09/2019 - Learning to reinforcement learn for Neural Architecture Search
Reinforcement learning (RL) is a goal-oriented learning solution that ha...

03/31/2020 - MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
We propose to incorporate neural architecture search (NAS) into general-...

03/23/2021 - BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
A myriad of recent breakthroughs in hand-crafted neural architectures fo...

08/28/2020 - NATS-Bench: Benchmarking NAS algorithms for Architecture Topology and Size
Neural architecture search (NAS) has attracted a lot of attention and ha...