NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search

10/12/2021
by   Renbo Tu, et al.

Most existing neural architecture search (NAS) benchmarks and algorithms prioritize performance on well-studied tasks, e.g., image classification on CIFAR and ImageNet. As a result, the applicability of NAS approaches to more diverse areas remains inadequately understood. In this paper, we present NAS-Bench-360, a benchmark suite for evaluating state-of-the-art NAS methods for convolutional neural networks (CNNs). To construct it, we curate a collection of ten tasks spanning a diverse array of application domains, dataset sizes, problem dimensionalities, and learning objectives. By carefully selecting tasks that both interoperate with modern CNN-based search methods and lie far afield from their original development domain, we can use NAS-Bench-360 to investigate the following central question: do existing state-of-the-art NAS methods perform well on diverse tasks? Our experiments show that a modern NAS procedure designed for image classification can indeed find good architectures for tasks with other dimensionalities and learning objectives; however, the same method struggles against more task-specific methods and performs catastrophically poorly on classification in non-vision domains. The case for NAS robustness becomes even more dire in a resource-constrained setting, where a recent NAS method provides little-to-no benefit over much simpler baselines. These results demonstrate the need for a benchmark such as NAS-Bench-360 to help develop NAS approaches that work well on a variety of tasks, a crucial component of a truly robust and automated pipeline. We conclude with a demonstration of the kind of future research our suite of tasks will enable. All data and code are made publicly available.
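The evaluation protocol described above can be summarized as a minimal sketch: run a NAS procedure independently on each benchmark task and report the per-task error of the resulting architecture. The names below (Task, search, train_and_eval) are illustrative placeholders and assumptions for this sketch, not the benchmark's actual API.

```python
# Hypothetical sketch of the cross-task evaluation loop NAS-Bench-360 is built for.
# All identifiers here are placeholders; the real benchmark code may differ.
from dataclasses import dataclass
from typing import Callable, Dict, List, Any


@dataclass
class Task:
    name: str        # e.g. an image, audio, or genomics task
    input_dim: int   # problem dimensionality (1D or 2D)
    metric: str      # task-specific error metric to report


def evaluate_nas_method(
    tasks: List[Task],
    search: Callable[[Task], Any],                 # NAS procedure: task -> architecture
    train_and_eval: Callable[[Any, Task], float],  # trains the found architecture, returns error
) -> Dict[str, float]:
    """Run the NAS method separately on each task and collect per-task errors."""
    results: Dict[str, float] = {}
    for task in tasks:
        architecture = search(task)
        results[task.name] = train_and_eval(architecture, task)
    return results
```

Comparing these per-task errors against task-specific and simple fixed-architecture baselines is what allows the robustness question posed above to be answered quantitatively.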
