NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search

by Renbo Tu, et al.

Most existing neural architecture search (NAS) benchmarks and algorithms prioritize performance on well-studied tasks, e.g., image classification on CIFAR and ImageNet. As a result, the applicability of NAS approaches to more diverse tasks remains poorly understood. In this paper, we present NAS-Bench-360, a benchmark suite for evaluating state-of-the-art NAS methods for convolutional neural networks (CNNs). To construct it, we curate a collection of ten tasks spanning a diverse array of application domains, dataset sizes, problem dimensionalities, and learning objectives. By carefully selecting tasks that both interoperate with modern CNN-based search methods and are far afield from their original development domain, we can use NAS-Bench-360 to investigate the following central question: do existing state-of-the-art NAS methods perform well on diverse tasks? Our experiments show that a modern NAS procedure designed for image classification can indeed find good architectures for tasks with other dimensionalities and learning objectives; however, the same method struggles against more task-specific methods and performs catastrophically poorly on classification in non-vision domains. The case for NAS robustness becomes even more dire in a resource-constrained setting, where a recent NAS method provides little-to-no benefit over much simpler baselines. These results demonstrate the need for a benchmark such as NAS-Bench-360 to help develop NAS approaches that work well on a variety of tasks, a crucial component of a truly robust and automated pipeline. We conclude with a demonstration of the kind of future research our suite of tasks will enable. All data and code are made publicly available.
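The evaluation protocol the abstract describes, running each NAS method on every task in the suite and recording its held-out test metric, can be sketched as follows. This is a minimal illustrative sketch, not the NAS-Bench-360 API: the task names, the `search` callable, and the dictionary structure are all hypothetical placeholders.

```python
def evaluate_suite(tasks, search):
    """Run a NAS method on each task and collect its test-set score.

    `tasks` maps task names to task data; `search` is a hypothetical
    stand-in for a NAS procedure that returns a trained architecture.
    """
    results = {}
    for name, dataset in tasks.items():
        architecture = search(dataset)           # NAS method selects an architecture
        results[name] = architecture["score"]    # score on the held-out test set
    return results


# Toy stand-ins for two of the suite's ten tasks (values are illustrative).
def dummy_search(dataset):
    # A placeholder "search" that simply reports the dataset's reference score.
    return {"score": dataset["reference_score"]}


tasks = {
    "cifar100": {"reference_score": 0.80},
    "spherical": {"reference_score": 0.55},
}

scores = evaluate_suite(tasks, dummy_search)
```

Comparing the resulting per-task scores against task-specific baselines is what lets the benchmark expose the robustness gaps the abstract reports.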



