Are Neural Architecture Search Benchmarks Well Designed? A Deeper Look Into Operation Importance

03/29/2023
by Vasco Lopes, et al.

Neural Architecture Search (NAS) benchmarks have significantly improved the ability to develop and compare NAS methods, while drastically reducing the computational overhead by providing meta-information about thousands of trained neural networks. However, tabular benchmarks have several drawbacks that can hinder fair comparisons and yield unreliable results. They usually offer only a small pool of operations in heavily constrained search spaces, typically cell-based neural networks with pre-defined outer skeletons. In this work, we conducted an empirical analysis of the widely used NAS-Bench-101, NAS-Bench-201, and TransNAS-Bench-101 benchmarks in terms of their generalizability and of how different operations influence the performance of the generated architectures. We found that only a subset of the operation pool is required to generate architectures close to the upper bound of the performance range. Moreover, the performance distribution is negatively skewed, with a higher density of architectures near the upper bound. We consistently found convolution layers to have the highest impact on an architecture's performance, and that specific combinations of operations favor top-scoring architectures. These findings provide insights into the correct evaluation and comparison of NAS methods using NAS benchmarks, showing that directly searching on NAS-Bench-201's ImageNet16-120 task and on TransNAS-Bench-101 produces more reliable results than searching only on CIFAR-10. Furthermore, we provide suggestions for future benchmark evaluation and design. The code used to conduct the evaluations is available at https://github.com/VascoLopes/NAS-Benchmark-Evaluation.
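As a concrete illustration of the kind of analysis the abstract describes, the sketch below computes the skewness of a benchmark's test-accuracy distribution and tallies how often each operation appears among the top-scoring architectures. This is a minimal sketch, not the paper's code: `load_accuracies` is a hypothetical placeholder for querying a tabular benchmark, and the cell-string parsing assumes NAS-Bench-201's `|op~k|` encoding.

```python
# Hypothetical sketch: skewness of the accuracy distribution and operation
# frequencies among top architectures, in the spirit of the paper's analysis.
import re
from collections import Counter
from scipy.stats import skew

def parse_ops(arch_str):
    # NAS-Bench-201 encodes a cell as '|op~0|+|op~0|op~1|+|op~0|op~1|op~2|';
    # strip the input-index suffix '~k' to keep only the operation names.
    return [tok.split('~')[0] for tok in re.findall(r'\|([^|+]+)', arch_str)]

def analyze(accuracies, top_fraction=0.05):
    """accuracies: dict mapping arch string -> test accuracy (percent)."""
    values = list(accuracies.values())
    # Negative skewness means more mass near the upper bound of the range.
    print(f'skewness: {skew(values):.3f}')
    k = max(1, int(len(values) * top_fraction))
    top = sorted(accuracies, key=accuracies.get, reverse=True)[:k]
    op_counts = Counter(op for arch in top for op in parse_ops(arch))
    total = sum(op_counts.values())
    for op, n in op_counts.most_common():
        print(f'{op}: {n / total:.2%} of operations in top {k} architectures')

# accuracies = load_accuracies()  # hypothetical: query a tabular benchmark
# analyze(accuracies)
```

With the real NAS-Bench-201 API (the nas_201_api package), `accuracies` could be populated along the lines of `api.get_more_info(i, 'cifar10')['test-accuracy']` for each architecture index `i`, though the exact file names and result keys depend on the benchmark release.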


Related research

12/11/2020 · Differential Evolution for Neural Architecture Search
Neural architecture search (NAS) methods rely on a search strategy for d...

03/16/2022 · On Redundancy and Diversity in Cell-based Neural Architecture Search
Searching for the architecture cells is a dominant paradigm in NAS. Howe...

01/31/2022 · NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy
The release of tabular benchmarks, such as NAS-Bench-101 and NAS-Bench-2...

08/09/2021 · BenchENAS: A Benchmarking Platform for Evolutionary Neural Architecture Search
Neural architecture search (NAS), which automatically designs the archit...

11/08/2021 · Approximate Neural Architecture Search via Operation Distribution Learning
The standard paradigm in Neural Architecture Search (NAS) is to search f...

10/23/2021 · Towards a Robust Differentiable Architecture Search under Label Noise
Neural Architecture Search (NAS) is the game changer in designing robust...

05/25/2021 · The Nonlinearity Coefficient - A Practical Guide to Neural Architecture Design
In essence, a neural network is an arbitrary differentiable, parametrize...
