NAS-Bench-102: Extending the Scope of Reproducible Neural Architecture Search

01/02/2020
by Xuanyi Dong, et al.

Neural architecture search (NAS) has achieved breakthrough success in a great number of applications in the past few years. It may be time to take a step back and analyze the good and bad aspects of the field. A variety of algorithms search for architectures under different search spaces, and the searched architectures are trained with different setups, e.g., hyper-parameters, data augmentation, and regularization. This raises a comparability problem when evaluating the performance of various NAS algorithms. NAS-Bench-101 has shown success in alleviating this problem. In this work, we propose NAS-Bench-102, an extension of NAS-Bench-101 with a different search space, results on multiple datasets, and more diagnostic information. NAS-Bench-102 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithm. The design of our search space is inspired by the one used in the most popular cell-based search algorithms, where a cell is represented as a directed acyclic graph (DAG) and each edge is associated with an operation selected from a predefined operation set. To make it applicable to all NAS algorithms, the search space defined in NAS-Bench-102 includes all possible architectures generated by 4 nodes and 5 associated operation options, which results in 15,625 candidates in total. The training log and performance of each architecture candidate are provided for three datasets. This allows researchers to avoid unnecessary repetitive training of selected candidates and to focus solely on the search algorithm itself; the training time saved for every candidate also greatly improves the efficiency of many methods. We provide additional diagnostic information, such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms. To support further research, we analyze the benchmark from many aspects and evaluate 10 recent NAS algorithms on it.
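The size of the search space follows directly from its structure: a cell is a DAG over 4 ordered nodes, every edge (i, j) with i < j carries exactly one operation, and 4 nodes yield 6 such edges, so there are 5^6 = 15,625 candidate cells. The minimal Python sketch below (not the paper's released API; the operation identifiers are illustrative stand-ins for the paper's five options: zeroize, skip connection, 1x1 convolution, 3x3 convolution, and 3x3 average pooling) enumerates the space and confirms the count.

```python
# A minimal sketch of the NAS-Bench-102 search-space structure, not the
# official API. Operation names are illustrative placeholders.
from itertools import combinations, product

NUM_NODES = 4
OPERATIONS = ("none", "skip_connect", "nor_conv_1x1",
              "nor_conv_3x3", "avg_pool_3x3")

# All edges (i -> j, i < j) of the complete DAG over 4 ordered nodes: 6 edges.
EDGES = list(combinations(range(NUM_NODES), 2))
assert len(EDGES) == 6

def enumerate_cells():
    """Yield every candidate cell as a mapping from edge (i, j) to an operation."""
    for ops in product(OPERATIONS, repeat=len(EDGES)):
        yield dict(zip(EDGES, ops))

total = sum(1 for _ in enumerate_cells())
print(total)  # 15625 == 5 ** 6
```

Because the space is small enough to enumerate exhaustively, every candidate's training log and final performance can be precomputed and stored, which is what lets a NAS algorithm query results by architecture instead of training from scratch.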

Related research

NATS-Bench: Benchmarking NAS algorithms for Architecture Topology and Size (08/28/2020)
Neural architecture search (NAS) has attracted a lot of attention and ha...

Neural Architecture Search Over a Graph Search Space (12/27/2018)
Neural architecture search (NAS) enabled the discovery of state-of-the-a...

Efficient Model Performance Estimation via Feature Histories (03/07/2021)
An important step in the task of neural network design, such as hyper-pa...

Neural Recurrent Structure Search for Knowledge Graph Embedding (11/17/2019)
Knowledge graph (KG) embedding is a fundamental problem in mining relati...

Continuous Ant-Based Neural Topology Search (11/21/2020)
This work introduces a novel, nature-inspired neural architecture search...

NAAP-440 Dataset and Baseline for Neural Architecture Accuracy Prediction (09/14/2022)
Neural architecture search (NAS) has become a common approach to develop...

Prune and Replace NAS (06/18/2019)
While recent NAS algorithms are thousands of times faster than the pione...
