Random Search and Reproducibility for Neural Architecture Search

02/20/2019
by Liam Li, et al.

Neural architecture search (NAS) is a promising research direction that has the potential to replace expert-designed networks with learned, task-specific architectures. In this work, in order to help ground the empirical results in this field, we propose new NAS baselines that build on the following observations: (i) NAS is a specialized hyperparameter optimization problem; and (ii) random search is a competitive baseline for hyperparameter optimization. Leveraging these observations, we evaluate both random search with early-stopping and a novel random search with weight-sharing algorithm on two standard NAS benchmarks: PTB and CIFAR-10. Our results show that random search with early-stopping is a competitive NAS baseline, e.g., it performs at least as well as ENAS, a leading NAS method, on both benchmarks. Additionally, random search with weight-sharing outperforms random search with early-stopping, achieving a state-of-the-art NAS result on PTB and a highly competitive result on CIFAR-10. Finally, we explore the existing reproducibility issues of published NAS results. We note the lack of source material needed to exactly reproduce these results, and further discuss the robustness of published results given the various sources of variability in NAS experimental setups. Relatedly, we provide all information (code, random seeds, documentation) needed to exactly reproduce our results, and report our random search with weight-sharing results for each benchmark on two independent experimental runs.
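Both baselines are simple enough to sketch in a few lines. The Python sketches below assume hypothetical benchmark hooks sample_architecture(), train_for(), train_step(), and validate(); these names are illustrative stand-ins rather than the authors' released code, and the early-stopping rule shown is a generic successive-halving variant, not necessarily the paper's exact procedure.

def random_search_early_stopping(sample_architecture, train_for, validate,
                                 n_arch=64, eta=4, min_epochs=1, max_epochs=64):
    """Random search with a successive-halving-style early-stopping rule (sketch)."""
    survivors = [sample_architecture() for _ in range(n_arch)]
    epochs = min_epochs
    while len(survivors) > 1 and epochs <= max_epochs:
        # Train every surviving architecture for the current budget and score it.
        scored = [(validate(train_for(arch, epochs)), arch) for arch in survivors]
        scored.sort(key=lambda pair: pair[0])  # lower validation error is better
        # Keep the top 1/eta fraction; give them eta times the budget next round.
        survivors = [arch for _, arch in scored[:max(1, len(scored) // eta)]]
        epochs *= eta
    return survivors[0]

def random_search_weight_sharing(shared_model, sample_architecture,
                                 train_step, validate,
                                 n_steps=10_000, n_eval=1_000):
    """Random search with weight-sharing (sketch): one set of shared weights
    serves every architecture in the search space."""
    # Phase 1: each training step updates the shared weights through a freshly
    # sampled sub-network, so all architectures receive training signal.
    for _ in range(n_steps):
        train_step(shared_model, sample_architecture())
    # Phase 2: rank random architectures by validation error under the shared
    # weights; the winner would then be retrained from scratch for final results.
    candidates = [sample_architecture() for _ in range(n_eval)]
    return min(candidates, key=lambda arch: validate(shared_model, arch))

In the weight-sharing variant, evaluating each candidate costs only a validation pass under the shared weights rather than a full training run, which is what makes random search competitive at a fraction of the compute.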


Related research

02/21/2019 · Evaluating the Search Phase of Neural Architecture Search
Neural Architecture Search (NAS) aims to facilitate the design of deep n...

07/28/2023 · Shrink-Perturb Improves Architecture Mixing during Population Based Training for Neural Architecture Search
In this work, we show that simultaneously training and mixing neural net...

04/20/2020 · Local Search is a Remarkably Strong Baseline for Neural Architecture Search
Neural Architecture Search (NAS), i.e., the automation of neural network...

01/28/2020 · NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search
One-shot neural architecture search (NAS) has played a crucial role in m...

07/11/2022 · Long-term Reproducibility for Neural Architecture Search
It is a sad reflection of modern academia that code is often ignored aft...

11/05/2021 · NAS-Bench-x11 and the Power of Learning Curves
While early research in neural architecture search (NAS) required extrem...

03/09/2020 · How to Train Your Super-Net: An Analysis of Training Heuristics in Weight-Sharing NAS
Weight sharing promises to make neural architecture search (NAS) tractab...
