Local Search is a Remarkably Strong Baseline for Neural Architecture Search

04/20/2020
by T. Den Ottelander, et al.

Neural Architecture Search (NAS), i.e., the automation of neural network design, has gained much popularity in recent years, with increasingly complex search algorithms being proposed. Yet solid comparisons with simple baselines are often missing. At the same time, recent retrospective studies have found many new algorithms to be no better than random search (RS). In this work we consider, for the first time, a simple Local Search (LS) algorithm for NAS. We particularly consider a multi-objective NAS formulation, with network accuracy and network complexity as the two objectives, as understanding the trade-off between these objectives is arguably the most interesting aspect of NAS. The proposed LS algorithm is compared with RS and two evolutionary algorithms (EAs), as EAs are often heralded as ideal for multi-objective optimization. To promote reproducibility, we create and release two benchmark datasets containing 200K saved network evaluations for two established image classification tasks, CIFAR-10 and CIFAR-100. Our benchmarks are designed to be complementary to existing benchmarks, especially in that they are better suited for multi-objective search. We additionally consider a version of the problem with a much larger architecture space. While the considered algorithms explore the search space in fundamentally different ways, we find that LS substantially outperforms RS and performs nearly as well as state-of-the-art EAs. We believe this provides strong evidence that LS is truly a competitive baseline for NAS, against which new NAS algorithms should be benchmarked.
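Since the abstract describes the method only at a high level, the following is a minimal sketch of what a multi-objective local search over a discrete architecture space might look like. Everything here is an assumption for illustration, not the paper's exact algorithm: the operation set `OPS`, the encoding length, and the `evaluate` stand-in (a real run would query a NAS benchmark for accuracy and complexity). The dominance convention follows the two stated objectives: maximize accuracy, minimize complexity.

```python
import random

# Hypothetical cell-based encoding: a fixed-length string over a small
# operation alphabet, as in common NAS benchmarks.
OPS = ["conv3x3", "conv1x1", "maxpool", "identity"]
LENGTH = 6  # hypothetical number of decision variables


def evaluate(arch):
    """Placeholder: a real run would look up (accuracy, complexity)
    for this architecture in a saved NAS benchmark."""
    rng = random.Random(hash(tuple(arch)))  # deterministic per architecture
    return rng.random(), rng.random()


def dominates(a, b):
    """True if objectives a Pareto-dominate b
    (higher accuracy is better, lower complexity is better)."""
    acc_a, cplx_a = a
    acc_b, cplx_b = b
    return (acc_a >= acc_b and cplx_a <= cplx_b) and (acc_a > acc_b or cplx_a < cplx_b)


def local_search(budget=1000):
    """First-improvement local search with random restarts,
    maintaining an archive of non-dominated solutions."""
    archive = []  # non-dominated (arch, objectives) pairs found so far
    evals = 0
    while evals < budget:
        # Restart from a random architecture whenever a local optimum is reached.
        current = [random.choice(OPS) for _ in range(LENGTH)]
        f_cur = evaluate(current); evals += 1
        improved = True
        while improved and evals < budget:
            improved = False
            # Scan the single-variable neighborhood in random order.
            positions = list(range(LENGTH))
            random.shuffle(positions)
            for i in positions:
                for op in OPS:
                    if op == current[i]:
                        continue
                    neighbor = current[:i] + [op] + current[i + 1:]
                    f_nb = evaluate(neighbor); evals += 1
                    if dominates(f_nb, f_cur):
                        current, f_cur = neighbor, f_nb
                        improved = True
                        break
                    if evals >= budget:
                        break
                if improved or evals >= budget:
                    break
        # Add the local optimum to the archive if nothing there dominates it.
        if not any(dominates(f, f_cur) or f == f_cur for _, f in archive):
            archive = [(a, f) for a, f in archive if not dominates(f_cur, f)]
            archive.append((current, f_cur))
    return archive


if __name__ == "__main__":
    front = local_search(budget=500)
    for arch, (acc, cplx) in sorted(front, key=lambda x: x[1][1]):
        print(arch, f"acc={acc:.3f} complexity={cplx:.3f}")
```

Under this formulation, the restart loop keeps collecting Pareto-optimal local optima until the evaluation budget is spent, which is one natural way to adapt single-objective local search to the accuracy/complexity trade-off the paper studies.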


research
05/06/2020

Local Search is State of the Art for NAS Benchmarks

Local search is one of the simplest families of algorithms in combinator...
research
07/18/2023

A Survey on Multi-Objective Neural Architecture Search

Recently, expert-crafted neural architectures are increasingly overtaken...
research
08/22/2020

NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search

Neural Architecture Search (NAS) is a logical next step in the automatic...
research
02/20/2019

Random Search and Reproducibility for Neural Architecture Search

Neural architecture search (NAS) is a promising research direction that ...
research
05/08/2023

MO-DEHB: Evolutionary-based Hyperband for Multi-Objective Optimization

Hyperparameter optimization (HPO) is a powerful technique for automating...
research
04/01/2022

Novelty Driven Evolutionary Neural Architecture Search

Evolutionary algorithms (EA) based neural architecture search (NAS) invo...
research
01/23/2020

Multi-objective Neural Architecture Search via Non-stationary Policy Gradient

Multi-objective Neural Architecture Search (NAS) aims to discover novel ...
