Heed the Noise in Performance Evaluations in Neural Architecture Search

02/04/2022
by Arkadiy Dushatskiy, et al.

Neural Architecture Search (NAS) has recently become a topic of great interest. However, a potentially impactful issue within NAS remains largely unrecognized: noise. Due to stochastic factors in neural network initialization, training, and the chosen train/validation dataset split, the performance evaluation of an architecture, which is typically based on a single learning run, is itself stochastic. This can have a particularly large impact when the dataset is small. We therefore propose to reduce the noise by averaging architecture scores over multiple network training runs with different random seeds and cross-validation folds. We perform experiments for a combinatorial optimization formulation of NAS in which we vary the level of noise reduction. We use the same computational budget, measured in network training runs, for each noise level, i.e., we allow fewer architecture evaluations when averaging over more training runs. We consider multiple search algorithms, including evolutionary algorithms, which generally perform well for NAS. Experiments use two publicly available datasets from the medical image segmentation domain, where datasets are often limited in size and variability among samples is often high. Our results show that reducing noise in architecture evaluations enables all considered search algorithms to find better architectures.
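The core idea, noise-reduced evaluation under a fixed training-run budget, can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the search space, the `TRUE_QUALITY` table, and the function names are invented for the example, and a Gaussian perturbation stands in for the real stochasticity of initialization, training, and data splits.

```python
import random

# Toy "search space": architecture id -> true (noise-free) quality.
# These values are made up for illustration.
TRUE_QUALITY = {0: 0.70, 1: 0.75, 2: 0.80, 3: 0.78, 4: 0.72}

def train_and_score(arch, seed, fold, noise_sd=0.05):
    """Hypothetical stand-in for one network training run: the architecture's
    true quality plus Gaussian noise modeling stochastic initialization,
    training, and train/validation split effects."""
    rng = random.Random(arch * 10007 + seed * 101 + fold)
    return TRUE_QUALITY[arch] + rng.gauss(0.0, noise_sd)

def evaluate(arch, n_seeds=1, n_folds=1):
    """Noise-reduced evaluation: average the score over n_seeds random seeds
    and n_folds cross-validation folds. Returns (mean score, cost), where
    cost = n_seeds * n_folds training runs charged to the search budget."""
    scores = [train_and_score(arch, s, f)
              for s in range(n_seeds) for f in range(n_folds)]
    return sum(scores) / len(scores), n_seeds * n_folds

def search(candidates, budget, n_seeds, n_folds):
    """Toy search loop with a fixed budget counted in training runs:
    heavier averaging per evaluation means fewer architectures evaluated."""
    best, best_score, spent = None, float("-inf"), 0
    for arch in candidates:
        if spent + n_seeds * n_folds > budget:
            break  # not enough budget left for another full evaluation
        score, cost = evaluate(arch, n_seeds, n_folds)
        spent += cost
        if score > best_score:
            best, best_score = arch, score
    return best, best_score, spent
```

With a budget of 20 training runs, averaging over 2 seeds and 2 folds (cost 4 per evaluation) allows only 5 architecture evaluations, whereas single-run evaluation would allow 20; the paper's question is which trade-off lets the search algorithm find better architectures.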


