B2EA: An Evolutionary Algorithm Assisted by Two Bayesian Optimization Modules for Neural Architecture Search

02/07/2022
by   Hyunghun Cho, et al.

The early pioneering Neural Architecture Search (NAS) works were multi-trial methods applicable to any general search space. Subsequent works built on these early findings and developed weight-sharing methods that assume a structured search space, typically with pre-fixed hyperparameters. Despite the impressive computational efficiency of weight-sharing NAS algorithms, it is becoming apparent that multi-trial NAS algorithms are still needed to identify very high-performance architectures, especially when exploring a general search space. In this work, we carefully review the latest multi-trial NAS algorithms and identify their key strategies, including Evolutionary Algorithm (EA), Bayesian Optimization (BO), diversification, input and output transformations, and lower-fidelity estimation. To accommodate these key strategies in a single framework, we develop B2EA, a surrogate-assisted EA with two BO surrogate models and a mutation step in between. To assess robustness and efficiency, we evaluate three performance metrics over 14 benchmarks with general and cell-based search spaces. Comparisons with state-of-the-art multi-trial algorithms confirm that B2EA is robust and efficient across the 14 benchmarks at three difficulty levels of target performance. The B2EA code is publicly available at <https://github.com/snu-adsl/BBEA>.
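The abstract's core idea, an evolutionary loop in which two surrogate models guide selection before and after a mutation step, can be illustrated with a toy sketch. This is not the authors' implementation: the objective function, the k-nearest-neighbor surrogate (standing in for a proper BO model such as a Gaussian process), and all parameter names below are hypothetical stand-ins chosen to keep the example self-contained.

```python
import random

def evaluate(arch):
    # Toy black-box objective standing in for validation accuracy;
    # a real NAS run would train and score the architecture instead.
    return -sum((x - 0.5) ** 2 for x in arch)

class KnnSurrogate:
    """Minimal stand-in for a BO surrogate: predicts the mean
    objective of the k nearest previously observed points."""
    def __init__(self, k=3):
        self.k, self.X, self.y = k, [], []
    def fit(self, X, y):
        self.X, self.y = list(X), list(y)
    def predict(self, arch):
        dists = sorted((sum((a - b) ** 2 for a, b in zip(arch, x)), yi)
                       for x, yi in zip(self.X, self.y))
        top = dists[: self.k]
        return sum(yi for _, yi in top) / len(top)

def mutate(arch, scale=0.1):
    # Perturb one coordinate: the mutation step between the two
    # surrogate-guided choices.
    i = random.randrange(len(arch))
    child = list(arch)
    child[i] = min(1.0, max(0.0, child[i] + random.uniform(-scale, scale)))
    return child

def b2ea_like_search(dim=4, pop_size=8, iters=20, seed=0):
    random.seed(seed)
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    scores = [evaluate(a) for a in pop]
    s1, s2 = KnnSurrogate(), KnnSurrogate()  # the two surrogate modules
    for _ in range(iters):
        # Surrogate 1 picks a promising parent from the population.
        s1.fit(pop, scores)
        parent = max(pop, key=s1.predict)
        # Surrogate 2 screens mutated candidates; only the most
        # promising one pays the cost of a real evaluation.
        candidates = [mutate(parent) for _ in range(16)]
        s2.fit(pop, scores)
        child = max(candidates, key=s2.predict)
        pop.append(child)
        scores.append(evaluate(child))
    best = max(range(len(pop)), key=lambda i: scores[i])
    return pop[best], scores[best]

best_arch, best_score = b2ea_like_search()
```

The point of the two-surrogate structure, as the abstract describes it, is that cheap model predictions filter candidates on both sides of the mutation step, so expensive true evaluations are spent only on the most promising architectures.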

Related Research

12/11/2020 · Differential Evolution for Neural Architecture Search
Neural architecture search (NAS) methods rely on a search strategy for d...

03/28/2020 · NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search
Neural architecture search (NAS) is a promising method for automatically...

04/27/2022 · PRE-NAS: Predictor-assisted Evolutionary Neural Architecture Search
Neural architecture search (NAS) aims to automate architecture engineeri...

11/05/2021 · NAS-Bench-x11 and the Power of Learning Curves
While early research in neural architecture search (NAS) required extrem...

02/11/2020 · To Share or Not To Share: A Comprehensive Appraisal of Weight-Sharing
Weight-sharing (WS) has recently emerged as a paradigm to accelerate the...

06/12/2023 · Rethink DARTS Search Space and Renovate a New Benchmark
DARTS search space (DSS) has become a canonical benchmark for NAS wherea...

12/05/2021 · Exploring Complicated Search Spaces with Interleaving-Free Sampling
The existing neural architecture search algorithms are mostly working on...
