Mutation is all you need

07/04/2021
by Lennart Schneider, et al.

Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating the architecture engineering of deep neural networks. BANANAS is a state-of-the-art NAS method embedded within the Bayesian optimization framework. Recent experimental findings have shown that the strong performance of BANANAS on the NAS-Bench-101 benchmark is determined by its path encoding rather than by its choice of surrogate model. We present experimental results suggesting that the performance of BANANAS on the NAS-Bench-301 benchmark is determined by its acquisition function optimizer, which minimally mutates the incumbent.
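To make the mechanism concrete, the following is a minimal sketch of a mutation-based acquisition function optimizer: candidates are generated by resampling a single operation of the incumbent architecture, and the candidate with the highest acquisition value is proposed. The toy cell encoding, the operation set, and the `acquisition` callable are illustrative assumptions, not the BANANAS implementation.

```python
import random

# Illustrative operation set; real NAS search spaces (e.g. NAS-Bench-301)
# additionally encode the cell topology, which is omitted here.
OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect"]

def mutate(architecture, n_changes=1):
    """Return a copy of `architecture` with `n_changes` operations resampled."""
    child = list(architecture)
    for idx in random.sample(range(len(child)), n_changes):
        # Resample the chosen operation, excluding its current value.
        child[idx] = random.choice([op for op in OPS if op != child[idx]])
    return child

def optimize_acquisition(incumbent, acquisition, n_candidates=100):
    """Propose the next architecture to evaluate by minimally mutating the
    incumbent and keeping the candidate with the best acquisition value."""
    candidates = [mutate(incumbent, n_changes=1) for _ in range(n_candidates)]
    return max(candidates, key=acquisition)

# Example usage with a dummy acquisition function (a surrogate-based
# acquisition such as expected improvement would be used in practice).
incumbent = ["sep_conv_3x3", "skip_connect", "max_pool_3x3"]
proposal = optimize_acquisition(incumbent, acquisition=lambda a: random.random())
```

Because every candidate differs from the incumbent in only one operation, the optimizer explores a small neighborhood around the best architecture found so far, which is the behavior the abstract identifies as driving BANANAS's performance.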

