Bag of Tricks for Neural Architecture Search

07/08/2021
by Thomas Elsken et al.

While neural architecture search (NAS) methods have been successful in recent years and have led to new state-of-the-art performance on various problems, they have also been criticized for being unstable, for being highly sensitive to their hyperparameters, and for often not performing better than random search. To shed some light on this issue, we discuss practical considerations that help improve the stability, efficiency, and overall performance of NAS methods.
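The random-search baseline mentioned above is worth making concrete: it simply samples architectures independently from the search space and keeps the best one. The sketch below illustrates this under assumptions not taken from the paper: the search space, the `sample_architecture` helper, and the cheap deterministic `evaluate` proxy (standing in for validation accuracy after training) are all hypothetical.

```python
import random

# Hypothetical toy search space: an architecture is a choice of depth,
# width, and a layer operation. Purely illustrative, not from the paper.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "op": ["conv3x3", "conv5x5", "max_pool"],
}


def sample_architecture(rng):
    """Draw one architecture uniformly at random from the space."""
    return {key: rng.choice(choices) for key, choices in SEARCH_SPACE.items()}


def evaluate(arch):
    """Stand-in for validation accuracy after training the architecture;
    a cheap deterministic proxy so the sketch runs without any training."""
    return arch["depth"] * 0.01 + arch["width"] * 0.001


def random_search(n_samples=20, seed=0):
    """The random-search baseline NAS methods are often compared against:
    sample n_samples architectures independently and keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_samples):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

Because the baseline has no moving parts beyond the sampler and the evaluator, it is robust to hyperparameters by construction, which is one reason it is a demanding reference point for more elaborate NAS methods.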


Related research

09/24/2020 | Disentangled Neural Architecture Search
Neural architecture search has shown its great potential in various area...

11/01/2017 | Hierarchical Representations for Efficient Architecture Search
We explore efficient neural architecture search methods and present a si...

03/26/2021 | Visionary: Vision architecture discovery for robot learning
We propose a vision-based architecture search algorithm for robot manipu...

04/08/2019 | WeNet: Weighted Networks for Recurrent Network Architecture Search
In recent years, there has been increasing demand for automatic architec...

06/07/2021 | MONCAE: Multi-Objective Neuroevolution of Convolutional Autoencoders
In this paper, we present a novel neuroevolutionary method to identify t...

11/19/2019 | Hybrid Composition with IdleBlock: More Efficient Networks for Image Recognition
We propose a new building block, IdleBlock, which naturally prunes conne...

12/30/2019 | Neural Architecture Search on Acoustic Scene Classification
Convolutional neural networks are widely adopted in Acoustic Scene Class...
