Progressive Neural Architecture Search

12/02/2017
by Chenxi Liu, et al.

We propose a method for learning CNN structures that is more efficient than previous approaches: instead of using reinforcement learning (RL) or genetic algorithms (GA), we use a sequential model-based optimization (SMBO) strategy, in which we search for architectures in order of increasing complexity, while simultaneously learning a surrogate function to guide the search, similar to A* search. On the CIFAR-10 dataset, our method finds a CNN structure with the same classification accuracy (3.41% error rate) as the RL method of Zoph et al. (2017), but 2 times faster (in terms of number of models evaluated). It also outperforms the GA method of Liu et al. (2017), which finds a model with worse performance (3.63% error rate), and takes 5 times longer. Finally, we show that the model we learned on CIFAR also works well at the task of ImageNet classification. In particular, we match the state-of-the-art performance of 82.9% top-1 accuracy.
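The progressive SMBO loop described above (grow architectures one block at a time, rank the expansions with a learned surrogate, and spend the expensive training budget only on the top-ranked candidates) can be sketched as follows. This is a toy illustration, not the paper's implementation: the names `Surrogate`, `true_accuracy`, `OPS`, `K`, and `MAX_BLOCKS` are assumptions for the sketch, and the surrogate here is a trivial prefix-matching heuristic standing in for the learned predictor (the paper uses an RNN/MLP surrogate).

```python
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]  # toy operation set
K = 4            # beam width: architectures actually evaluated per step
MAX_BLOCKS = 3   # maximum architecture complexity

def true_accuracy(arch):
    """Stand-in for expensive training + evaluation of an architecture."""
    rng = random.Random(hash(arch))
    return rng.random()

class Surrogate:
    """Predicts accuracy of an unseen architecture; here a trivial
    longest-shared-prefix average over architectures evaluated so far."""
    def __init__(self):
        self.history = {}  # arch tuple -> measured accuracy

    def fit(self, arch, acc):
        self.history[arch] = acc

    def predict(self, arch):
        best_len, scores = -1, [0.5]  # prior when nothing is evaluated yet
        for seen, acc in self.history.items():
            n = sum(1 for a, b in zip(seen, arch) if a == b)
            if n > best_len:
                best_len, scores = n, [acc]
            elif n == best_len:
                scores.append(acc)
        return sum(scores) / len(scores)

def progressive_search():
    surrogate = Surrogate()
    beam = [()]  # start from the simplest (empty) architecture
    for _ in range(MAX_BLOCKS):
        # expand every beam member by one extra block (increasing complexity)
        candidates = [arch + (op,) for arch in beam for op in OPS]
        # surrogate ranks all expansions; only the top-K are trained for real
        candidates.sort(key=surrogate.predict, reverse=True)
        beam = candidates[:K]
        for arch in beam:
            surrogate.fit(arch, true_accuracy(arch))
    return max(beam, key=lambda a: surrogate.history[a])

best = progressive_search()
print("best architecture:", best)
```

The efficiency gain in the sketch mirrors the paper's argument: per step, only K architectures are trained, while the surrogate cheaply scores the full set of expansions.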


