Traditional and accelerated gradient descent for neural architecture search

06/26/2020
by Nicolas Garcia Trillos, et al.

In this paper, we introduce two algorithms for neural architecture search (NASGD and NASAGD), following the theoretical work of two of the authors [4], which introduced the conceptual basis for new notions of traditional and accelerated gradient descent for optimizing a function on a semi-discrete space, using ideas from optimal transport theory. Our methods, which use the network morphism framework of [3] as a baseline, can analyze forty times as many architectures as the hill-climbing methods [3,11] with the same computational resources and time, while achieving comparable accuracy. For example, using NASGD on CIFAR-10, our method designs and trains networks with an error rate of 4.06% in only 12 hours on a single GPU.
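As background for the two optimizer families the abstract names, the sketch below contrasts plain gradient descent with Nesterov-style accelerated gradient descent on a toy ill-conditioned quadratic. This is only the classical continuous-space picture; it does not reproduce the paper's semi-discrete, optimal-transport-based NASGD/NASAGD updates, and the objective, step size, and momentum schedule are illustrative assumptions.

```python
import numpy as np

# Toy ill-conditioned quadratic f(x) = 0.5 * x^T A x (illustrative, not the NAS objective)
A = np.diag([1.0, 10.0])
lr = 0.05  # step size chosen below 1/L, with L = 10 the largest eigenvalue

def grad(x):
    # Gradient of the quadratic: grad f(x) = A x
    return A @ x

# Traditional gradient descent: x_{k+1} = x_k - lr * grad(x_k)
x = np.array([1.0, 1.0])
for _ in range(200):
    x = x - lr * grad(x)

# Nesterov-accelerated gradient descent:
#   x_{k+1} = y_k - lr * grad(y_k)
#   y_{k+1} = x_{k+1} + (k / (k + 3)) * (x_{k+1} - x_k)
y = np.array([1.0, 1.0])
x_acc = y.copy()
for k in range(200):
    x_next = y - lr * grad(y)
    y = x_next + (k / (k + 3)) * (x_next - x_acc)
    x_acc = x_next

# Both iterates approach the minimizer at the origin
print(np.linalg.norm(x), np.linalg.norm(x_acc))
```

The acceleration term extrapolates along the direction of recent progress, which is what speeds up convergence in the poorly conditioned eigendirection; the paper's contribution is extending these two update rules to the semi-discrete architecture-plus-weights search space.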
