Neural Architecture Optimization

08/22/2018
by Renqian Luo, et al.

Automatic neural architecture design has shown its potential in discovering powerful neural network architectures. Existing methods, whether based on reinforcement learning (RL) or evolutionary algorithms (EA), conduct architecture search in a discrete space, which is highly inefficient. In this paper, we propose a simple and efficient method for automatic neural architecture design based on continuous optimization. We call this new approach neural architecture optimization (NAO). There are three key components in our proposed approach: (1) an encoder embeds/maps neural network architectures into a continuous space; (2) a predictor takes the continuous representation of a network as input and predicts its accuracy; (3) a decoder maps a continuous representation of a network back to its architecture. The performance predictor and the encoder enable us to perform gradient-based optimization in the continuous space to find the embedding of a new architecture with potentially better accuracy. Such a better embedding is then decoded back to a network by the decoder. Experiments show that the architectures discovered by our method are highly competitive on the CIFAR-10 image classification task and the PTB language modeling task, outperforming or on par with the best results of previous architecture search methods while using significantly fewer computational resources. Specifically, we obtain a 2.07% test set error rate on the CIFAR-10 image classification task and a test set perplexity of 55.9 on the PTB language modeling task. The best discovered architectures on both tasks are successfully transferred to other tasks such as CIFAR-100 and WikiText-2.
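The search loop the abstract describes can be made concrete in a few dozen lines. Below is a minimal PyTorch sketch of the three components and of the gradient step in embedding space; the class `NAOSketch`, all layer choices and sizes, and the greedy decoder are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class NAOSketch(nn.Module):
    """Encoder/predictor/decoder trio as described in the abstract.
    All sizes and layer choices here are illustrative assumptions."""

    def __init__(self, vocab_size=12, hidden=96, seq_len=40):
        super().__init__()
        self.seq_len = seq_len
        # (1) Encoder: embeds a discrete architecture (token sequence)
        # into a continuous representation.
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        # (2) Predictor: maps that representation to a scalar
        # performance estimate (e.g., validation accuracy).
        self.predictor = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        # (3) Decoder: maps a continuous representation back to a
        # token sequence describing an architecture.
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def encode(self, arch_tokens):
        _, (h, _) = self.encoder(self.embed(arch_tokens))
        return h[-1]  # embedding e, shape (batch, hidden)

    def decode(self, e):
        # Greedy decoding, feeding the embedding at every step (a
        # simplification; any seq2seq decoder would do here).
        steps = e.unsqueeze(1).expand(-1, self.seq_len, -1).contiguous()
        out, _ = self.decoder(steps)
        return self.out(out).argmax(-1)  # (batch, seq_len) tokens

def optimize_embedding(model, arch_tokens, lr=0.1, steps=10):
    """Move the embedding uphill along the predictor's gradient,
    then decode the result back into a discrete architecture."""
    e = model.encode(arch_tokens).detach().requires_grad_(True)
    for _ in range(steps):
        score = model.predictor(e).sum()
        (grad,) = torch.autograd.grad(score, e)
        e = (e + lr * grad).detach().requires_grad_(True)
    return model.decode(e)

# Usage: start from a seed architecture and search in embedding space.
model = NAOSketch()
seed_arch = torch.randint(0, 12, (1, 40))  # placeholder token sequence
better_arch = optimize_embedding(model, seed_arch)
```

Note that the sketch omits the step that makes the predictor's gradient meaningful: before searching, the encoder, predictor, and decoder must be trained on architectures that have actually been evaluated, so that the predicted score tracks real accuracy.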

Related research

05/10/2019 | EENA: Efficient Evolution of Neural Architecture
Latest algorithms for automatic neural architecture search perform remar...

06/18/2020 | Neural Architecture Optimization with Graph VAE
Due to their high computational efficiency on a continuous space, gradie...

06/16/2021 | TSO: Curriculum Generation using continuous optimization
The training of deep learning models poses vast challenges of including ...

06/13/2022 | EmProx: Neural Network Performance Estimation For Neural Architecture Search
Common Neural Architecture Search methods generate large amounts of cand...

08/18/2021 | Analyze and Design Network Architectures by Recursion Formulas
The effectiveness of shortcut/skip-connection has been widely verified, ...

11/05/2016 | Neural Architecture Search with Reinforcement Learning
Neural networks are powerful and flexible models that work well for many...

04/08/2019 | Resource Constrained Neural Network Architecture Search
The design of neural network architectures is frequently either based on...
