Evolving Deep Neural Networks

03/01/2017
by Risto Miikkulainen et al.

The success of deep learning depends on finding an architecture to fit the task. As deep learning has scaled up to more challenging tasks, the architectures have become difficult to design by hand. This paper proposes an automated method, CoDeepNEAT, for optimizing deep learning architectures through evolution. By extending existing neuroevolution methods to topology, components, and hyperparameters, this method achieves results comparable to the best human designs on standard benchmarks in object recognition and language modeling. It also supports building a real-world application of automated image captioning on a magazine website. Given the anticipated increases in available computing power, evolution of deep networks is a promising approach to constructing deep learning applications in the future.
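
The abstract describes evolving topology, components, and hyperparameters instead of designing them by hand. The sketch below illustrates only the general idea of such an evolutionary search with a minimal loop over a toy genome of layer widths and a learning rate; the encoding, mutation operators, and the placeholder evaluate() fitness function are illustrative assumptions, not the paper's CoDeepNEAT algorithm.

    # Minimal sketch of evolutionary architecture/hyperparameter search.
    # NOT the authors' CoDeepNEAT; the genome, mutation operators, and the
    # toy fitness function below are assumptions made for illustration.
    import random

    random.seed(0)

    def random_genome():
        """A genome: a list of layer widths plus a learning rate."""
        return {
            "layers": [random.choice([32, 64, 128, 256])
                       for _ in range(random.randint(1, 4))],
            "lr": 10 ** random.uniform(-4, -2),
        }

    def mutate(genome):
        """Return a mutated copy: tweak a width, add/remove a layer, or perturb lr."""
        g = {"layers": list(genome["layers"]), "lr": genome["lr"]}
        op = random.random()
        if op < 0.4:
            i = random.randrange(len(g["layers"]))
            g["layers"][i] = random.choice([32, 64, 128, 256])
        elif op < 0.7 and len(g["layers"]) < 6:
            g["layers"].append(random.choice([32, 64, 128, 256]))
        elif op < 0.85 and len(g["layers"]) > 1:
            g["layers"].pop(random.randrange(len(g["layers"])))
        else:
            g["lr"] *= 10 ** random.uniform(-0.3, 0.3)
        return g

    def evaluate(genome):
        """Placeholder fitness; a real run would build and train the network here."""
        # Toy objective: prefer ~3 layers of moderate width and lr near 1e-3.
        depth_penalty = abs(len(genome["layers"]) - 3)
        width_penalty = abs(sum(genome["layers"]) / len(genome["layers"]) - 128) / 128
        lr_penalty = abs(genome["lr"] - 1e-3) * 1000
        return -(depth_penalty + width_penalty + lr_penalty)

    def evolve(pop_size=20, generations=30):
        population = [random_genome() for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(population, key=evaluate, reverse=True)
            survivors = scored[: pop_size // 4]  # truncation selection
            population = survivors + [mutate(random.choice(survivors))
                                      for _ in range(pop_size - len(survivors))]
        return max(population, key=evaluate)

    if __name__ == "__main__":
        best = evolve()
        print("best genome:", best, "fitness:", round(evaluate(best), 4))

In an actual architecture search, evaluate would decode the genome into a network, train it on the target task, and return validation performance as the fitness signal.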

