Correspondence between neuroevolution and gradient descent

08/15/2020
by Stephen Whitelam, et al.

We show analytically that training a neural network by stochastic mutation, or "neuroevolution," of its weights is equivalent, in the limit of small mutations, to gradient descent on the loss function in the presence of Gaussian white noise. Averaged over independent realizations of the learning process, neuroevolution is equivalent to gradient descent on the loss function. We use numerical simulation to show that this correspondence can be observed for finite mutations. Our results connect two distinct types of neural-network training and help justify the empirical success of neuroevolution.
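The correspondence described above can be illustrated on a toy problem. The sketch below (an illustrative assumption, not the authors' exact protocol: the mutation scale `sigma`, the learning rate `lr`, and the accept-if-improved rule are all choices made for demonstration) trains the same two-parameter least-squares model two ways: by Gaussian mutation of the weights, keeping a mutation only if it lowers the loss, and by ordinary gradient descent. Both reach a similar low-loss solution.

```python
# Sketch: neuroevolution by weight mutation vs. gradient descent on a
# toy least-squares problem. All hyperparameters are illustrative.
import random

random.seed(0)

# Toy data: y = 2x + 1 with a little observation noise
xs = [i / 10 for i in range(20)]
ys = [2 * x + 1 + 0.01 * random.gauss(0, 1) for x in xs]

def loss(w, b):
    """Mean squared error of the linear model y = w*x + b."""
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def neuroevolve(steps=5000, sigma=0.02):
    """Mutate weights with Gaussian noise; keep the mutant if loss drops."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        w2 = w + random.gauss(0, sigma)
        b2 = b + random.gauss(0, sigma)
        if loss(w2, b2) < loss(w, b):
            w, b = w2, b2
    return w, b

def grad_descent(steps=5000, lr=0.05):
    """Plain gradient descent on the same loss."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
        gb = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
        w, b = w - lr * gw, b - lr * gb
    return w, b

w_ne, b_ne = neuroevolve()
w_gd, b_gd = grad_descent()
print("neuroevolution loss:", loss(w_ne, b_ne))
print("gradient descent loss:", loss(w_gd, b_gd))
```

In the small-`sigma` limit, averaging many such mutation-acceptance trajectories recovers a step along the negative loss gradient, which is the content of the paper's analytical result.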


