Faster Biological Gradient Descent Learning

09/27/2020
by   Ho Ling Li, et al.

Back-propagation is a popular machine learning algorithm that uses gradient descent to train neural networks for supervised learning, but it can be very slow. A number of algorithms have been developed to speed up convergence and improve the robustness of learning. However, they are complicated to implement biologically because they require information from previous updates. Inspired by synaptic competition in biology, we have developed a simple, local gradient descent optimization algorithm that reduces training time without requiring any information from past updates. Our algorithm, named dynamic learning rate (DLR), works like the traditional gradient descent used in back-propagation, except that instead of a uniform learning rate across all synapses, each synapse's learning rate depends on the current neuronal connection weights. Our algorithm is found to speed up learning, particularly for small networks.
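The core idea, a per-synapse learning rate that is a function of the current weight rather than a global constant, can be sketched as follows. This is a minimal illustration only: the abstract does not specify how DLR's rate depends on the weights, so the magnitude-proportional rate, the function name `dlr_step`, and the constants below are all assumptions.

```python
import numpy as np

def dlr_step(w, grad, base_lr=0.01, eps=1e-8):
    """One hypothetical weight-dependent update.

    Instead of a uniform learning rate, each weight gets a rate that
    scales with its current magnitude (an assumed form of the
    weight-dependence described in the abstract). `eps` keeps
    zero-valued weights from freezing entirely.
    """
    lr = base_lr * (np.abs(w) + eps)  # per-synapse learning rate
    return w - lr * grad              # standard gradient descent step

# Example: a single update on a two-weight toy problem.
w = np.array([1.0, -2.0])
grad = np.array([0.5, 0.5])
w_new = dlr_step(w, grad)
```

Note that the update is purely local: each synapse needs only its own current weight and gradient, with no history of past updates, which is what makes the scheme biologically plausible.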

