Are skip connections necessary for biologically plausible learning rules?

12/04/2019
by Daniel Jiwoong Im, et al.

Backpropagation is the workhorse of deep learning, but several biologically motivated alternatives have been proposed, such as random feedback alignment and difference target propagation. So far, none of these methods has matched the performance of backpropagation. In this paper, we show that biologically motivated learning rules with skip connections between intermediate layers can perform as well as backpropagation on the MNIST dataset and are robust across a wide range of hyper-parameter settings.
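To make the idea concrete, here is a minimal NumPy sketch of random feedback alignment, one of the biologically motivated rules named above: the backward pass routes the output error through a fixed random matrix B instead of the transposed forward weights. This is an illustrative toy on synthetic data, not the paper's architecture, and it omits the skip connections the paper studies; all sizes, learning rates, and data are assumptions.

```python
import numpy as np

# Toy sketch of random feedback alignment (FA) on synthetic data.
# All dimensions, data, and hyper-parameters here are assumptions
# for illustration, not the paper's setup.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
W2 = rng.normal(0.0, 0.5, (n_hid, n_out))
B = rng.normal(0.0, 0.5, (n_out, n_hid))  # fixed random feedback weights

X = rng.normal(size=(32, n_in))
Y = X @ rng.normal(size=(n_in, n_out))    # synthetic regression targets

initial_loss = float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))

lr = 0.05
for _ in range(500):
    h = np.tanh(X @ W1)            # forward pass, hidden layer
    y_hat = h @ W2                 # forward pass, output layer
    e = y_hat - Y                  # output error
    dW2 = h.T @ e
    # FA: propagate the error through the fixed matrix B instead of W2.T
    dh = (e @ B) * (1.0 - h ** 2)  # tanh derivative
    dW1 = X.T @ dh
    W1 -= lr * dW1 / len(X)
    W2 -= lr * dW2 / len(X)

loss = float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))
```

Despite the feedback weights being random and fixed, the forward weights gradually align with them during training, so the loss still decreases, which is the alignment effect the rule is named for.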

Related research

- 06/12/2020: Kernelized information bottleneck leads to biologically plausible 3-factor Hebbian learning in deep networks. "The state-of-the-art machine learning approach to training deep neural n..."
- 04/01/2022: Physical Deep Learning with Biologically Plausible Training Method. "The ever-growing demand for further advances in artificial intelligence ..."
- 06/13/2019: Associated Learning: Decomposing End-to-end Backpropagation based on Auto-encoders and Target Propagation. "Backpropagation has been widely used in deep learning approaches, but it..."
- 10/28/2022: Meta-Learning Biologically Plausible Plasticity Rules with Random Feedback Pathways. "Backpropagation is widely used to train artificial neural networks, but ..."
- 11/05/2018: A Biologically Plausible Learning Rule for Deep Learning in the Brain. "Researchers have proposed that deep learning, which is providing importa..."
- 05/26/2018: Biologically Motivated Algorithms for Propagating Local Target Representations. "Finding biologically plausible alternatives to back-propagation of error..."
- 10/22/2018: Learning sparse transformations through backpropagation. "Many transformations in deep learning architectures are sparsely connect..."
