Learning to Learn with Feedback and Local Plasticity

06/16/2020
by Jack Lindsey, et al.

Interest in biologically inspired alternatives to backpropagation is driven by the desire to both advance connections between deep learning and neuroscience and address backpropagation's shortcomings on tasks such as online, continual learning. However, local synaptic learning rules like those employed by the brain have so far failed to match the performance of backpropagation in deep networks. In this study, we employ meta-learning to discover networks that learn using feedback connections and local, biologically inspired learning rules. Importantly, the feedback connections are not tied to the feedforward weights, avoiding biologically implausible weight transport. Our experiments show that meta-trained networks effectively use feedback connections to perform online credit assignment in multi-layer architectures. Surprisingly, this approach matches or exceeds a state-of-the-art gradient-based online meta-learning algorithm on regression and classification tasks, excelling in particular at continual learning. Analysis of the weight updates employed by these models reveals that they differ qualitatively from gradient descent in a way that reduces interference between updates. Our results suggest the existence of a class of biologically plausible learning mechanisms that not only match gradient descent-based learning, but also overcome its limitations.
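
To make the mechanism concrete, below is a minimal sketch in NumPy of the kind of learner the abstract describes: a two-layer network whose feedforward weights are updated by a local, Hebbian-style rule gated by an error signal delivered through separate feedback weights, so no weight transport is required. Everything specific here (the layer sizes, the feedback-alignment-style update, and the names W1, W2, B2, plasticity) is an illustrative assumption rather than the paper's actual rule; in the paper, the plasticity rules and feedback pathway are discovered by an outer meta-learning loop, which this sketch omits.

```python
# Minimal sketch of the inner (online) learning loop. The update is a
# feedback-alignment-style stand-in for the meta-learned plasticity rule;
# sizes, coefficients, and variable names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 8, 1
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))   # feedforward weights, layer 1
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))  # feedforward weights, layer 2
B2 = rng.normal(0.0, 0.5, (n_hid, n_out))  # feedback weights: separate from W2.T (no weight transport)
plasticity = np.array([0.05, 0.05])        # per-layer learning rates (would be meta-parameters)

def inner_step(x, y_target):
    """One online learning step using only local activity and a feedback signal."""
    global W1, W2
    h = np.tanh(W1 @ x)      # hidden activity
    y = W2 @ h               # output
    e = y_target - y         # output error, available locally at the top layer
    fb = B2 @ e              # error carried to the hidden layer by B2, not by W2.T
    # Each update uses only pre-/post-synaptic activity plus the feedback signal
    # arriving at that layer; no gradients are transported between layers.
    W2 += plasticity[1] * np.outer(e, h)
    W1 += plasticity[0] * np.outer(fb * (1.0 - h ** 2), x)
    return float((e ** 2).sum())

# Tiny online regression task: the network adapts from a short stream of examples.
true_map = rng.normal(size=(n_out, n_in))
for _ in range(200):
    x = rng.normal(size=n_in)
    loss = inner_step(x, true_map @ x)
print("final per-example squared error:", loss)
```

The design point the abstract emphasizes is visible in the sketch: B2 is never constrained to mirror W2, and the per-layer updates depend only on locally available signals. In the paper these pieces are shaped by meta-training across tasks rather than fixed by hand, which is what lets the resulting updates assign credit effectively and, per the authors' analysis, interfere with one another less than gradient-descent updates do.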

research · 10/28/2022
Meta-Learning Biologically Plausible Plasticity Rules with Random Feedback Pathways
Backpropagation is widely used to train artificial neural networks, but ...

research · 02/28/2020
Two Routes to Scalable Credit Assignment without Weight Symmetry
The neural plausibility of backpropagation has long been disputed, prima...

research · 06/24/2018
Beyond Backprop: Alternating Minimization with co-Activation Memory
We propose a novel online algorithm for training deep feedforward neural...

research · 04/14/2022
Minimizing Control for Credit Assignment with Strong Feedback
The success of deep learning attracted interest in whether the brain lea...

research · 09/08/2016
Learning to learn with backpropagation of Hebbian plasticity
Hebbian plasticity is a powerful principle that allows biological brains...

research · 05/24/2022
Thalamus: a brain-inspired algorithm for biologically-plausible continual learning and disentangled representations
Animals thrive in a constantly changing environment and leverage the tem...

research · 10/11/2019
Structured and Deep Similarity Matching via Structured and Deep Hebbian Networks
Synaptic plasticity is widely accepted to be the mechanism behind learni...
