Learning to Learn with Feedback and Local Plasticity

06/16/2020
by Jack Lindsey et al.

Interest in biologically inspired alternatives to backpropagation is driven by the desire to both advance connections between deep learning and neuroscience and address backpropagation's shortcomings on tasks such as online, continual learning. However, local synaptic learning rules like those employed by the brain have so far failed to match the performance of backpropagation in deep networks. In this study, we employ meta-learning to discover networks that learn using feedback connections and local, biologically inspired learning rules. Importantly, the feedback connections are not tied to the feedforward weights, avoiding biologically implausible weight transport. Our experiments show that meta-trained networks effectively use feedback connections to perform online credit assignment in multi-layer architectures. Surprisingly, this approach matches or exceeds a state-of-the-art gradient-based online meta-learning algorithm on regression and classification tasks, excelling in particular at continual learning. Analysis of the weight updates employed by these models reveals that they differ qualitatively from gradient descent in a way that reduces interference between updates. Our results suggest the existence of a class of biologically plausible learning mechanisms that not only match gradient descent-based learning, but also overcome its limitations.
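The core idea — error signals carried down by feedback weights that are *not* tied to the feedforward weights, driving local Hebbian-style updates — can be sketched minimally as below. This is an illustrative toy, not the paper's implementation: the network, task, and learning rate are invented for the example, and the feedback weights `B` and plasticity parameters that the paper would meta-learn across tasks are simply held fixed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network x -> h -> y_hat, trained online on a regression task.
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))   # plastic feedforward weights
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))
# Separate feedback weights B: deliberately NOT tied to W2.T,
# so no biologically implausible weight transport is required.
B = rng.normal(0.0, 0.5, (n_hid, n_out))
eta = 0.01  # inner-loop learning rate (a meta-learnable quantity in the paper's setting)

def inner_step(x, y):
    """One online update using only locally available quantities."""
    global W1, W2
    h = np.maximum(W1 @ x, 0.0)          # ReLU hidden activity
    y_hat = W2 @ h
    e = y - y_hat                        # output error, local to the top layer
    delta_h = (B @ e) * (h > 0)          # error fed back through B, not W2.T
    # Hebbian-style updates: pre-synaptic activity times a post-synaptic signal.
    W2 += eta * np.outer(e, h)
    W1 += eta * np.outer(delta_h, x)
    return float(np.mean(e ** 2))

# Online regression against a random linear target.
T = rng.normal(0.0, 1.0, (n_out, n_in))
losses = []
for _ in range(2000):
    x = rng.normal(0.0, 1.0, n_in)
    losses.append(inner_step(x, T @ x))
```

In the paper's meta-learning setup, an outer loop would adjust `B` and the plasticity parameters (here just `eta`) by evaluating this inner loop across many tasks; with fixed random `B` the sketch reduces to a feedback-alignment-style learner.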
