Sparse Distributed Memory is a Continual Learner

03/20/2023
by Trenton Bricken, et al.

Continual learning is a problem for artificial neural networks that their biological counterparts are adept at solving. Building on work using Sparse Distributed Memory (SDM) to connect a core neural circuit with the powerful Transformer model, we create a modified Multi-Layered Perceptron (MLP) that is a strong continual learner. We find that every component of our MLP variant translated from biology is necessary for continual learning. Our solution is also free from any memory replay or task information, and introduces novel methods to train sparse networks that may be broadly applicable.
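The abstract does not spell out the mechanism, but one common SDM-inspired reading is a hidden layer whose neurons act as SDM address vectors: inputs and weights are L2-normalized so the pre-activation is a cosine similarity, and only the k most similar neurons fire (winner-take-all sparsity), with no bias terms. The sketch below is a minimal illustration of that idea only; the class name `TopKSDMLayer`, the dimensions, and the choice of `k` are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F


class TopKSDMLayer(torch.nn.Module):
    """Hypothetical sketch of an SDM-style sparse MLP layer.

    Assumptions (not confirmed by the abstract): hidden neurons are SDM
    address vectors, similarity is cosine similarity, and a Top-K
    winner-take-all activation replaces the usual dense nonlinearity.
    """

    def __init__(self, in_features: int, hidden: int, k: int):
        super().__init__()
        # Each row is one neuron's "address" in input space.
        self.addresses = torch.nn.Parameter(torch.randn(hidden, in_features))
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between each input and each neuron's address.
        sims = F.normalize(x, dim=-1) @ F.normalize(self.addresses, dim=-1).T
        # Keep only the k nearest neurons; zero out all other activations.
        topk = sims.topk(self.k, dim=-1)
        mask = torch.zeros_like(sims).scatter_(-1, topk.indices, 1.0)
        return sims * mask


# Usage: for a batch of 8 inputs, at most k=10 of 1000 hidden units fire each.
layer = TopKSDMLayer(in_features=64, hidden=1000, k=10)
out = layer(torch.randn(8, 64))
assert (out != 0).sum(dim=-1).max() <= 10
```

One plausible intuition for the continual-learning claim: because each input activates only a small, input-dependent subset of neurons, different tasks tend to recruit largely disjoint sets of weights, limiting interference without replay buffers or task labels.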

Related research

Memory Bounds for Continual Learning (04/22/2022)
Continual learning, or lifelong learning, is a formidable current challenge...

Generative Kernel Continual Learning (12/26/2021)
Kernel continual learning has recently emerged as a strong con...

Algorithmic insights on continual learning from fruit flies (07/15/2021)
Continual learning in computational systems is challenging due to catastrophic forgetting...

The Role Of Biology In Deep Learning (09/07/2022)
Artificial neural networks took a lot of inspiration from their biological counterparts...

Continual Learning and Private Unlearning (03/24/2022)
As intelligent agents become autonomous over longer periods of time, the...

CaSpeR: Latent Spectral Regularization for Continual Learning (01/09/2023)
While biological intelligence grows organically as new knowledge is gathered...

Gradient-matching coresets for continual learning (12/09/2021)
We devise a coreset selection method based on the idea of gradient matching...
