Algorithmic insights on continual learning from fruit flies

07/15/2021
by Yang Shen, et al.

Continual learning in computational systems is challenging due to catastrophic forgetting. We discovered a two-layer neural circuit in the fruit fly olfactory system that addresses this challenge by uniquely combining sparse coding and associative learning. In the first layer, odors are encoded using sparse, high-dimensional representations, which reduces memory interference by activating non-overlapping populations of neurons for different odors. In the second layer, only the synapses between odor-activated neurons and the output neuron associated with the odor are modified during learning; the rest of the weights are frozen to prevent unrelated memories from being overwritten. We show empirically and analytically that this simple and lightweight algorithm significantly boosts continual learning performance. The fly associative learning algorithm is strikingly similar to the classic perceptron learning algorithm, albeit with two modifications that we show are critical for reducing catastrophic forgetting. Overall, fruit flies evolved an efficient lifelong learning algorithm, and circuit mechanisms from neuroscience can be translated to improve machine computation.
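To make the two-layer scheme concrete, below is a minimal Python sketch of a fly-inspired continual learner. It is an illustrative reconstruction, not the authors' exact model: the class name, the random-projection expansion, the top-k sparsification, and all parameter values are assumptions; only the overall structure (a fixed sparse expansion in the first layer, followed by associative updates confined to the synapses between active units and the target output) follows the description above.

```python
import numpy as np

class FlyContinualLearner:
    """Illustrative fly-inspired continual learner (not the authors' exact model).

    Layer 1: a fixed, sparse random projection expands the input into a
    high-dimensional space, and only the top-k most active units fire,
    so different inputs tend to activate non-overlapping populations.
    Layer 2: associative weights from active layer-1 units to the output
    neuron of the presented class are strengthened; all other weights
    are left untouched (frozen), which limits overwriting of old memories.
    """

    def __init__(self, input_dim, expansion_dim, num_classes, k_active, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed, sparse, random connectivity (assumed ~10% connection density).
        self.projection = (rng.random((expansion_dim, input_dim)) < 0.1).astype(float)
        # Plastic associative weights, one row per output class.
        self.weights = np.zeros((num_classes, expansion_dim))
        self.k_active = k_active
        self.lr = lr

    def encode(self, x):
        """Sparse, high-dimensional code: keep only the k most active units."""
        activity = self.projection @ x
        code = np.zeros_like(activity)
        code[np.argsort(activity)[-self.k_active:]] = 1.0
        return code

    def learn(self, x, label):
        """Modify only synapses between active units and the target output neuron."""
        code = self.encode(x)
        self.weights[label] += self.lr * code   # all other weights stay frozen

    def predict(self, x):
        return int(np.argmax(self.weights @ self.encode(x)))


# Toy usage: learn two "odors" sequentially; learning the second
# does not disturb the weights encoding the first.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    model = FlyContinualLearner(input_dim=50, expansion_dim=2000,
                                num_classes=2, k_active=100)
    odor_a, odor_b = rng.random(50), rng.random(50)
    model.learn(odor_a, label=0)
    model.learn(odor_b, label=1)
    print(model.predict(odor_a), model.predict(odor_b))  # expected: 0 1
```

In contrast to a standard perceptron, which updates an entire weight vector on every error, this update touches only the few synapses downstream of the currently active sparse code, which is what limits interference with earlier memories in the sketch above.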

Related research

Dynamically Modular and Sparse General Continual Learning (01/02/2023)
Real-world applications often require learning continuously from a strea...

Overcoming Catastrophic Forgetting by XAI (11/25/2022)
Explaining the behaviors of deep neural networks, usually considered as ...

Continual Learning in Deep Networks: an Analysis of the Last Layer (06/03/2021)
We study how different output layer types of a deep neural network learn...

Sparse Distributed Memory is a Continual Learner (03/20/2023)
Continual learning is a problem for artificial neural networks that thei...

Bio-Inspired, Task-Free Continual Learning through Activity Regularization (12/08/2022)
The ability to sequentially learn multiple tasks without forgetting is a...

Sparse Coding in a Dual Memory System for Lifelong Learning (12/28/2022)
Efficient continual learning in humans is enabled by a rich set of neuro...

CoDeC: Communication-Efficient Decentralized Continual Learning (03/27/2023)
Training at the edge utilizes continuously evolving data generated at di...
