Sparse Coding in a Dual Memory System for Lifelong Learning

12/28/2022
by Fahad Sarfraz, et al.

Efficient continual learning in humans is enabled by a rich set of neurophysiological mechanisms and interactions between multiple memory systems. The brain encodes information efficiently in non-overlapping sparse codes, which enables faster learning of new associations while controlling interference with previous ones. To mimic sparse coding in DNNs, we enforce activation sparsity along with a dropout mechanism that encourages the model to activate similar units for semantically similar inputs while reducing the overlap with activation patterns of semantically dissimilar inputs. This provides an efficient mechanism for balancing the reusability of features against interference between them, depending on the similarity of classes across tasks. Furthermore, we employ sparse coding in a multiple-memory replay mechanism: our method maintains an additional long-term semantic memory that aggregates and consolidates the information encoded in the synaptic weights of the working model. Our extensive evaluation and analysis of the model's characteristics show that, equipped with these biologically inspired mechanisms, the model can further mitigate forgetting.
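The abstract does not give implementation details, but the mechanisms it names map onto a few standard components. Below is a minimal PyTorch-style sketch, assuming a k-winners-take-all layer for activation sparsity, an activation-count-based ("heterogeneous") dropout schedule, and an exponential-moving-average update for the long-term semantic memory. The names `KWinnerSparsity`, `heterogeneous_dropout_probs`, `ema_consolidate`, and all parameter values are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn

class KWinnerSparsity(nn.Module):
    """Keep only the top-k activations per sample; zero the rest.
    Sketch for 2D (batch, units) inputs; conv maps would need flattening."""
    def __init__(self, sparsity=0.1):
        super().__init__()
        self.sparsity = sparsity  # fraction of units allowed to fire

    def forward(self, x):
        k = max(1, int(self.sparsity * x.shape[1]))
        topk_vals, _ = torch.topk(x, k, dim=1)          # sorted descending
        threshold = topk_vals[:, -1].unsqueeze(1)        # k-th largest per row
        return x * (x >= threshold).float()              # mask out the rest

def heterogeneous_dropout_probs(activation_counts, temperature=1.0):
    """Units that fired often get a higher drop probability, pushing
    new inputs toward less-used units (assumed form of the schedule)."""
    norm = activation_counts / (activation_counts.max() + 1e-8)
    return 1.0 - torch.exp(-norm * temperature)

@torch.no_grad()
def ema_consolidate(semantic_model, working_model, decay=0.999):
    """Aggregate working-model weights into the long-term semantic
    memory via an exponential moving average."""
    for s, w in zip(semantic_model.parameters(), working_model.parameters()):
        s.mul_(decay).add_(w, alpha=1.0 - decay)
```

A usage sketch under the same assumptions: sparsify a batch of hidden activations, track per-unit firing counts, and derive a keep-mask for the next batch.

```python
layer = KWinnerSparsity(sparsity=0.1)
h = torch.relu(torch.randn(32, 512))
h_sparse = layer(h)

counts = (h_sparse > 0).float().sum(dim=0)   # per-unit firing counts
drop_p = heterogeneous_dropout_probs(counts)
mask = torch.bernoulli(1.0 - drop_p)         # keep-mask biased to rare units
```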

research
06/08/2022

SYNERgy between SYNaptic consolidation and Experience Replay for general continual learning

Continual learning (CL) in the brain is facilitated by a complex set of ...
research
04/13/2023

A Study of Biologically Plausible Neural Network: The Role and Interactions of Brain-Inspired Mechanisms in Continual Learning

Humans excel at continually acquiring, consolidating, and retaining info...
research
03/12/2022

Sparsity and Heterogeneous Dropout for Continual Learning in the Null Space of Neural Activations

Continual/lifelong learning from a non-stationary input data stream is a...
research
07/15/2021

Algorithmic insights on continual learning from fruit flies

Continual learning in computational systems is challenging due to catast...
research
12/08/2022

Bio-Inspired, Task-Free Continual Learning through Activity Regularization

The ability to sequentially learn multiple tasks without forgetting is a...
research
01/05/2023

Competitive learning to generate sparse representations for associative memory

One of the most well-established brain principles, Hebbian learning, has...
research
02/09/2022

A Neural Network Model of Continual Learning with Cognitive Control

Neural networks struggle in continual learning settings from catastrophi...
