Triple Memory Networks: a Brain-Inspired Method for Continual Learning

03/06/2020
by Liyuan Wang, et al.

Continual acquisition of novel experience without interfering with previously learned knowledge, i.e., continual learning, is critical for artificial neural networks, but is limited by catastrophic forgetting: a neural network adjusts its parameters when learning a new task and then fails to perform the old tasks well. By contrast, the brain has a powerful ability to continually learn new experience without catastrophic interference. The underlying neural mechanisms are possibly attributable to the interplay of the hippocampus-dependent and neocortex-dependent memory systems, mediated by the prefrontal cortex. Specifically, the two memory systems develop specialized mechanisms to consolidate information in more specific and more generalized forms, respectively, and complement each other's form of information through their interplay. Inspired by this brain strategy, we propose a novel approach named triple memory networks (TMNs) for continual learning. TMNs model the interplay of the hippocampus, the prefrontal cortex, and the sensory cortex (a neocortex region) as a triple-network architecture of generative adversarial networks (GANs). The input information is encoded as a specific representation of the data distributions in a generator, or as generalized knowledge for solving tasks in a discriminator and a classifier, with appropriate brain-inspired algorithms implemented to alleviate catastrophic forgetting in each module. In particular, the generator replays generated data of the learned tasks to the discriminator and the classifier, both of which are trained with a weight consolidation regularizer to compensate for the information lost in the generation process. TMNs achieve new state-of-the-art performance on a variety of class-incremental learning benchmarks on MNIST, SVHN, CIFAR-10, and ImageNet-50, compared with strong baseline methods.
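The abstract describes the triple-network design only at a high level. Below is a minimal PyTorch sketch of how the three modules and the two anti-forgetting mechanisms (generative replay plus a weight-consolidation penalty) could fit together. The MLP module sizes, the EWC-style quadratic penalty, treating replayed samples as "real" for the discriminator, labelling replayed samples with a frozen old classifier, and names such as train_step, consolidation_penalty, fishers, and lam are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

Z_DIM, X_DIM, N_CLASSES = 64, 784, 10   # illustrative sizes (e.g. flattened MNIST)

class Generator(nn.Module):
    """Hippocampus analogue: specific representations of data distributions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(Z_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, X_DIM), nn.Tanh())
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Prefrontal-cortex analogue: generalized real/fake knowledge."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(X_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, 1))
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):
    """Sensory-cortex analogue: generalized task (label) knowledge."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(X_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, N_CLASSES))
    def forward(self, x):
        return self.net(x)

def snapshot(model):
    """Copy of the parameters after finishing a task (consolidation anchor)."""
    return {n: p.detach().clone() for n, p in model.named_parameters()}

def consolidation_penalty(model, fisher, anchor):
    """EWC-style quadratic penalty (an assumption for the paper's 'weight
    consolidation regularizer'): pull parameters toward their old values,
    weighted by an importance estimate such as squared gradients."""
    return sum((fisher[n] * (p - anchor[n]) ** 2).sum()
               for n, p in model.named_parameters())

def train_step(G, D, C, old_G, old_C, fishers, anchors,
               x_real, y_real, opt_G, opt_D, opt_C, lam=100.0):
    bsz = x_real.size(0)

    # Generative replay: a frozen generator from earlier tasks resupplies
    # samples of the old data distributions.
    x_replay = None
    if old_G is not None:
        with torch.no_grad():
            x_replay = old_G(torch.randn(bsz, Z_DIM))

    # Discriminator: GAN loss on current data, replayed samples treated as
    # real so old modes are retained, plus weight consolidation.
    z = torch.randn(bsz, Z_DIM)
    x_fake = G(z).detach()
    d_loss = (F.binary_cross_entropy_with_logits(D(x_real), torch.ones(bsz, 1)) +
              F.binary_cross_entropy_with_logits(D(x_fake), torch.zeros(bsz, 1)))
    if x_replay is not None:
        d_loss = d_loss + F.binary_cross_entropy_with_logits(
            D(x_replay), torch.ones(bsz, 1))
    if 'D' in fishers:
        d_loss = d_loss + lam * consolidation_penalty(D, fishers['D'], anchors['D'])
    opt_D.zero_grad(); d_loss.backward(); opt_D.step()

    # Generator: fool the discriminator.
    g_loss = F.binary_cross_entropy_with_logits(D(G(z)), torch.ones(bsz, 1))
    opt_G.zero_grad(); g_loss.backward(); opt_G.step()

    # Classifier: current data, plus replayed data labelled by the frozen old
    # classifier, plus weight consolidation.
    c_loss = F.cross_entropy(C(x_real), y_real)
    if x_replay is not None and old_C is not None:
        with torch.no_grad():
            y_replay = old_C(x_replay).argmax(dim=1)
        c_loss = c_loss + F.cross_entropy(C(x_replay), y_replay)
    if 'C' in fishers:
        c_loss = c_loss + lam * consolidation_penalty(C, fishers['C'], anchors['C'])
    opt_C.zero_grad(); c_loss.backward(); opt_C.step()
```

In this sketch, after finishing each task one would freeze copies of the generator and classifier (e.g. via copy.deepcopy), take parameter snapshots as the consolidation anchors, and re-estimate the importance weights before training on the next task.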

Related research

11/16/2020
Gradient Episodic Memory with a Soft Constraint for Continual Learning
Catastrophic forgetting in continual learning is a common destructive ph...

06/16/2018
DynMat, a network that can learn after learning
To survive in the dynamically-evolving world, we accumulate knowledge an...

01/17/2023
Artificial Neuronal Ensembles with Learned Context Dependent Gating
Biological neural networks are capable of recruiting different sets of n...

09/29/2018
Continuous Learning of Context-dependent Processing in Neural Networks
Deep artificial neural networks (DNNs) are powerful tools for recognitio...

10/26/2021
Brain-inspired feature exaggeration in generative replay for continual learning
The catastrophic forgetting of previously learnt classes is one of the m...

09/18/2023
Adaptive Reorganization of Neural Pathways for Continual Learning with Hybrid Spiking Neural Networks
The human brain can self-organize rich and diverse sparse neural pathway...

10/09/2021
Cognitively Inspired Learning of Incremental Drifting Concepts
Humans continually expand their learned knowledge to new domains and lea...
