
Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization

by   Nicolas Y. Masse, et al.
The University of Chicago

Humans and most animals can learn new tasks without forgetting old ones. However, training artificial neural networks (ANNs) on new tasks typically causes them to forget previously learned tasks. This phenomenon is the result of "catastrophic forgetting", in which training an ANN disrupts connection weights that were important for solving previous tasks, degrading task performance. Several recent studies have proposed methods to stabilize the connection weights of ANNs that are deemed most important for solving a task, which helps alleviate catastrophic forgetting. Here, drawing inspiration from algorithms that are believed to be implemented in vivo, we propose a complementary method: adding a context-dependent gating signal, such that only sparse, mostly non-overlapping patterns of units are active for any one task. This method is easy to implement, requires little computational overhead, and, when combined with weight stabilization, allows ANNs to maintain high performance across large numbers of sequentially presented tasks. This work provides another example of how neuroscience-inspired algorithms can benefit ANN design and capability.
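The core idea of context-dependent gating can be sketched in a few lines: each task is assigned a fixed, sparse binary mask over hidden units, and only the masked units are active (and thus updated) for that task. The sketch below is a minimal illustration in NumPy, assuming a single fully connected ReLU layer; the layer sizes, the 20% sparsity level, and all names are illustrative choices, not the paper's exact architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

N_HIDDEN = 100   # hidden units in the gated layer (illustrative)
N_TASKS = 5      # number of sequentially presented tasks
KEEP_FRAC = 0.2  # fraction of units left active per task

# One fixed binary mask per task. Masks are drawn independently at
# random, so they are sparse and mostly non-overlapping: the expected
# overlap between two tasks is only KEEP_FRAC**2 of the units.
masks = np.zeros((N_TASKS, N_HIDDEN))
for t in range(N_TASKS):
    active = rng.choice(N_HIDDEN, size=int(KEEP_FRAC * N_HIDDEN),
                        replace=False)
    masks[t, active] = 1.0

def gated_forward(x, W, b, task_id):
    """Forward pass through one layer, gated by the current task's mask."""
    h = np.maximum(0.0, x @ W + b)  # ReLU activity before gating
    return h * masks[task_id]       # silence units not assigned to this task

# Because gradients only flow through active units, training on task B
# leaves most of the weights used by task A untouched; combining this
# with a weight-stabilization penalty protects the remaining overlap.
```

Since inactive units contribute nothing to the forward pass, their incoming and outgoing weights receive no gradient for that task, which is why the gating confines most interference to the small overlap between masks.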


Related Research

Localizing Catastrophic Forgetting in Neural Networks
Synaptic Metaplasticity in Binarized Neural Networks
Overcoming catastrophic forgetting in neural networks
Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting
Artificial Neuronal Ensembles with Learned Context Dependent Gating
A sparse code for neuro-dynamic programming and optimal control
The Role of Bio-Inspired Modularity in General Learning

Code Repositories


Algorithm to alleviate catastrophic forgetting in neural networks by gating hidden units
