Attention-Based Structural Plasticity

03/02/2019
by Soheil Kolouri et al.

Catastrophic forgetting (also called catastrophic interference) is a critical problem for lifelong learning machines: it prevents an agent from retaining previously learned knowledge while learning new tasks. Neural networks in particular suffer heavily from this phenomenon. Recently, there have been several efforts toward overcoming catastrophic forgetting in neural networks. Here, we propose a biologically inspired method for overcoming catastrophic forgetting. Specifically, we define an attention-based selective plasticity of synapses, modeled on the cholinergic neuromodulatory system in the brain. We introduce synaptic importance parameters in addition to synaptic weights, and use Hebbian learning in parallel with the backpropagation algorithm to learn these importances in an online and seamless manner. We test the proposed method on benchmark tasks, including Permuted MNIST and Split MNIST, and show competitive performance compared to state-of-the-art methods.
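The abstract only sketches the mechanism at a high level: each synapse carries an importance value, accumulated by a Hebbian rule alongside ordinary backpropagation, and important synapses are then protected from change on later tasks. Below is a minimal PyTorch sketch of that idea. The class name, the specific Hebbian update, the quadratic consolidation penalty, and the hyperparameters (eta, lam) are illustrative assumptions for exposition, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectivePlasticLinear(nn.Linear):
    """Linear layer that accumulates a per-synapse importance via a Hebbian rule."""
    def __init__(self, in_features, out_features, eta=0.01):
        super().__init__(in_features, out_features)
        # One importance value per synaptic weight, updated outside of backprop.
        self.register_buffer("importance", torch.zeros_like(self.weight))
        self.eta = eta  # Hebbian rate (assumed hyperparameter)

    def forward(self, x):
        out = F.linear(x, self.weight, self.bias)
        if self.training:
            with torch.no_grad():
                # Hebbian co-activity: importance grows where pre- and
                # post-synaptic activity are correlated (batch-averaged).
                hebb = torch.einsum("bo,bi->oi", out.abs(), x.abs()) / x.size(0)
                self.importance += self.eta * hebb
        return out

def consolidation_penalty(layer, anchor, lam):
    """Quadratic penalty discouraging changes to important synapses
    (an EWC/SI-style surrogate; the exact form here is an assumption)."""
    return lam * (layer.importance * (layer.weight - anchor) ** 2).sum()

# Usage sketch: snapshot the weights at a task boundary, then add the
# penalty to the next task's loss so important synapses stay stable.
layer = SelectivePlasticLinear(784, 10)
anchor = layer.weight.detach().clone()
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = F.cross_entropy(layer(x), y) + consolidation_penalty(layer, anchor, lam=0.1)
loss.backward()
```

In the paper's framing, the importance signal is attention-modulated, inspired by cholinergic neuromodulation; the plain Hebbian co-activity term above merely stands in for that signal.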

Related Research

06/06/2019 · Localizing Catastrophic Forgetting in Neural Networks
Artificial neural networks (ANNs) suffer from catastrophic forgetting wh...

07/22/2022 · Revisiting Parameter Reuse to Overcome Catastrophic Forgetting in Neural Networks
Neural networks tend to forget previously learned knowledge when continu...

12/12/2018 · An Empirical Study of Example Forgetting during Deep Neural Network Learning
Inspired by the phenomenon of catastrophic forgetting, we investigate th...

01/27/2020 · Uncertainty-based Modulation for Lifelong Learning
The creation of machine learning algorithms for intelligent agents capab...

04/29/2020 · Reducing catastrophic forgetting with learning on synthetic data
Catastrophic forgetting is a problem caused by neural networks' inabilit...

12/19/2019 · Overcoming Long-term Catastrophic Forgetting through Adversarial Neural Pruning and Synaptic Consolidation
Enabling a neural network to sequentially learn multiple tasks is of gre...

05/03/2020 · Continuous Learning in a Single-Incremental-Task Scenario with Spike Features
Deep Neural Networks (DNNs) have two key deficiencies, their dependence ...
