Memory Aware Synapses: Learning what (not) to forget

11/27/2017
by Rahaf Aljundi, et al.

Humans can learn in a continuous manner. Old, rarely used knowledge can be overwritten by new incoming information, while important, frequently used knowledge is prevented from being erased. In artificial learning systems, lifelong learning has so far focused mainly on accumulating knowledge over tasks and overcoming catastrophic forgetting. In this paper, we argue that, given the limited model capacity and the unlimited new information to be learned, knowledge has to be preserved or erased selectively. Inspired by neuroplasticity, we propose an online method to compute the importance of the parameters of a neural network, based on the data that the network is actively applied to, in an unsupervised manner. After learning a task, whenever a sample is fed to the network, we accumulate an importance measure for each parameter of the network, based on how sensitive the predicted output is to a change in that parameter. When learning a new task, changes to important parameters are penalized. We show that a local version of our method is a direct application of Hebb's rule in identifying the important connections between neurons. We test our method on a sequence of object recognition tasks and on the challenging problem of learning an embedding in a continuous manner. We show state-of-the-art performance and the ability to adapt the importance of the parameters towards what the network needs (not) to forget, which can differ across test conditions.
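As a concrete illustration of the two steps the abstract describes (unsupervised accumulation of per-parameter importance, then a penalty on moving important parameters), here is a minimal PyTorch sketch. It assumes `model` is any network whose forward pass returns the output F(x; θ) and that `unlabeled_inputs` yields single input tensors; the names `compute_importance`, `mas_penalty`, `omega`, `old_params`, and `lam` are illustrative, not the paper's released code.

```python
# Minimal sketch of the MAS idea: importance = average absolute gradient of
# the squared L2 norm of the output w.r.t. each parameter (no labels needed).
import torch

def compute_importance(model, unlabeled_inputs, device="cpu"):
    """Accumulate omega over data the network is applied to."""
    omega = {n: torch.zeros_like(p) for n, p in model.named_parameters()
             if p.requires_grad}
    model.eval()
    n_seen = 0
    for x in unlabeled_inputs:                 # one sample at a time, so the
        model.zero_grad()                      # per-sample |gradient| is exact
        out = model(x.unsqueeze(0).to(device))
        out.pow(2).sum().backward()            # d ||F(x; theta)||^2 / d theta
        for n, p in model.named_parameters():
            if p.grad is not None:
                omega[n] += p.grad.abs()
        n_seen += 1
    return {n: g / max(n_seen, 1) for n, g in omega.items()}

def mas_penalty(model, old_params, omega, lam=1.0):
    """Regularizer added to the new task's loss: penalize changes to
    parameters that were important for the previous task(s)."""
    loss = torch.zeros((), device=next(model.parameters()).device)
    for n, p in model.named_parameters():
        if n in omega:
            loss = loss + (omega[n] * (p - old_params[n]).pow(2)).sum()
    return lam * loss
```

In use, one would snapshot `old_params = {n: p.detach().clone() for n, p in model.named_parameters()}` after finishing a task, compute `omega` on whatever unlabeled data the network is actively applied to, and then train the next task with `task_loss + mas_penalty(model, old_params, omega, lam)`. The local, Hebbian-style variant mentioned in the abstract is not sketched here.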

Related research

12/04/2018 · Overcoming Catastrophic Forgetting by Soft Parameter Pruning
Catastrophic forgetting is a challenging issue in continual learning when ...

06/22/2019 · Beneficial perturbation network for continual learning
Sequential learning of multiple tasks in artificial neural networks usin...

12/12/2019 · L3DOR: Lifelong 3D Object Recognition
3D object recognition has been widely applied. However, most state-of-th...

12/19/2019 · Overcoming Long-term Catastrophic Forgetting through Adversarial Neural Pruning and Synaptic Consolidation
Enabling a neural network to sequentially learn multiple tasks is of gre...

06/14/2018 · Selfless Sequential Learning
Sequential learning studies the problem of learning tasks in a sequence ...

02/20/2023 · InOR-Net: Incremental 3D Object Recognition Network for Point Cloud Representation
3D object recognition has successfully become an appealing research topi...

05/01/2018 · Adaptive Scaling for Sparse Detection in Information Extraction
This paper focuses on detection tasks in information extraction, where p...
