Overcoming Long-term Catastrophic Forgetting through Adversarial Neural Pruning and Synaptic Consolidation

12/19/2019
by Jian Peng, Bo Tang, et al.

Enabling a neural network to learn multiple tasks sequentially is of great significance for expanding the applicability of neural networks to realistic application scenarios. However, as the task sequence grows, the model quickly forgets previously learned skills; we refer to this loss of memory over long task sequences as long-term catastrophic forgetting. There are two main causes of long-term forgetting: first, as tasks accumulate, the intersection of the low-error parameter subspaces that satisfy all of them shrinks and may eventually vanish; second, errors accumulate in the process of protecting the knowledge of previous tasks. In this paper, we propose an adversarial mechanism in which neural pruning and synaptic consolidation work against each other to overcome long-term catastrophic forgetting. The mechanism distills task-related knowledge into a small number of parameters and retains old knowledge by consolidating only those parameters, while freeing the majority of parameters to learn subsequent tasks; this not only avoids forgetting but also makes it possible to learn a large number of tasks. Specifically, neural pruning iteratively relaxes the parameter constraints of the current task to expand the common parameter subspace shared across tasks. The modified synaptic consolidation strategy comprises two components: a novel importance measure that takes network structure into account when quantifying parameter importance, and an element-wise parameter-update strategy designed to prevent important parameters from being overwritten during subsequent learning. We evaluate the method on image classification, and the results show that our proposed ANPSC approach outperforms state-of-the-art methods. A hyperparameter sensitivity test further demonstrates the robustness of the approach.
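The abstract stops at this level of description, so the following PyTorch sketch is only an illustration of the ingredients it names, not the paper's ANPSC algorithm: parameter importance is approximated here with a Fisher-style squared-gradient proxy (the structure-aware measure is not specified in the abstract), consolidation with an EWC-like quadratic penalty, pruning with a plain magnitude threshold, and the element-wise update rule with a quantile gate. The names `estimate_importance`, `consolidation_loss`, `masked_step`, `prune_mask` and the knobs `lam`, `tau`, `sparsity` are all assumptions.

```python
import torch

def estimate_importance(model, dataloader, loss_fn):
    # Fisher-style proxy: accumulate squared gradients per parameter.
    # (Stand-in for the paper's structure-aware importance measure.)
    importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, y in dataloader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                importance[n] += p.grad.detach() ** 2
    return importance

def consolidation_loss(model, anchor_params, importance, lam=1.0):
    # EWC-like quadratic penalty anchoring important parameters to the
    # values they held after the previous task.
    penalty = 0.0
    for n, p in model.named_parameters():
        penalty = penalty + (importance[n] * (p - anchor_params[n]) ** 2).sum()
    return lam * penalty

def masked_step(optimizer, model, importance, tau=0.9):
    # Element-wise gating: zero the gradients of the most important
    # parameters (above the tau-quantile) so consolidated weights are
    # not overwritten by the new task. tau is a hypothetical knob.
    for n, p in model.named_parameters():
        if p.grad is not None:
            cutoff = torch.quantile(importance[n].flatten(), tau)
            p.grad.mul_((importance[n] <= cutoff).to(p.grad.dtype))
    optimizer.step()

def prune_mask(model, sparsity=0.8):
    # Magnitude pruning: keep only the largest-magnitude weights, so the
    # current task's knowledge is distilled into a small parameter set.
    masks = {}
    for n, p in model.named_parameters():
        cutoff = torch.quantile(p.detach().abs().flatten(), sparsity)
        masks[n] = (p.detach().abs() > cutoff).to(p.dtype)
    return masks
```

Under these assumptions, a continual-learning loop would prune after each task, re-estimate importance on that task's data, and then train the next task with `consolidation_loss` added to the task loss and `masked_step` in place of a plain optimizer step.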


