Gradient Episodic Memory with a Soft Constraint for Continual Learning

11/16/2020
by Guannan Hu, et al.

Catastrophic forgetting is a common destructive phenomenon in gradient-based neural networks that learn sequential tasks: performance on previously learned tasks drops sharply when the model learns a novel task. This is very different from forgetting in humans, who learn and accumulate knowledge throughout their lives. To alleviate the problem, a model should be able to acquire new knowledge while preserving the knowledge it has already learned. We extend average gradient episodic memory (A-GEM) with a soft constraint ϵ ∈ [0, 1] that serves as a balance factor between learning new knowledge and preserving learned knowledge; we call the resulting method gradient episodic memory with a soft constraint ϵ (ϵ-SOFT-GEM). ϵ-SOFT-GEM outperforms A-GEM and several continual-learning baselines within a single training epoch; it matches A-GEM's efficiency in computation and memory, achieves state-of-the-art average accuracy, and provides a better trade-off between the stability of preserving learned knowledge and the plasticity of learning new knowledge.
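For context on the mechanics: A-GEM (Chaudhry et al., 2019) enforces a hard constraint on each update. When the new-task gradient g conflicts with the average gradient g_ref computed on the episodic memory (i.e., g·g_ref < 0), g is projected so that the result no longer decreases memory performance to first order. The sketch below illustrates one plausible reading of the soft constraint, assuming ϵ linearly blends the raw gradient (ϵ = 0, maximal plasticity) with the A-GEM projection (ϵ = 1, maximal stability); the paper's exact ϵ-SOFT-GEM update rule is not given in this abstract, so treat eps_soft_gem_step as a hypothetical illustration rather than the authors' verified method.

import numpy as np

def a_gem_project(g, g_ref):
    """Hard A-GEM projection: if g conflicts with the average
    episodic-memory gradient g_ref, remove the conflicting component
    so that the projected gradient satisfies <g~, g_ref> >= 0."""
    dot = g @ g_ref
    if dot >= 0.0:  # no conflict: keep the new-task gradient as-is
        return g
    return g - (dot / (g_ref @ g_ref)) * g_ref

def eps_soft_gem_step(g, g_ref, eps):
    """Hypothetical soft-constrained update: blend the raw gradient
    (plasticity) with the A-GEM-projected gradient (stability).
    eps = 0 recovers plain SGD on the new task; eps = 1 recovers A-GEM.
    NOTE: this interpolation is an assumption, not the paper's rule."""
    assert 0.0 <= eps <= 1.0
    return (1.0 - eps) * g + eps * a_gem_project(g, g_ref)

# Toy usage: two conflicting gradients in R^2.
g = np.array([1.0, -1.0])     # gradient on the current task
g_ref = np.array([0.0, 1.0])  # average gradient on episodic memory
for eps in (0.0, 0.5, 1.0):
    print(eps, eps_soft_gem_step(g, g_ref, eps))

Under this reading, intermediate values of ϵ trade plasticity for stability continuously, instead of A-GEM's all-or-nothing projection whenever the memory constraint is violated.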

Related research

12/03/2018 · Few-Shot Self Reminder to Overcome Catastrophic Forgetting
Deep neural networks are known to suffer the catastrophic forgetting pro...

03/06/2020 · Triple Memory Networks: a Brain-Inspired Method for Continual Learning
Continual acquisition of novel experience without interfering previously...

10/09/2021 · Flattening Sharpness for Dynamic Gradient Projection Memory Benefits Continual Learning
The backpropagation networks are notably susceptible to catastrophic for...

10/14/2021 · Carousel Memory: Rethinking the Design of Episodic Memory for Continual Learning
Continual Learning (CL) is an emerging machine learning paradigm that ai...

05/28/2019 · Single-Net Continual Learning with Progressive Segmented Training (PST)
There is an increasing need of continual learning in dynamic systems, su...

11/28/2022 · Progressive Learning without Forgetting
Learning from changing tasks and sequential experience without forgettin...

07/11/2022 · Consistency is the key to further mitigating catastrophic forgetting in continual learning
Deep neural networks struggle to continually learn multiple sequential t...
