Condensed Composite Memory Continual Learning

02/19/2021
by Felix Wiewel, et al.

Deep Neural Networks (DNNs) suffer a rapid drop in performance when trained on a sequence of tasks in which only the data of the most recent task is available. This phenomenon, known as catastrophic forgetting, prevents DNNs from accumulating knowledge over time. Overcoming catastrophic forgetting and enabling continual learning is of great interest, since it would allow DNNs to be applied in settings where unrestricted access to all training data at all times is not possible, e.g., due to storage limitations or legal issues. While many recently proposed methods for continual learning store some training examples for rehearsal, their performance depends strongly on the number of stored examples. To improve the performance of rehearsal, especially when only few examples can be stored, we propose a novel way of learning a small set of synthetic examples that captures the essence of a complete dataset. Instead of learning these synthetic examples directly, we learn each example as a weighted combination of shared components, which significantly increases memory efficiency. We demonstrate the performance of our method on commonly used datasets and compare it to recently proposed related methods and baselines.
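The abstract describes the key idea only at a high level: each stored synthetic example is expressed as a weighted combination of shared, learnable components, so the memory footprint grows with a small weight matrix rather than with full images. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration, assuming a softmax-normalized mixing matrix and a gradient-matching objective of the kind commonly used for dataset condensation. All names (CompositeMemory, gradient_matching_loss) and hyperparameters are illustrative.

```python
# Minimal sketch (not the paper's code): a rehearsal memory parameterized as
# weighted combinations of shared components. Names and losses are assumptions.
import torch
import torch.nn as nn


class CompositeMemory(nn.Module):
    def __init__(self, n_examples, n_components, image_shape):
        super().__init__()
        # Shared components ("dictionary") reused by all synthetic examples.
        self.components = nn.Parameter(0.1 * torch.randn(n_components, *image_shape))
        # Per-example mixing weights: memory grows with this small matrix,
        # not with the number of full-resolution images.
        self.weights = nn.Parameter(torch.randn(n_examples, n_components))

    def forward(self):
        # Each synthetic example is a weighted sum of the shared components.
        w = torch.softmax(self.weights, dim=1)        # illustrative normalization
        flat = self.components.flatten(start_dim=1)   # (K, D)
        return (w @ flat).view(-1, *self.components.shape[1:])


def gradient_matching_loss(model, criterion, real_x, real_y, syn_x, syn_y):
    """Illustrative condensation objective: make classifier gradients on the
    synthetic memory resemble gradients on real data of the current task."""
    g_real = torch.autograd.grad(criterion(model(real_x), real_y), model.parameters())
    g_syn = torch.autograd.grad(criterion(model(syn_x), syn_y), model.parameters(),
                                create_graph=True)  # keep graph to update memory
    return sum(((gr.detach() - gs) ** 2).sum() for gr, gs in zip(g_real, g_syn))


# Example usage (shapes for MNIST-sized images, purely illustrative):
# memory = CompositeMemory(n_examples=10, n_components=4, image_shape=(1, 28, 28))
# syn_x = memory()                          # (10, 1, 28, 28) synthetic images
# loss = gradient_matching_loss(model, nn.CrossEntropyLoss(), x, y, syn_x, syn_y)
# loss.backward(); optimizer.step()         # optimizer over memory.parameters()
```

Because only the weight matrix scales with the number of stored examples, adding further synthetic examples is far cheaper than storing additional real images, which is the memory-efficiency argument made in the abstract.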

Related research

- Few-Shot Self Reminder to Overcome Catastrophic Forgetting (12/03/2018)
- Continual Learning with Deep Learning Methods in an Application-Oriented Context (07/12/2022)
- In-memory Realization of In-situ Few-shot Continual Learning with a Dynamically Evolving Explicit Memory (07/14/2022)
- Continual Learning with Dependency Preserving Hypernetworks (09/16/2022)
- Rethinking Quadratic Regularizers: Explicit Movement Regularization for Continual Learning (02/04/2021)
- Schematic Memory Persistence and Transience for Efficient and Robust Continual Learning (05/05/2021)
- EEC: Learning to Encode and Regenerate Images for Continual Learning (01/13/2021)
