Pseudo-Rehearsal for Continual Learning with Normalizing Flows

07/05/2020
by Jary Pomponi, et al.

Catastrophic forgetting (CF) happens whenever a neural network overwrites past knowledge while being trained on new tasks. Common techniques to handle CF include regularization of the weights (using, e.g., their importance on past tasks) and rehearsal strategies, where the network is continually re-trained on past data. Generative models have also been applied to the latter, providing a virtually endless source of past data. In this paper, we propose a novel method that combines the strengths of regularization and generative-based rehearsal approaches. Our generative model is a normalizing flow (NF), a probabilistic and invertible neural network, trained on the internal embeddings of the network. By keeping a single NF conditioned on the task, we show that the memory overhead remains constant. In addition, exploiting the invertibility of the NF, we propose a simple approach to regularize the network's embeddings with respect to past tasks. We show that our method performs favorably against state-of-the-art approaches in the literature, with bounded computation and memory overheads.

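The abstract sketches the core mechanism: a single normalizing flow, conditioned on the task identity, is trained on the backbone's embeddings and later inverted to sample pseudo-embeddings for rehearsal. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the coupling architecture, dimensions, and names such as `TaskConditionedFlow`, `backbone`, and `head` are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

EMB_DIM = 64      # dimensionality of the backbone's embeddings (assumed)
N_TASKS = 5       # number of tasks in the stream (assumed)

class AffineCoupling(nn.Module):
    """One RealNVP-style affine coupling layer, conditioned on a task embedding."""
    def __init__(self, dim, task_dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + task_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x, t):
        # x -> z; returns the transformed vector and log|det J|
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, b = self.net(torch.cat([x1, t], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                      # keep the scales bounded
        z2 = x2 * torch.exp(s) + b
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)

    def inverse(self, z, t):
        # z -> x; used when sampling pseudo-embeddings
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, b = self.net(torch.cat([z1, t], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        x2 = (z2 - b) * torch.exp(-s)
        return torch.cat([z1, x2], dim=1)

class TaskConditionedFlow(nn.Module):
    """Stack of couplings with feature flips in between; base density is N(0, I)."""
    def __init__(self, dim, task_dim=16, n_tasks=N_TASKS, n_layers=4):
        super().__init__()
        self.dim = dim
        self.layers = nn.ModuleList(
            [AffineCoupling(dim, task_dim) for _ in range(n_layers)])
        self.task_emb = nn.Embedding(n_tasks, task_dim)

    def log_prob(self, x, task_ids):
        t = self.task_emb(task_ids)
        z, log_det = x, torch.zeros(x.size(0), device=x.device)
        for layer in self.layers:
            z, ld = layer(z, t)
            log_det = log_det + ld
            z = z.flip([1])                    # fixed permutation, |det| = 1
        base = -0.5 * (z ** 2).sum(dim=1) - 0.5 * self.dim * math.log(2 * math.pi)
        return base + log_det

    @torch.no_grad()
    def sample(self, n, task_id):
        t = self.task_emb(torch.full((n,), task_id, dtype=torch.long))
        z = torch.randn(n, self.dim)
        for layer in reversed(self.layers):
            z = z.flip([1])                    # undo the flip, then the coupling
            z = layer.inverse(z, t)
        return z                               # pseudo-embeddings for rehearsal

# Illustrative usage: fit the flow to current-task embeddings by maximum
# likelihood, and replay sampled embeddings of an earlier task through the
# shared classification head.
backbone = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, EMB_DIM))
head = nn.Linear(EMB_DIM, 10)                  # classification head (assumed shared)
flow = TaskConditionedFlow(EMB_DIM)

x = torch.randn(32, 784)                       # a batch from the current task (id 2)
emb = backbone(x)
nll = -flow.log_prob(emb.detach(), torch.full((32,), 2, dtype=torch.long)).mean()

replay = flow.sample(32, task_id=0)            # "rehearsed" embeddings of task 0
logits_replay = head(replay)                   # train the head on replayed features
```

In the paper, the invertible flow is additionally used to regularize the network's embeddings with respect to past tasks; that part is omitted from this sketch for brevity.
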
Related research

02/11/2022 · Continual Learning with Invertible Generative Models
Catastrophic forgetting (CF) happens whenever a neural network overwrite...

09/09/2019 · Efficient Continual Learning in Neural Networks with Embedding Regularization
Continual learning of deep neural networks is a key requirement for scal...

01/17/2022 · Logarithmic Continual Learning
We introduce a neural network architecture that logarithmically reduces ...

06/03/2022 · Effects of Auxiliary Knowledge on Continual Learning
In Continual Learning (CL), a neural network is trained on a stream of d...

03/09/2020 · FoCL: Feature-Oriented Continual Learning for Generative Models
In this paper, we propose a general framework in continual learning for ...

03/17/2021 · Gradient Projection Memory for Continual Learning
The ability to learn continually without forgetting the past tasks is a ...

04/30/2023 · DualHSIC: HSIC-Bottleneck and Alignment for Continual Learning
Rehearsal-based approaches are a mainstay of continual learning (CL). Th...
