Continual Learning with Invertible Generative Models

02/11/2022
by Jary Pomponi, et al.

Catastrophic forgetting (CF) occurs whenever a neural network overwrites past knowledge while being trained on new tasks. Common techniques to mitigate CF include regularization of the weights (using, e.g., their importance on past tasks) and rehearsal strategies, where the network is repeatedly re-trained on past data. Generative models have also been applied to the latter, to provide an endless source of data. In this paper, we propose a novel method that combines the strengths of regularization and generative-based rehearsal approaches. Our generative model consists of a normalizing flow (NF), a probabilistic and invertible neural network, trained on the internal embeddings of the network. By keeping a single NF throughout the training process, we show that our memory overhead remains constant. In addition, exploiting the invertibility of the NF, we propose a simple approach to regularize the network's embeddings with respect to past tasks. We show that our method performs favorably with respect to state-of-the-art approaches in the literature, with bounded computational power and memory overheads.
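
To make the two ingredients above concrete, here is a minimal PyTorch sketch (not the authors' code) of this kind of setup: a single RealNVP-style normalizing flow fitted by maximum likelihood to the backbone's internal embeddings, whose exact invertibility lets us sample pseudo-embeddings from the Gaussian base for rehearsal or embedding regularization. All names and hyperparameters (EMB_DIM, AffineCoupling, Flow, nll, layer sizes) are illustrative assumptions, not details from the paper.

import math
import torch
import torch.nn as nn

EMB_DIM = 64  # assumed size of the backbone's internal embeddings

class AffineCoupling(nn.Module):
    # RealNVP-style coupling: scales/shifts half the dimensions
    # conditioned on the other half; exactly invertible in closed form.
    def __init__(self, dim, flip=False):
        super().__init__()
        self.flip = flip
        self.net = nn.Sequential(
            nn.Linear(dim // 2, 128), nn.ReLU(),
            nn.Linear(128, dim),  # outputs scale and shift, dim//2 each
        )

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        if self.flip:
            x1, x2 = x2, x1
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                        # bounded scales for stability
        y2 = x2 * torch.exp(s) + t
        y = torch.cat((y2, x1) if self.flip else (x1, y2), dim=-1)
        return y, s.sum(-1)                      # log|det J| of this layer

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=-1)
        if self.flip:
            y1, y2 = y2, y1
        s, t = self.net(y1).chunk(2, dim=-1)
        s = torch.tanh(s)
        x2 = (y2 - t) * torch.exp(-s)
        return torch.cat((x2, y1) if self.flip else (y1, x2), dim=-1)

class Flow(nn.Module):
    # A single stack of couplings shared across all tasks, so the
    # generative memory does not grow with the number of tasks.
    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            AffineCoupling(dim, flip=i % 2 == 1) for i in range(n_layers))

    def forward(self, x):
        logdet = torch.zeros(x.size(0), device=x.device)
        for layer in self.layers:
            x, ld = layer(x)
            logdet = logdet + ld
        return x, logdet

    def inverse(self, z):
        for layer in reversed(self.layers):
            z = layer.inverse(z)
        return z

def nll(flow, emb):
    # Negative log-likelihood under a standard Gaussian base distribution.
    z, logdet = flow(emb)
    log_pz = -0.5 * (z ** 2).sum(-1) - 0.5 * emb.size(-1) * math.log(2 * math.pi)
    return -(log_pz + logdet).mean()

flow = Flow(EMB_DIM)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)

emb = torch.randn(32, EMB_DIM)      # stand-in for current-task embeddings
loss = nll(flow, emb)               # fit the flow to the embedding distribution
opt.zero_grad(); loss.backward(); opt.step()

z = torch.randn(16, EMB_DIM)        # sample the base distribution ...
pseudo = flow.inverse(z)            # ... and invert into pseudo-embeddings
# 'pseudo' can then be replayed through the classifier head, or used to
# penalize drift of the embedding network on past-task regions.

Because the flow is invertible, generating pseudo-data is a single deterministic pass from base noise back to embedding space, and the same fixed-size model serves every task, which is consistent with the constant memory overhead claimed in the abstract.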

