Overcoming Catastrophic Forgetting with Gaussian Mixture Replay

04/19/2021
by Benedikt Pfülb, et al.

We present Gaussian Mixture Replay (GMR), a rehearsal-based approach for continual learning (CL) built on Gaussian Mixture Models (GMMs). CL approaches are intended to tackle the problem of catastrophic forgetting (CF), which occurs when Deep Neural Networks (DNNs) are trained sequentially on successive sub-tasks. GMR mitigates CF by generating samples from previous tasks and merging them with the current training data. GMMs serve several purposes here: sample generation, density estimation (e.g., for detecting outliers or recognizing task boundaries) and providing a high-level feature representation for classification. GMR has several conceptual advantages over existing replay-based CL approaches. First, GMR achieves sample generation, classification and density estimation in a single network structure with strongly reduced memory requirements. Second, it can be trained at constant time complexity w.r.t. the number of sub-tasks, making it particularly suitable for life-long learning. Furthermore, GMR minimizes a differentiable loss function and appears to avoid mode collapse. In addition, task boundaries can be detected by applying GMM density estimation. Lastly, GMR does not require access to future sub-tasks for hyper-parameter tuning, allowing CL under real-world constraints. We evaluate GMR on multiple image datasets, each divided into class-disjoint sub-tasks.
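The core GMR loop described above (fit a GMM on a task, then reuse it both to replay old samples into the next task's training set and to flag distribution shift via density estimation) can be sketched as follows. This is a minimal illustration using scikit-learn's `GaussianMixture`, not the authors' implementation; all data and component counts are invented for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Task 1 data: two Gaussian blobs standing in for two classes (toy data).
task1 = np.vstack([rng.normal(0.0, 0.5, size=(200, 2)),
                   rng.normal(3.0, 0.5, size=(200, 2))])

# Fit a GMM on task 1; it serves as both a generator and a density estimator.
gmm = GaussianMixture(n_components=4, random_state=0).fit(task1)

# Task 2 arrives: draw replayed task-1 samples from the GMM and merge
# them with the new data, so training on the merged set rehearses task 1.
task2 = rng.normal(-3.0, 0.5, size=(200, 2))
replayed, _ = gmm.sample(len(task2))
merged = np.vstack([replayed, task2])
print(merged.shape)  # (400, 2)

# Density estimation exposes the task boundary: the mean log-likelihood of
# the unseen task-2 data is far below that of in-distribution task-1 data.
print(gmm.score(task1) > gmm.score(task2))  # True
```

In the full method a single GMM layer additionally provides the feature representation for classification; here the generator and density estimator alone are enough to show the replay-and-boundary-detection mechanism.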


Related research:

- Continual Learning with Fully Probabilistic Models (04/19/2021)
- Lifelong GAN: Continual Learning for Conditional Image Generation (07/23/2019)
- Dealing with Cross-Task Class Discrimination in Online Continual Learning (05/24/2023)
- An Investigation of Replay-based Approaches for Continual Learning (08/15/2021)
- FFNB: Forgetting-Free Neural Blocks for Deep Continual Visual Learning (11/22/2021)
- Variational Density Propagation Continual Learning (08/22/2023)
- Gaussian Gated Linear Networks (06/10/2020)
