
Marginal Replay vs Conditional Replay for Continual Learning

10/29/2018
by Timothée Lesort, et al.

We present a new replay-based method of continual classification learning that we term "conditional replay" which generates samples and labels together by sampling from a distribution conditioned on the class. We compare conditional replay to another replay-based continual learning paradigm (which we term "marginal replay") that generates samples independently of their class and assigns labels in a separate step. The main improvement in conditional replay is that labels for generated samples need not be inferred, which reduces the margin for error in complex continual classification learning tasks. We demonstrate the effectiveness of this approach using novel and standard benchmarks constructed from MNIST and FashionMNIST data, and compare to the regularization-based EWC method.
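The distinction the abstract draws can be made concrete with a toy sketch. Below is a minimal numpy illustration (not the authors' implementation): the "generator" is a hypothetical class-conditional Gaussian mixture, and `nearest_mean` is a hypothetical stand-in for the separate label-inference step that marginal replay requires. Conditional replay picks the label first and samples from p(x | y), so labels are correct by construction; marginal replay samples x from p(x) and then infers y, which can go wrong near class boundaries.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy class-conditional data: each class is a 2-D Gaussian (assumed setup).
means = {0: np.array([0.0, 0.0]), 1: np.array([4.0, 4.0])}

def conditional_replay(n_per_class):
    """Conditional replay: sample (x, y) jointly from p(x | y) p(y).
    The label is chosen first, so it is correct by construction."""
    xs, ys = [], []
    for c, mu in means.items():
        xs.append(rng.normal(mu, 1.0, size=(n_per_class, 2)))
        ys.append(np.full(n_per_class, c))
    return np.concatenate(xs), np.concatenate(ys)

def marginal_replay(n, classifier):
    """Marginal replay: sample x from the marginal p(x), then assign a
    label with a (possibly imperfect) classifier in a separate step."""
    c = rng.integers(0, 2, size=n)                  # mixture component, hidden
    x = np.stack([rng.normal(means[ci], 1.0) for ci in c])
    y = classifier(x)                               # labels may be wrong
    return x, y

def nearest_mean(x):
    """Hypothetical label-inference step: assign the nearest class mean."""
    d = np.stack([np.linalg.norm(x - mu, axis=1) for mu in means.values()])
    return d.argmin(axis=0)

xc, yc = conditional_replay(100)                    # 200 labeled samples
xm, ym = marginal_replay(200, nearest_mean)
```

In a real continual-learning setting the Gaussian mixture would be replaced by a learned generative model (e.g. a conditional GAN or VAE), but the failure mode the paper targets is the same: marginal replay inherits every mistake of the labeling classifier, while conditional replay removes that inference step entirely.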

