ClaRe: Practical Class Incremental Learning By Remembering Previous Class Representations

03/29/2021
by Bahram Mohammadi, et al.

This paper presents a practical, simple, yet efficient method to effectively deal with catastrophic forgetting in Class Incremental Learning (CIL) tasks. CIL aims to learn new concepts well, but not at the expense of performance and accuracy on old data. Learning new knowledge in the absence of data instances from previous classes, or with imbalanced samples of old and new classes, makes CIL an ongoing and challenging problem. These issues can be tackled by storing exemplars belonging to previous tasks or by utilizing a rehearsal strategy. Inspired by the rehearsal strategy implemented with generative models, we propose ClaRe, an efficient solution for CIL that remembers the representations of the learned classes in each increment. This approach makes it possible to generate instances with the same distribution as the learned classes. Hence, our model is in effect retrained from scratch on a new training set that includes both the new and the generated samples, which also solves the imbalanced-data problem. ClaRe generalizes better than prior methods because it produces diverse instances from the distributions of the previously learned classes. We comprehensively evaluate ClaRe on the MNIST benchmark. Results show very low accuracy degradation as new knowledge arrives over time. Furthermore, contrary to most proposed solutions, memory limitation, which is a consequential issue in this research area, is no longer problematic.
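To make the rehearsal idea concrete, the sketch below shows one way generative replay for class-incremental learning can be wired up: per-class latent statistics are remembered after each increment, and pseudo-samples of the old classes are decoded from those statistics and mixed with the new data. This is a minimal illustration of the general technique, not the authors' implementation; the class names (Encoder, Decoder, ClassMemory), network sizes, and the MNIST-style 784-dimensional input are assumptions made for the example.

```python
# Minimal sketch of generative rehearsal for class-incremental learning.
# Assumes the encoder/decoder pair has already been trained on each
# increment's data (e.g., with a standard VAE objective), which is omitted
# here for brevity. All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, in_dim=784, latent_dim=32):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)
        self.logvar = nn.Linear(256, latent_dim)

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.logvar(h)

class Decoder(nn.Module):
    def __init__(self, latent_dim=32, out_dim=784):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                  nn.Linear(256, out_dim), nn.Sigmoid())

    def forward(self, z):
        return self.body(z)

class ClassMemory:
    """Remembers one latent Gaussian (mu, logvar) per learned class."""
    def __init__(self):
        self.stats = {}  # class id -> (mu, logvar), each of shape [latent_dim]

    def remember(self, encoder, x, y):
        """Store the average latent statistics of every class in (x, y)."""
        with torch.no_grad():
            mu, logvar = encoder(x.view(x.size(0), -1))
        for c in y.unique().tolist():
            mask = y == c
            self.stats[c] = (mu[mask].mean(0), logvar[mask].mean(0))

    def replay(self, decoder, n_per_class):
        """Decode pseudo-samples of every remembered class."""
        xs, ys = [], []
        with torch.no_grad():
            for c, (mu, logvar) in self.stats.items():
                std = torch.exp(0.5 * logvar)
                z = mu + std * torch.randn(n_per_class, mu.numel())
                xs.append(decoder(z))
                ys.append(torch.full((n_per_class,), c, dtype=torch.long))
        return torch.cat(xs), torch.cat(ys)

def train_increment(classifier, decoder, memory, new_loader, opt, n_replay=64):
    """One increment: new-class batches are augmented with generated old-class
    samples, so training approximates joint training on all classes seen so far."""
    classifier.train()
    for x_new, y_new in new_loader:
        x, y = x_new.view(x_new.size(0), -1), y_new
        if memory.stats:  # mix in pseudo-samples of previously learned classes
            x_old, y_old = memory.replay(decoder, n_replay)
            x, y = torch.cat([x, x_old]), torch.cat([y, y_old])
        loss = F.cross_entropy(classifier(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

In a sketch like this, only the decoder and a pair of latent vectors per class are kept between increments, so the memory footprint stays fixed regardless of how many raw exemplars the old classes contained, which is the property the abstract refers to when it says memory limitation is no longer problematic.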


research · 02/08/2022
Self-Paced Imbalance Rectification for Class Incremental Learning
Exemplar-based class-incremental learning is to recognize new classes wh...

research · 06/30/2022
Multi-Granularity Regularized Re-Balancing for Class Incremental Learning
Deep learning models suffer from catastrophic forgetting when learning n...

research · 08/11/2022
Memorizing Complementation Network for Few-Shot Class-Incremental Learning
Few-shot Class-Incremental Learning (FSCIL) aims at learning new concept...

research · 03/24/2023
Leveraging Old Knowledge to Continually Learn New Classes in Medical Images
Class-incremental continual learning is a core step towards developing a...

research · 07/03/2022
Memory-Based Label-Text Tuning for Few-Shot Class-Incremental Learning
Few-shot class-incremental learning (FSCIL) focuses on designing learning...

research · 05/30/2019
Large Scale Incremental Learning
Modern machine learning suffers from catastrophic forgetting when learni...

research · 04/21/2021
IB-DRR: Incremental Learning with Information-Back Discrete Representation Replay
Incremental learning aims to enable machine learning models to continuou...
