Generative Feature Replay For Class-Incremental Learning

04/20/2020
by Xialei Liu, et al.

Humans can learn new tasks without forgetting previous ones, whereas neural networks suffer from catastrophic forgetting between new and previously learned tasks. We consider the class-incremental setting, in which the task ID is unknown at inference time. The imbalance between old and new classes typically biases the network towards the newest classes. This imbalance problem can be addressed either by storing exemplars from previous tasks or by using image replay methods; however, the latter applies only to toy datasets, since image generation for complex datasets remains a hard problem. We propose a solution to the imbalance problem based on generative feature replay, which requires no exemplars. To this end, we split the network into two parts: a feature extractor and a classifier. To prevent forgetting, we combine generative feature replay in the classifier with feature distillation in the feature extractor. By generating features rather than images, our method reduces the complexity of generative replay and avoids the imbalance problem. Our approach is computationally efficient and scales to large datasets. Experiments confirm that it achieves state-of-the-art results on CIFAR-100 and ImageNet while requiring only a fraction of the storage needed for exemplar-based continual learning. Code is available at <https://github.com/xialeiliu/GFR-IL>.
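The abstract's idea can be sketched in a few lines of NumPy: freeze (in the paper, distill) a feature extractor, remember old classes only through a generator of their features, and train the classifier on real new-class features mixed with replayed old-class features. This is a minimal toy sketch, not the authors' implementation: the paper uses a GAN as the feature generator, while here a class-conditional Gaussian sampler stands in for it, and all names (`GaussianFeatureReplay`, `extract_features`, `train_classifier`, the toy data) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(x, W):
    """Toy feature extractor: a fixed linear map followed by ReLU.
    In the paper this network is kept stable via feature distillation."""
    return np.maximum(x @ W, 0.0)

class GaussianFeatureReplay:
    """Stores per-class feature statistics so old-class features can be
    replayed later without keeping any exemplars (a stand-in for the
    paper's GAN-based feature generator)."""
    def __init__(self):
        self.stats = {}

    def fit(self, feats, labels):
        for c in np.unique(labels):
            f = feats[labels == c]
            self.stats[int(c)] = (f.mean(0), f.std(0) + 1e-6)

    def sample(self, n_per_class):
        feats, labels = [], []
        for c, (mu, sigma) in self.stats.items():
            feats.append(rng.normal(mu, sigma, size=(n_per_class, mu.size)))
            labels.append(np.full(n_per_class, c))
        return np.concatenate(feats), np.concatenate(labels)

def train_classifier(feats, labels, n_classes, lr=0.01, epochs=500):
    """Softmax classifier on feature space, trained by gradient descent
    on a balanced mix of real and replayed features."""
    W = np.zeros((feats.shape[1], n_classes))
    onehot = np.eye(n_classes)[labels]
    for _ in range(epochs):
        logits = feats @ W
        p = np.exp(logits - logits.max(1, keepdims=True))
        p /= p.sum(1, keepdims=True)
        W -= lr * feats.T @ (p - onehot) / len(feats)
    return W

# Toy continual setup: two tasks, two classes each, in a 2-D input space.
W_feat = rng.normal(size=(2, 8))  # frozen feature extractor
means = {0: [4, 0], 1: [0, 4], 2: [-4, 0], 3: [0, -4]}

def make_data(classes, n=100):
    x = np.concatenate([rng.normal(means[c], 0.5, size=(n, 2)) for c in classes])
    y = np.concatenate([np.full(n, c) for c in classes])
    return x, y

# Task 1: classes 0/1 are seen, then only their feature statistics are kept.
x1, y1 = make_data([0, 1])
replay = GaussianFeatureReplay()
replay.fit(extract_features(x1, W_feat), y1)

# Task 2: only classes 2/3 are available; old classes come from replay,
# which keeps the classifier's training data balanced across all classes.
x2, y2 = make_data([2, 3])
f_new = extract_features(x2, W_feat)
f_old, y_old = replay.sample(100)
W_cls = train_classifier(np.concatenate([f_new, f_old]),
                         np.concatenate([y2, y_old]), n_classes=4)
```

Because replay happens in feature space, the generator only has to model low-dimensional feature statistics rather than full images, which is what makes the approach scale beyond toy datasets.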

Related research

05/07/2020
Generative Feature Replay with Orthogonal Weight Modification for Continual Learning
The ability of intelligent agents to learn and remember multiple tasks s...

06/09/2021
Match What Matters: Generative Implicit Feature Replay for Continual Learning
Neural networks are prone to catastrophic forgetting when trained increm...

04/20/2023
eTag: Class-Incremental Learning with Embedding Distillation and Task-Oriented Generation
Class-Incremental Learning (CIL) aims to solve the neural networks' cata...

04/06/2021
Hypothesis-driven Stream Learning with Augmented Memory
Stream learning refers to the ability to acquire and transfer knowledge ...

05/19/2023
AttriCLIP: A Non-Incremental Learner for Incremental Knowledge Learning
Continual learning aims to enable a model to incrementally learn knowled...

10/06/2019
REMIND Your Neural Network to Prevent Catastrophic Forgetting
In lifelong machine learning, a robotic agent must be incrementally upda...

08/25/2023
Dynamic Residual Classifier for Class Incremental Learning
The rehearsal strategy is widely used to alleviate the catastrophic forg...
