Balanced Softmax Cross-Entropy for Incremental Learning

03/23/2021
by Quentin Jodelet, et al.

Deep neural networks are prone to catastrophic forgetting when incrementally trained on new classes or new tasks, as adaptation to the new data leads to a drastic decrease in performance on the old classes and tasks. By using a small rehearsal memory and knowledge distillation, recent methods have proven effective at mitigating catastrophic forgetting. However, due to the limited size of the memory, a large imbalance between the amount of data available for the old and the new classes remains, which results in a deterioration of the overall accuracy of the model. To address this problem, we propose the use of the Balanced Softmax Cross-Entropy loss and show that it can be combined with existing methods for incremental learning to improve their performance, while also decreasing the computational cost of the training procedure in some cases. Extensive experiments on the competitive ImageNet, subImageNet and CIFAR100 datasets show state-of-the-art results.
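The core idea behind the Balanced Softmax (Ren et al., 2020), which the abstract builds on, is to compensate for the imbalanced label prior by shifting each logit by the log of its class sample count before applying the usual softmax cross-entropy. The sketch below is a minimal single-example illustration in NumPy, not the authors' implementation; the function name and its signature are assumptions for the example.

```python
import numpy as np

def balanced_softmax_ce(logits, class_counts, target):
    """Balanced Softmax cross-entropy for one example (illustrative sketch).

    Each logit z_j is shifted by log(n_j), where n_j is the number of
    training samples seen for class j, before the standard softmax
    cross-entropy. Classes with few stored samples (the old classes in
    rehearsal-based incremental learning) thus receive a stronger
    gradient push, counteracting the memory-induced imbalance.
    """
    logits = np.asarray(logits, dtype=float)
    counts = np.asarray(class_counts, dtype=float)
    adjusted = logits + np.log(counts)
    # log-softmax via log-sum-exp for numerical stability
    m = adjusted.max()
    log_probs = adjusted - (m + np.log(np.exp(adjusted - m).sum()))
    return -log_probs[target]
```

With uniform class counts the adjustment vanishes and the loss reduces to the ordinary softmax cross-entropy; with skewed counts, a minority-class target incurs a larger loss, which is what rebalances training.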


Related research

- End-to-End Incremental Learning (07/25/2018)
- Class-incremental Learning with Rectified Feature-Graph Preservation (12/15/2020)
- TKIL: Tangent Kernel Approach for Class Balanced Incremental Learning (06/17/2022)
- Incremental Classifier Learning with Generative Adversarial Networks (02/02/2018)
- Incremental Learning In Online Scenario (03/30/2020)
- A Comprehensive Study of Class Incremental Learning Algorithms for Visual Tasks (11/03/2020)
