Balanced Softmax Cross-Entropy for Incremental Learning

by Quentin Jodelet, et al.

Deep neural networks are prone to catastrophic forgetting when incrementally trained on new classes or new tasks, as adaptation to the new data leads to a drastic decrease in performance on the old classes and tasks. By using a small memory for rehearsal and knowledge distillation, recent methods have proven effective at mitigating catastrophic forgetting. However, due to the limited size of the memory, a large imbalance remains between the amount of data available for the old and the new classes, which deteriorates the overall accuracy of the model. To address this problem, we propose using the Balanced Softmax Cross-Entropy loss and show that it can be combined with existing methods for incremental learning to improve their performance, while in some cases also decreasing the computational cost of the training procedure. Extensive experiments on the competitive ImageNet, subImageNet and CIFAR100 datasets show state-of-the-art results.
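The core idea of the Balanced Softmax is to compensate for the imbalance between old and new classes by shifting each logit by the log of the number of training samples available for that class before applying the standard softmax cross-entropy. The sketch below is a minimal single-example NumPy illustration of this formulation; the function name and the assumption of a single (un-batched) logit vector are ours, not from the paper.

```python
import numpy as np

def balanced_softmax_ce(logits, target, class_counts):
    """Balanced Softmax cross-entropy for one example (illustrative sketch).

    logits       : 1-D array of per-class scores
    target       : index of the ground-truth class
    class_counts : 1-D array with the number of training samples per class
    """
    # Shift each logit by log(n_c); classes with more samples get a larger
    # shift, which counteracts the model's bias toward over-represented
    # (typically new) classes during incremental training.
    adjusted = logits + np.log(class_counts)
    adjusted = adjusted - adjusted.max()  # subtract max for numerical stability
    log_probs = adjusted - np.log(np.exp(adjusted).sum())
    return -log_probs[target]
```

When all classes have equal counts, the log shift is uniform and cancels in the softmax, so the loss reduces to the ordinary softmax cross-entropy; the correction only takes effect under imbalance.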






