Discriminative Distillation to Reduce Class Confusion in Continual Learning

08/11/2021
by Changhong Zhong, et al.

Successful continual learning of new knowledge would enable intelligent systems to recognize more and more classes of objects. However, current intelligent systems often fail to correctly recognize previously learned classes when they are updated to learn new ones. It is widely believed that this degraded performance is solely due to catastrophic forgetting of previously learned knowledge. In this study, we argue that the class confusion phenomenon may also play a role in degrading classification performance during continual learning: high similarity between new classes and previously learned classes can cause the classifier to misrecognize those old classes even when their knowledge has not been forgotten. To alleviate the class confusion issue, we propose a discriminative distillation strategy that helps the classifier learn discriminative features between confusing classes during continual learning. Experiments on multiple natural image classification tasks show that the proposed distillation strategy, when combined with existing methods, further improves continual learning performance.
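The abstract does not detail the training objective, so the sketch below is only a hypothetical illustration of the general idea: alongside standard knowledge distillation over all old classes, an extra distillation term is restricted to a set of mutually confusing classes so that the updated model is pushed to keep them separable. The function names, the confusing_idx selection, and the temperature tau are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical PyTorch sketch of distillation restricted to confusing classes.
# Names (old_logits, new_logits, confusing_idx, tau) are illustrative assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(teacher_logits, student_logits, tau=2.0):
    """Standard knowledge distillation: the updated (student) model is encouraged
    to reproduce the frozen old (teacher) model's soft predictions."""
    teacher_probs = F.softmax(teacher_logits / tau, dim=1)
    student_log_probs = F.log_softmax(student_logits / tau, dim=1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * tau * tau

def discriminative_distillation_loss(teacher_logits, student_logits, confusing_idx, tau=2.0):
    """Distillation computed only over a subset of old classes judged confusable
    with the new ones, so their decision boundaries stay discriminative."""
    return distillation_loss(teacher_logits[:, confusing_idx],
                             student_logits[:, confusing_idx], tau)

# Example with 10 old classes and 5 new classes (indices are made up):
batch, n_old, n_new = 32, 10, 5
old_logits = torch.randn(batch, n_old)           # output of the frozen old model
new_logits = torch.randn(batch, n_old + n_new)   # output of the model being updated
confusing_idx = torch.tensor([2, 7, 9])          # old classes similar to the new ones
loss = (distillation_loss(old_logits, new_logits[:, :n_old])
        + discriminative_distillation_loss(old_logits, new_logits[:, :n_old], confusing_idx))
```

In practice such an auxiliary term would be added to the usual cross-entropy loss on the new-task data; how the confusing classes are identified is the key design choice and is not specified in the abstract.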

