Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation

04/02/2022
by   Minsoo Kang, et al.

We present a novel class-incremental learning approach based on deep neural networks, which continually learns new tasks with limited memory for storing examples from previous tasks. Our algorithm is based on knowledge distillation and provides a principled way to maintain the representations of old models while adapting to new tasks effectively. The proposed method estimates the relationship between representation changes and the resulting increases in loss incurred by model updates, and minimizes an upper bound on the loss increase by exploiting the estimated importance of each feature map in the backbone model. Based on this importance, the model restricts updates to important features for robustness while allowing changes in less critical features for flexibility. This optimization strategy effectively alleviates the notorious catastrophic forgetting problem despite limited access to data from previous tasks. Experimental results show significant accuracy improvements over existing methods on standard datasets. Code is available.
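The core idea described above, restricting changes to important feature maps while letting less critical ones drift, can be sketched as an importance-weighted feature distillation loss. The following is a minimal PyTorch sketch, not the authors' implementation: the function names are hypothetical, and we assume importance is approximated by the mean absolute gradient of the task loss with respect to each feature channel (one plausible proxy for how much a representation change would increase the loss).

```python
import torch

def estimate_channel_importance(feats, loss):
    """Hypothetical proxy: importance of each feature channel, approximated
    as the mean absolute gradient of the loss w.r.t. that channel's activations.
    feats: (B, C, H, W) activations with requires_grad=True; returns (C,)."""
    grads, = torch.autograd.grad(loss, feats, retain_graph=True)
    return grads.abs().mean(dim=(0, 2, 3))

def weighted_distillation_loss(new_feats, old_feats, importance):
    """Penalize representation changes per channel, weighted by importance,
    so updates to critical features are restricted while flexible features
    may change freely. new_feats/old_feats: (B, C, H, W); importance: (C,)."""
    per_channel_change = (new_feats - old_feats).pow(2).mean(dim=(0, 2, 3))
    return (importance * per_channel_change).sum()
```

In training, `old_feats` would come from a frozen copy of the previous-task model, and this loss would be added to the new task's classification loss; the exact importance estimator and upper-bound derivation in the paper may differ from this sketch.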


Related Research

04/17/2021 · On Learning the Geodesic Path for Incremental Learning
Neural networks notoriously suffer from the problem of catastrophic forg...

04/03/2019 · M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning
Incremental learning targets at achieving good performance on new catego...

03/26/2022 · Uncertainty-aware Contrastive Distillation for Incremental Semantic Segmentation
A fundamental and challenging problem in deep learning is catastrophic f...

04/11/2023 · Density Map Distillation for Incremental Object Counting
We investigate the problem of incremental learning for object counting, ...

09/01/2022 · A New Knowledge Distillation Network for Incremental Few-Shot Surface Defect Detection
Surface defect detection is one of the most essential processes for indu...

04/04/2023 · Cross-Class Feature Augmentation for Class Incremental Learning
We propose a novel class incremental learning approach by incorporating ...

03/06/2021 · Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning
Few-shot class incremental learning (FSCIL) portrays the problem of lear...
