Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation

10/12/2022
by Donghyeon Baek et al.

Class-incremental semantic segmentation (CISS) labels each pixel of an image with a corresponding object/stuff class continually. To this end, it is crucial to learn novel classes incrementally without forgetting previously learned knowledge. Current CISS methods typically use a knowledge distillation (KD) technique to preserve classifier logits, or freeze a feature extractor, to avoid the forgetting problem. These strong constraints, however, prevent the model from learning discriminative features for novel classes. We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively. We have found that a logit can be decomposed into two terms, quantifying how likely an input is, or is not, to belong to a particular class, which provides a clue to the reasoning process of a model. The KD technique, in this context, preserves only the sum of the two terms (i.e., a class logit), so each term can change freely and KD does not faithfully imitate the reasoning process. To impose a constraint on each term explicitly, we propose a new decomposed knowledge distillation (DKD) technique, which improves the rigidity of a model and addresses the forgetting problem more effectively. We also introduce a novel initialization method to train new classifiers for novel classes. In CISS, the number of negative training samples for novel classes is not sufficient to discriminate them from old classes. To mitigate this, we propose to transfer knowledge of negatives to the new classifiers successively using an auxiliary classifier, boosting the performance significantly. Experimental results on standard CISS benchmarks demonstrate the effectiveness of our framework.
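
The decomposition idea can be made concrete with a small sketch. The PyTorch snippet below is a minimal, hypothetical illustration, not the authors' released implementation: it assumes a linear classifier whose per-class logit z = w . f is split into a positive term (channel-wise evidence for the class) and a negative term (evidence against it), so that z = z_pos - z_neg; the names decompose_logits and dkd_loss are ours.

```python
import torch
import torch.nn.functional as F

def decompose_logits(features, weight):
    """Split each class logit z = w . f into z_pos - z_neg.

    features: (N, D) pixel embeddings; weight: (C, D) classifier weights.
    z_pos sums the positive channel contributions (evidence *for* a class),
    z_neg sums the negative ones (evidence *against* it).
    """
    contrib = features.unsqueeze(1) * weight.unsqueeze(0)  # (N, C, D)
    z_pos = F.relu(contrib).sum(dim=-1)   # (N, C)
    z_neg = F.relu(-contrib).sum(dim=-1)  # (N, C); logit = z_pos - z_neg
    return z_pos, z_neg

def dkd_loss(feat_new, w_new, feat_old, w_old):
    """Distill each decomposed term separately.

    Vanilla KD on logits only preserves the difference z_pos - z_neg,
    leaving the two terms free to drift; matching them individually also
    preserves why the model thinks a pixel does (not) belong to a class.
    """
    p_new, n_new = decompose_logits(feat_new, w_new)
    with torch.no_grad():  # the previous-step model is frozen
        p_old, n_old = decompose_logits(feat_old, w_old)
    return F.mse_loss(p_new, p_old) + F.mse_loss(n_new, n_old)
```

Here dkd_loss would be added to the segmentation objective for the current step, with feat_old and w_old taken from the frozen previous-step model. The channel-wise ReLU split and the MSE penalty are illustrative choices; the paper defines its own decomposition and distillation objective.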

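The classifier initialization can be sketched in the same spirit. The snippet below is likewise an assumption-laden illustration: it posits a single-channel 1x1 convolutional auxiliary head (aux_head, our name) trained in previous steps to score negative regions, and copies its parameters into each novel class channel so that new classifiers inherit knowledge of negatives rather than starting from random weights.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def init_novel_head(aux_head: nn.Conv2d, num_novel: int) -> nn.Conv2d:
    """Hypothetical sketch: seed novel classifiers from an auxiliary head.

    aux_head is assumed to be a single-channel 1x1 conv trained on
    negative samples in earlier steps; each of the num_novel output
    channels starts as a copy of it, transferring that knowledge.
    """
    novel = nn.Conv2d(aux_head.in_channels, num_novel, kernel_size=1)
    novel.weight.copy_(aux_head.weight.expand(num_novel, -1, -1, -1))
    novel.bias.copy_(aux_head.bias.expand(num_novel))
    return novel
```
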

Related research

11/08/2019  Knowledge Distillation for Incremental Learning in Semantic Segmentation
Although deep learning architectures have shown remarkable results in sc...

04/03/2019  M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning
Incremental learning targets at achieving good performance on new catego...

05/17/2021  Class-Incremental Few-Shot Object Detection
Conventional detection networks usually need abundant labeled training s...

08/04/2020  Memory Efficient Class-Incremental Learning for Image Classification
With the memory-resource-limited constraints, class-incremental learning...

08/07/2022  Class-Incremental Learning with Cross-Space Clustering and Controlled Transfer
In class-incremental learning, the model is expected to learn new classe...

03/28/2022  Doodle It Yourself: Class Incremental Learning by Drawing a Few Sketches
The human visual system is remarkable in learning new visual concepts fr...

02/12/2023  SCLIFD: Supervised Contrastive Knowledge Distillation for Incremental Fault Diagnosis under Limited Fault Data
Intelligent fault diagnosis has made extraordinary advancements currentl...
