Uncertainty-aware Contrastive Distillation for Incremental Semantic Segmentation

03/26/2022
by Guanglei Yang, et al.

A fundamental and challenging problem in deep learning is catastrophic forgetting, i.e., the tendency of neural networks to fail to preserve the knowledge acquired from old tasks when learning new tasks. This problem has been widely investigated in the research community, and several Incremental Learning (IL) approaches have been proposed in the past years. While earlier works in computer vision have mostly focused on image classification and object detection, more recently some IL approaches for semantic segmentation have been introduced. These previous works showed that, despite its simplicity, knowledge distillation can be effectively employed to alleviate catastrophic forgetting. In this paper, we follow this research direction and, inspired by recent literature on contrastive learning, we propose a novel distillation framework, Uncertainty-aware Contrastive Distillation (UCD). In a nutshell, UCD introduces a novel distillation loss that takes into account all the images in a mini-batch, enforcing similarity between features associated with all the pixels from the same classes, and pulling apart those corresponding to pixels from different classes. To mitigate catastrophic forgetting, we contrast features of the new model with features extracted by a frozen model learned at the previous incremental step. Our experimental results demonstrate the advantage of the proposed distillation technique, which can be used in synergy with previous IL approaches, and leads to state-of-the-art performance on three commonly adopted benchmarks for incremental semantic segmentation. The code is available at <https://github.com/ygjwd12345/UCD>.
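To make the idea concrete, below is a minimal PyTorch sketch of a batch-wide pixel contrastive distillation term in the spirit described above: student features are contrasted against features from the frozen previous-step model, pulling together pixels sharing a class label and pushing apart the rest. This is an illustrative sketch, not the authors' exact UCD loss (in particular, the uncertainty weighting suggested by the title is omitted); the function name, temperature, ignore index, and subsampling threshold are all assumptions.

```python
import torch
import torch.nn.functional as F

def contrastive_distillation_loss(student_feats, teacher_feats, labels,
                                  temperature=0.1, ignore_index=255,
                                  max_pixels=4096):
    """Batch-wide pixel contrastive distillation (illustrative sketch).

    student_feats: (B, C, H, W) features from the current model.
    teacher_feats: (B, C, H, W) features from the frozen previous-step model.
    labels:        (B, H, W) integer class map; `ignore_index` pixels are skipped.
    """
    B, C, H, W = student_feats.shape
    # Flatten every pixel in the mini-batch into one contrast set.
    s = student_feats.permute(0, 2, 3, 1).reshape(-1, C)  # (N, C)
    t = teacher_feats.permute(0, 2, 3, 1).reshape(-1, C)  # (N, C)
    y = labels.reshape(-1)                                # (N,)

    valid = y != ignore_index
    s, t, y = s[valid], t[valid], y[valid]

    # Subsample pixels so the N x N similarity matrix stays tractable.
    if s.shape[0] > max_pixels:
        idx = torch.randperm(s.shape[0], device=s.device)[:max_pixels]
        s, t, y = s[idx], t[idx], y[idx]

    s = F.normalize(s, dim=1)
    t = F.normalize(t, dim=1)

    # Cosine similarity of every student pixel with every teacher pixel.
    logits = s @ t.T / temperature                        # (N, N)
    # Positives: student/teacher pixel pairs sharing the same class label.
    pos_mask = (y.unsqueeze(1) == y.unsqueeze(0)).float() # (N, N)

    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    # InfoNCE with multiple positives: average log-likelihood of positives
    # per anchor (the diagonal guarantees at least one positive each).
    mean_log_prob_pos = (pos_mask * log_prob).sum(1) / pos_mask.sum(1).clamp(min=1)
    return -mean_log_prob_pos.mean()
```

At each incremental step, a term of this kind would be added to the usual segmentation objectives, with the teacher kept frozen so that the student's feature space stays consistent with the classes learned previously.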
