Dynamic Y-KD: A Hybrid Approach to Continual Instance Segmentation

03/10/2023
by Mathieu Pagé Fortin, et al.

Despite the success of deep learning methods for instance segmentation, these models still suffer from catastrophic forgetting in continual learning scenarios. In this paper, we make three contributions to continual instance segmentation. First, we propose Y-knowledge distillation (Y-KD), a knowledge distillation strategy in which the teacher and student networks share a common feature extractor. Because the teacher is also updated with new data in Y-KD, the increased plasticity yields new modules that are specialized in the new classes. Second, our Y-KD approach is supported by a dynamic-architecture method that grows new modules for each task and uses all of them at inference through a single, shared instance segmentation head, which significantly reduces forgetting. Third, we complete our approach with checkpoint averaging, a simple way to manually balance the trade-off between performance on the various sets of classes, thus increasing control over the model's behavior at no additional cost. We unite these contributions in a model that we name the Dynamic Y-KD network. We perform extensive experiments on several single-step and multi-step scenarios on Pascal-VOC and show that our approach outperforms previous methods on both past and new classes. For instance, compared to recent work, our method obtains +2.1 in the 19-1 setting and reaches 91.5 in the 15-5 setting.
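
To make the first two contributions concrete, the sketch below illustrates one plausible reading of a Y-KD training step in PyTorch. It is a hedged, classification-style simplification, not the authors' code: the function name ykd_loss, the module arguments, and the loss weight lam are assumptions for illustration, and the actual method distills through an instance segmentation head rather than plain logits.

```python
import torch
import torch.nn.functional as F

def ykd_loss(backbone, old_head, frozen_old_head, new_head,
             images, new_targets, lam=1.0):
    """Hedged sketch of one training step in the spirit of Y-KD.

    The teacher and student share the backbone, so the teacher's
    features also evolve with new data; a frozen copy of the old
    head provides distillation targets for old classes while a
    newly grown head learns the new classes.
    """
    feats = backbone(images)  # shared feature extractor (the "Y" trunk)

    # Teacher branch: frozen old head applied to the shared, updated features.
    with torch.no_grad():
        teacher_logits = frozen_old_head(feats)

    # Student branch: the current old head is distilled toward the teacher.
    student_logits = old_head(feats)
    distill = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )

    # New-class supervision on the newly grown head.
    task = F.cross_entropy(new_head(feats), new_targets)
    return task + lam * distill
```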
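
Checkpoint averaging, the third contribution, is straightforward to sketch. Below is a minimal PyTorch illustration under the assumption that the two checkpoints share the same architecture; the helper name average_checkpoints and the interpolation weight alpha are ours, not the paper's.

```python
import copy
import torch

def average_checkpoints(model_old, model_new, alpha=0.5):
    """Linearly interpolate the parameters of two checkpoints.

    alpha controls the stability-plasticity trade-off: alpha=1.0
    keeps the old-task checkpoint (stability), alpha=0.0 keeps the
    new-task checkpoint (plasticity). Assumes both models have
    identical architectures and parameter names.
    """
    averaged = copy.deepcopy(model_new)
    old_params = dict(model_old.named_parameters())
    with torch.no_grad():
        for name, param in averaged.named_parameters():
            param.copy_(alpha * old_params[name] + (1.0 - alpha) * param)
    return averaged
```

Sweeping alpha after training lets one trade old-class performance against new-class performance without retraining, which matches the abstract's claim of added control at no additional cost.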

Related research

02/15/2023 · Offline-to-Online Knowledge Distillation for Video Instance Segmentation
In this paper, we present offline-to-online knowledge distillation (OOKD...

01/18/2023 · Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning
Current deep learning models often suffer from catastrophic forgetting o...

01/13/2022 · Technical Report for ICCV 2021 Challenge SSLAD-Track3B: Transformers Are Better Continual Learners
In the SSLAD-Track 3B challenge on continual learning, we propose the me...

07/21/2020 · Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation
Deep learning methods show promising results for overlapping cervical ce...

05/25/2023 · Fairness Continual Learning Approach to Semantic Scene Understanding in Open-World Environments
Continual semantic segmentation aims to learn new classes while maintain...

08/18/2023 · Adapt Your Teacher: Improving Knowledge Distillation for Exemplar-free Continual Learning
In this work, we investigate exemplar-free class incremental learning (C...

06/14/2023 · Heterogeneous Continual Learning
We propose a novel framework and a solution to tackle the continual lear...
