Re-examining Distillation For Continual Object Detection

04/04/2022
by Eli Verwimp, et al.

Training models continually to detect and classify objects from new classes and new domains remains an open problem. In this work, we conduct a thorough analysis of why and how object detection models forget catastrophically. We focus on distillation-based approaches in two-stage networks, the most common strategy in contemporary continual object detection work. Distillation aims to transfer the knowledge of a model trained on previous tasks (the teacher) to a new model (the student) while the latter learns the new task. We show that this works well for the region proposal network, but that wrong yet overly confident teacher predictions prevent the student from learning the classification head effectively. Our analysis provides a foundation for improving existing techniques: we detect incorrect teacher predictions using the current ground-truth labels, and we replace the mean squared error with an adaptive Huber loss for distillation in the classification heads. We show that our strategy works not only in the class-incremental setting but also in domain-incremental settings, a realistic context that is likely to match representative real-world problems.
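The two fixes described above lend themselves to a short illustration. The following is a minimal PyTorch sketch, not the authors' released code, of a distillation term for the classification head that combines both ideas: teacher outputs that conflict with the current ground-truth labels are masked out, and a Huber loss replaces the mean squared error on the remaining logits. The function name, the background-index convention (class 0), the exact masking rule, and the fixed `delta` are assumptions made for illustration; in particular, the paper's adaptive Huber loss would choose `delta` from the data rather than keep it constant.

```python
# Hypothetical sketch of a masked Huber distillation loss for the
# classification head of a two-stage detector (not the paper's exact code).
import torch
import torch.nn.functional as F


def masked_huber_distillation(student_logits: torch.Tensor,
                              teacher_logits: torch.Tensor,
                              gt_labels: torch.Tensor,
                              delta: float = 1.0) -> torch.Tensor:
    """student_logits, teacher_logits: (num_proposals, num_old_classes)
    gt_labels: (num_proposals,) current-task ground-truth class indices,
    with 0 assumed to denote background."""
    teacher_pred = teacher_logits.argmax(dim=1)
    # Drop proposals where the teacher claims an old foreground class but the
    # current ground truth assigns a (new) foreground label: the teacher is
    # presumably wrong there, so it should not supervise the student.
    keep = (gt_labels == 0) | (teacher_pred == 0)
    if keep.sum() == 0:
        return student_logits.new_zeros(())
    # Huber loss instead of MSE on the surviving teacher/student logits.
    return F.huber_loss(student_logits[keep], teacher_logits[keep],
                        delta=delta, reduction="mean")
```

The Huber loss behaves like MSE for small residuals but grows only linearly for large ones, so the few confidently wrong teacher logits that survive the mask cannot dominate the distillation term.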


