Deep Structured Instance Graph for Distilling Object Detectors

09/27/2021
by Yixin Chen, et al.

Effectively structuring deep knowledge plays a pivotal role in knowledge transfer from teacher to student, especially in semantic vision tasks. In this paper, we present a simple knowledge structure that exploits and encodes the information inside a detection system to facilitate detector knowledge distillation. Specifically, to address the feature-imbalance problem while also capturing the relations among semantic instances that prior work misses, we design a graph whose nodes correspond to instance proposal-level features and whose edges represent the relations between nodes. To further refine this graph, we design an adaptive background loss weight to reduce node noise, and use background sample mining to prune trivial edges. We transfer the entire graph from teacher to student as an encoded knowledge representation, capturing local and global information simultaneously. We achieve new state-of-the-art results on the challenging COCO object detection task with diverse student-teacher pairs on both one- and two-stage detectors. We also experiment with instance segmentation to demonstrate the robustness of our method. Notably, distilled Faster R-CNN with ResNet18-FPN and ResNet50-FPN yields 38.68 and 41.82 Box AP respectively on the COCO benchmark, and Faster R-CNN with ResNet101-FPN achieves 43.38 AP, outperforming its ResNet152-FPN teacher by about 0.7 AP. Code: https://github.com/dvlab-research/Dsig.
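To make the idea concrete, here is a minimal PyTorch sketch of graph-based instance distillation as described above. The function names (build_instance_graph, graph_distillation_loss), the cosine-similarity edges, and the fixed bg_weight are illustrative assumptions rather than the paper's exact formulation: the actual method uses an adaptive background loss weight and prunes trivial edges via background sample mining, which this sketch only approximates with a simple fixed down-weighting.

```python
import torch
import torch.nn.functional as F


def build_instance_graph(feats):
    """Build a fully connected instance graph.

    feats: (N, C) proposal-level instance features (graph nodes).
    Returns the node features and an (N, N) matrix of pairwise
    cosine similarities (graph edges).
    """
    normed = F.normalize(feats, dim=1)
    edges = normed @ normed.t()  # relation between every pair of nodes
    return feats, edges


def graph_distillation_loss(s_feats, t_feats, bg_mask, bg_weight=0.5):
    """Distill the teacher's instance graph into the student's.

    s_feats, t_feats: (N, C) student / teacher features extracted from
        the same set of proposals.
    bg_mask: (N,) bool tensor marking background proposals; their node
        loss is down-weighted (a hypothetical stand-in for the paper's
        adaptive background loss weight).
    """
    s_nodes, s_edges = build_instance_graph(s_feats)
    t_nodes, t_edges = build_instance_graph(t_feats)

    # Node term: match per-proposal features, down-weighting background.
    per_node = F.mse_loss(s_nodes, t_nodes, reduction="none").mean(dim=1)
    weights = torch.where(bg_mask,
                          torch.full_like(per_node, bg_weight),
                          torch.ones_like(per_node))
    node_loss = (weights * per_node).mean()

    # Edge term: match pairwise relations between proposals.
    edge_loss = F.mse_loss(s_edges, t_edges)
    return node_loss + edge_loss


if __name__ == "__main__":
    # Toy example: 16 proposals with 256-dim features, last 8 background.
    s = torch.randn(16, 256, requires_grad=True)
    t = torch.randn(16, 256)
    bg = torch.arange(16) >= 8
    loss = graph_distillation_loss(s, t, bg)
    loss.backward()
    print(float(loss))
```

Transferring the whole graph rather than individual features is the key design choice: the node term carries local, per-instance knowledge, while the edge term carries the global relational structure among instances.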
