Temporal Self-Ensembling Teacher for Semi-Supervised Object Detection

07/13/2020
by   Cong Chen, et al.

This paper focuses on the problem of Semi-Supervised Object Detection (SSOD). In Semi-Supervised Learning (SSL), the Knowledge Distillation (KD) framework, which consists of a teacher model and a student model, is widely used to exploit unlabeled images. Given unlabeled images, the teacher is expected to yield meaningful targets (e.g., well-posed logits) to regularize the training of the student. However, directly applying the KD framework to SSOD faces two obstacles: (1) the teacher and student predictions may be very close, which limits the upper bound of the student, and (2) the data imbalance caused by the dense predictions of an object detector hinders efficient consistency regularization between teacher and student. To solve these problems, we propose the Temporal Self-Ensembling Teacher (TSE-T) model on top of the KD framework. Unlike conventional KD methods, we devise a temporally updated teacher model. First, our teacher ensembles its temporal predictions for unlabeled images under varying perturbations. Second, our teacher ensembles its temporal model weights by an Exponential Moving Average (EMA), which allows it to gradually learn from the student. These self-ensembling strategies collaboratively yield better teacher predictions for unlabeled images. Finally, we formulate the consistency regularization with a focal loss to handle the data imbalance problem. Evaluated on the widely used VOC and COCO benchmarks, our method achieves 80.73 (mAP) on the VOC2007 test set, which outperforms the fully-supervised detector by 2.37, and it also improves over the fully-supervised detector on the COCO2012 test-dev set. Moreover, our method sets a new state of the art in SSOD on the VOC benchmark, outperforming the baseline SSOD method by 1.44. The source code is publicly available at http://github.com/SYangDong/tse-t.
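To make the two self-ensembling ingredients and the focal-loss consistency term concrete, the following is a minimal PyTorch-style sketch, not the authors' released code. The function names, the EMA decay of 0.999, the ensembling momentum, and the gamma/alpha values are illustrative assumptions; only the overall mechanisms (EMA weight averaging, temporal prediction ensembling, focal-weighted consistency) follow the abstract.

```python
# Minimal sketch (not the authors' implementation) of an EMA-updated teacher,
# temporal prediction ensembling, and a focal-loss consistency term, assuming
# a PyTorch detector whose classification head emits per-anchor class logits.
import torch
import torch.nn.functional as F


@torch.no_grad()
def update_ema_teacher(teacher, student, decay=0.999):
    """Exponential moving average of student weights into the teacher,
    so the teacher gradually learns from the student over training."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(decay).add_(s_param, alpha=1.0 - decay)


@torch.no_grad()
def ensemble_teacher_predictions(pred_buffer, new_probs, momentum=0.9):
    """Temporally ensemble the teacher's per-image class probabilities
    produced under varying perturbations (e.g. flips, color jitter)."""
    if pred_buffer is None:
        return new_probs.clone()
    return momentum * pred_buffer + (1.0 - momentum) * new_probs


def consistency_focal_loss(student_logits, teacher_probs, gamma=2.0, alpha=0.25):
    """Focal-style consistency regularization: down-weight the many easy
    (mostly background) predictions so the dense, imbalanced outputs of a
    detector do not overwhelm the teacher-student consistency term."""
    student_probs = torch.sigmoid(student_logits)
    # Soft cross-entropy between teacher and student per-class probabilities.
    ce = F.binary_cross_entropy(student_probs, teacher_probs, reduction="none")
    # Modulating factor: small when the student already matches the teacher.
    p_t = teacher_probs * student_probs + (1 - teacher_probs) * (1 - student_probs)
    weight = alpha * (1.0 - p_t) ** gamma
    return (weight * ce).mean()
```

In a training loop one would typically compute the supervised detection loss on labeled images, add `consistency_focal_loss` on unlabeled images against the ensembled teacher targets, and call `update_ema_teacher` after each optimizer step; the exact weighting schedule between the two losses is a design choice not specified in the abstract.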
