Object detection at 200 Frames Per Second

05/16/2018
by Rakesh Mehta, et al.

In this paper, we propose an efficient and fast object detector that can process hundreds of frames per second. To achieve this goal, we investigate three main aspects of the object detection framework: network architecture, loss function, and training data (labeled and unlabeled). To obtain a compact network architecture, we introduce various improvements, based on recent work, to develop an architecture that is computationally lightweight and achieves reasonable performance. To further improve performance while keeping the complexity the same, we utilize a distillation loss function. Using the distillation loss, we transfer the knowledge of a more accurate teacher network to the proposed lightweight student network. We propose various innovations to make distillation efficient for the proposed one-stage detector pipeline: an objectness-scaled distillation loss, feature map non-maximal suppression, and a single unified distillation loss function for detection. Finally, building upon the distillation loss, we explore how far we can push performance by utilizing unlabeled data: we train our model on unlabeled images using the soft labels of the teacher network. Our final network has 10x fewer parameters than the VGG-based object detection network, runs at more than 200 FPS, and the proposed changes improve detection accuracy by 14 mAP over the baseline on the Pascal dataset.
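The objectness-scaled distillation term mentioned above can be made concrete with a short sketch: the teacher's objectness score gates the per-cell imitation loss, so the many background cells of a one-stage detector's output grid contribute little to distillation. The following is a minimal PyTorch-style sketch based only on this abstract; the function name, tensor shapes, and the exact form of the gating are illustrative assumptions, not the authors' released code.

import torch
import torch.nn.functional as F

def objectness_scaled_distillation(student_cls, teacher_cls, teacher_obj):
    # student_cls, teacher_cls: (N, cells, classes) raw class logits.
    # teacher_obj: (N, cells) teacher objectness scores in [0, 1].
    t_prob = F.softmax(teacher_cls, dim=-1)    # teacher soft labels
    s_logp = F.log_softmax(student_cls, dim=-1)
    # Per-cell cross-entropy of the student against the teacher's soft labels.
    per_cell = -(t_prob * s_logp).sum(dim=-1)  # shape (N, cells)
    # Scale by teacher objectness so background cells are down-weighted.
    return (teacher_obj * per_cell).mean()

# Illustrative usage with random tensors (shapes are hypothetical):
student_logits = torch.randn(2, 845, 20)
teacher_logits = torch.randn(2, 845, 20)
teacher_objectness = torch.sigmoid(torch.randn(2, 845))
loss = objectness_scaled_distillation(student_logits, teacher_logits, teacher_objectness)

The same gating idea presumably extends to the teacher's coordinate and objectness outputs in the single unified distillation loss described in the abstract, and because the targets are the teacher's soft labels rather than ground truth, the loss applies directly to unlabeled images.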

