GAN-Knowledge Distillation for One-Stage Object Detection

06/20/2019
by Wei Hong, et al.

Convolutional neural networks have significantly improved the accuracy of object detection. As networks grow deeper, detection accuracy improves further, but the required floating-point computation grows as well. In object detection, many researchers therefore use knowledge distillation, transferring knowledge from a deeper and larger teacher network to a small student network to improve the student's accuracy. However, most knowledge distillation methods require carefully designed, complex cost functions, and they target two-stage object detection algorithms. This paper proposes a clean and effective knowledge distillation method for one-stage object detection. The feature maps generated by the teacher network and the student network are used as real samples and fake samples, respectively, and the two are trained adversarially to improve the performance of the student network in one-stage object detection.

Related research

09/16/2021  Label Assignment Distillation for Object Detection
06/20/2022  Knowledge Distillation for Oriented Object Detection on Aerial Images
11/23/2022  Structural Knowledge Distillation for Object Detection
06/02/2023  Group channel pruning and spatial attention distilling for object detection
11/14/2021  Robust and Accurate Object Detection via Self-Knowledge Distillation
08/26/2022  Disentangle and Remerge: Interventional Knowledge Distillation for Few-Shot Object Detection from A Conditional Causal Perspective
03/04/2019  TKD: Temporal Knowledge Distillation for Active Perception
