Guided Hybrid Quantization for Object detection in Multimodal Remote Sensing Imagery via One-to-one Self-teaching

12/31/2022
by Jiaqing Zhang, et al.

Considering computational complexity, we propose a Guided Hybrid Quantization with One-to-one Self-Teaching (GHOST) framework. More concretely, we first design a guided quantization self-distillation (GQSD) structure, an innovative approach that achieves a lightweight model through the synergy of quantization and distillation. The training of the quantized model is guided by its full-precision counterpart, which saves time and cost because no large pre-trained model needs to be prepared in advance. Second, we put forward a hybrid quantization (HQ) module that finds the optimal bit width automatically under a constrained condition, where a threshold on the distribution distance between the center and the samples is applied in the weight value search space. Third, to improve information transfer, we propose a one-to-one self-teaching (OST) module that gives the student network the ability of self-judgment. A switch control machine (SCM) builds a bridge between the student and teacher networks at the same location, helping the teacher reduce wrong guidance and impart vital knowledge to the student. This distillation method allows a model to learn from itself and gain substantial improvement without any additional supervision. Extensive experiments on a multimodal dataset (VEDAI) and single-modality datasets (DOTA, NWPU, and DIOR) show that GHOST-based object detection outperforms existing detectors. With fewer than 9.7 MB of parameters and fewer than 2158 G Bit-Operations (BOPs), GHOST is lighter than existing remote sensing-based, lightweight, and distillation-based algorithms, demonstrating its superiority in lightweight design. Our code and model will be released at https://github.com/icey-zhang/GHOST.
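The abstract describes the switch control machine (SCM) only at a high level. The snippet below is a minimal, hypothetical PyTorch sketch of that idea as we read it: the distillation signal is gated per location so the teacher transfers knowledge only where it actually outperforms the student. The function name `scm_distillation_loss`, the tensor shapes, and the gating rule (comparing per-location cross-entropy) are our assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F


def scm_distillation_loss(student_logits, teacher_logits, targets):
    """Hypothetical sketch of an SCM-style gated distillation loss.

    student_logits, teacher_logits: (N, C, H, W) class logits.
    targets: (N, H, W) integer class labels.
    The "switch" opens only where the teacher's per-location loss is
    lower than the student's, so wrong teacher guidance is suppressed.
    """
    teacher_logits = teacher_logits.detach()  # the teacher is not updated

    # Per-location cross-entropy for student and teacher.
    s_ce = F.cross_entropy(student_logits, targets, reduction="none")  # (N, H, W)
    t_ce = F.cross_entropy(teacher_logits, targets, reduction="none")  # (N, H, W)

    # Switch control: keep only positions where the teacher beats the student.
    gate = (t_ce < s_ce).float()  # (N, H, W)

    # Per-location KL divergence between student and teacher distributions.
    kl = F.kl_div(
        F.log_softmax(student_logits, dim=1),
        F.softmax(teacher_logits, dim=1),
        reduction="none",
    ).sum(dim=1)  # (N, H, W)

    # Average the distillation signal over the open switches only.
    return (gate * kl).sum() / gate.sum().clamp(min=1.0)


# Example shapes: batch of 2, 16 classes, 32x32 feature map.
s = torch.randn(2, 16, 32, 32, requires_grad=True)
t = torch.randn(2, 16, 32, 32)
y = torch.randint(0, 16, (2, 32, 32))
loss = scm_distillation_loss(s, t, y)
```

In a full GHOST-style pipeline, a term of this kind would be added to the ordinary detection loss of the quantized student; the gate is what lets the student judge when to trust the teacher.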


Related research

- Dynamic Knowledge Distillation with A Single Stream Structure for RGB-D Salient Object Detection (06/17/2021)
- AMD: Adaptive Masked Distillation for Object Detection (01/31/2023)
- LGD: Label-guided Self-distillation for Object Detection (09/23/2021)
- Towards Efficient 3D Object Detection with Knowledge Distillation (05/30/2022)
- IDa-Det: An Information Discrepancy-aware Distillation for 1-bit Detectors (10/07/2022)
- Matching Guided Distillation (08/23/2020)
