Knowledge Distillation from Single to Multi Labels: an Empirical Study

03/15/2023
by Youcai Zhang et al.

Knowledge distillation (KD) has been extensively studied in single-label image classification, but its efficacy for multi-label classification remains relatively unexplored. In this study, we first investigate the effectiveness of classical KD techniques, both logit-based and feature-based, for multi-label classification. Our findings indicate that the logit-based approach is ill-suited to the multi-label setting: the teacher can provide neither inter-category similarity information nor a regularization effect on the student's training. We further observe that feature-based methods struggle to convey compact, class-wise information about multiple labels simultaneously. Given these limitations, we argue that suitable dark knowledge should incorporate class-wise information and be highly correlated with the final classification results. We therefore introduce a distillation method based on Class Activation Maps (CAMs), which is both effective and straightforward to implement. Across a wide range of settings, CAMs-based distillation consistently outperforms the other methods.
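The abstract does not spell out the loss, so the following PyTorch sketch is only a plausible rendering of a CAMs-based distillation objective. The function names (`class_activation_maps`, `cam_distillation_loss`), the per-map normalization, and the L2 penalty between teacher and student maps are illustrative assumptions, not the authors' exact formulation; they only show how class-wise spatial knowledge could be transferred.

```python
import torch
import torch.nn.functional as F


def class_activation_maps(features, fc_weight):
    """Compute CAMs from final conv features and classifier weights.

    features:  (B, C, H, W) feature maps before global pooling
    fc_weight: (K, C) weights of the final linear classifier, K classes
    returns:   (B, K, H, W), one activation map per class
    """
    # Each class map is a weighted sum of the feature channels.
    return torch.einsum('kc,bchw->bkhw', fc_weight, features)


def cam_distillation_loss(student_feats, student_fc_w,
                          teacher_feats, teacher_fc_w):
    """L2 distance between normalized teacher and student CAMs.

    The spatial resizing, normalization, and MSE penalty below are
    illustrative choices, not necessarily the paper's objective.
    """
    s_cams = class_activation_maps(student_feats, student_fc_w)
    t_cams = class_activation_maps(teacher_feats, teacher_fc_w)
    # Match spatial sizes if the two backbones differ in resolution.
    if s_cams.shape[-2:] != t_cams.shape[-2:]:
        s_cams = F.interpolate(s_cams, size=t_cams.shape[-2:],
                               mode='bilinear', align_corners=False)
    # Normalize each class map so the loss compares spatial patterns
    # rather than absolute activation magnitudes.
    s_cams = F.normalize(s_cams.flatten(2), dim=-1)
    t_cams = F.normalize(t_cams.flatten(2), dim=-1)
    # The teacher is frozen, so its maps are detached from the graph.
    return F.mse_loss(s_cams, t_cams.detach())
```

In training, a term like this would typically be added to the usual multi-label binary cross-entropy loss with a weighting hyperparameter, so the student fits the ground-truth labels while mimicking the teacher's class-wise activation patterns.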

Related research:

- 08/12/2023: Multi-Label Knowledge Distillation. Existing knowledge distillation methods typically work by imparting the ...
- 09/16/2018: Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection. Multi-label image classification is a fundamental but challenging task t...
- 05/10/2023: Explainable Knowledge Distillation for On-device Chest X-Ray Classification. Automated multi-label chest X-rays (CXR) image classification has achiev...
- 05/23/2022: Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation. Multi-Label Image Classification (MLIC) approaches usually exploit label...
- 04/01/2021: Is Label Smoothing Truly Incompatible with Knowledge Distillation: An Empirical Study. This work aims to empirically clarify a recently discovered perspective ...
- 02/14/2021: Comprehensive Comparative Study of Multi-Label Classification Methods. Multi-label classification (MLC) has recently received increasing intere...
- 08/01/2023: NormKD: Normalized Logits for Knowledge Distillation. Logit based knowledge distillation gets less attention in recent years s...
