Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification

07/07/2021
by Xiaohan Xing, et al.

Medical images for training deep classification models are typically scarce, making these models prone to overfitting the training data. Studies have shown that knowledge distillation (KD), especially the mean-teacher framework, which is more robust to perturbations, can help mitigate overfitting. However, directly transferring KD methods from computer vision to medical image classification yields inferior performance, as medical images suffer from higher intra-class variance and class imbalance. To address these issues, we propose a novel Categorical Relation-preserving Contrastive Knowledge Distillation (CRCKD) algorithm, which takes the commonly used mean-teacher model as the supervisor. Specifically, we propose a Class-guided Contrastive Distillation (CCD) module that pulls together positive image pairs from the same class in the teacher and student models while pushing apart negative image pairs from different classes. With this regularization, the feature distribution of the student model shows higher intra-class similarity and inter-class variance. In addition, we propose a Categorical Relation Preserving (CRP) loss to distill the teacher's relational knowledge in a robust and class-balanced manner. Together, CCD and CRP allow CRCKD to distill relational knowledge more comprehensively. Extensive experiments on the HAM10000 and APTOS datasets demonstrate the superiority of the proposed method.
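To make the class-guided contrastive idea concrete, the sketch below shows one plausible InfoNCE-style formulation: for each student feature, teacher features that share its class label act as positives and all other teacher features act as negatives. This is an illustrative assumption of how such a loss could be written, not the paper's exact CCD objective; the function name, temperature value, and numpy implementation are all hypothetical.

```python
import numpy as np

def class_guided_contrastive_loss(f_s, f_t, labels, tau=0.1):
    """Hypothetical sketch of a class-guided contrastive distillation loss.

    For each student feature (anchor), teacher features with the same class
    label are treated as positives and all others as negatives. This is an
    illustrative InfoNCE-style variant, not the paper's exact CCD loss.
    """
    # L2-normalize student and teacher features so dot products are cosines
    f_s = f_s / np.linalg.norm(f_s, axis=1, keepdims=True)
    f_t = f_t / np.linalg.norm(f_t, axis=1, keepdims=True)

    sim = f_s @ f_t.T / tau                          # (B, B) similarity logits
    exp_sim = np.exp(sim - sim.max(axis=1, keepdims=True))  # stable softmax
    prob = exp_sim / exp_sim.sum(axis=1, keepdims=True)

    pos_mask = labels[:, None] == labels[None, :]    # same-class pairs
    # average log-probability of the positive teacher features per anchor
    pos_log_prob = np.log(prob + 1e-12) * pos_mask
    loss = -(pos_log_prob.sum(axis=1) / pos_mask.sum(axis=1)).mean()
    return loss
```

When student and teacher features of the same class are tightly aligned and well separated from other classes, the positives dominate the softmax and the loss approaches its minimum, which is the behavior the CCD regularization aims for.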

research

08/06/2020  MED-TEX: Transferring and Explaining Knowledge with Less Data from Pretrained Medical Imaging Models
Deep neural network based image classification methods usually require a...

04/28/2023  CORSD: Class-Oriented Relational Self Distillation
Knowledge distillation conducts an effective model compression method wh...

07/27/2023  Contrastive Knowledge Amalgamation for Unsupervised Image Classification
Knowledge amalgamation (KA) aims to learn a compact student model to han...

07/26/2022  Robust and Efficient Segmentation of Cross-domain Medical Images
Efficient medical image segmentation aims to provide accurate pixel-wise...

06/24/2020  Learning Interclass Relations for Image Classification
In standard classification, we typically treat class categories as indep...

08/17/2023  Learning Through Guidance: Knowledge Distillation for Endoscopic Image Classification
Endoscopy plays a major role in identifying any underlying abnormalities...

03/22/2022  SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images
Skin cancer is one of the most common types of malignancy, affecting a l...
