MED-TEX: Transferring and Explaining Knowledge with Less Data from Pretrained Medical Imaging Models

08/06/2020
by Thanh Nguyen-Duc, et al.

Deep neural network based image classification methods usually require a large amount of training data and lack interpretability, both of which are critical concerns in the medical imaging domain. In this paper, we develop a novel knowledge distillation and model interpretation framework for medical image classification that jointly addresses these two issues. Specifically, to address the data-hungriness issue, we propose to learn a small student model with less data by distilling knowledge only from a cumbersome pretrained teacher model. To interpret the teacher model as well as to assist the learning of the student, an explainer module is introduced to highlight the regions of an input medical image that are important for the teacher's predictions. Furthermore, the joint framework is trained in a principled way derived from an information-theoretic perspective. The performance of our framework is demonstrated through comprehensive experiments on knowledge distillation and model interpretation tasks, in comparison with state-of-the-art methods, on a fundus disease dataset.
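The abstract does not give code, so the following is only a minimal sketch of the teacher-student-explainer setup it describes, written in PyTorch. The module names (student, explainer), loss weights, and the standard softened-KL distillation term are illustrative assumptions; MED-TEX derives its actual joint objective from an information-theoretic formulation, which may differ from this fixed mixture.

    # Minimal sketch of a teacher-student-explainer distillation step.
    # Assumptions: PyTorch/torchvision backbones stand in for the actual
    # networks, and the loss is a generic KD + cross-entropy mixture.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import torchvision.models as models

    teacher = models.resnet50(pretrained=True).eval()  # cumbersome pretrained teacher (frozen)
    student = models.resnet18(num_classes=2)           # small student trained with less data

    # Explainer: produces a per-pixel importance map for the teacher's prediction.
    explainer = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
    )

    def distillation_step(x, y, T=4.0, alpha=0.5):
        with torch.no_grad():
            t_logits = teacher(x)            # teacher predictions, no extra labels needed
        mask = explainer(x)                  # highlight regions important to the teacher
        s_logits = student(x * mask)         # student sees the masked (explained) input
        # Soft-label distillation: KL between temperature-softened distributions.
        kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                      F.softmax(t_logits / T, dim=1),
                      reduction="batchmean") * T * T
        ce = F.cross_entropy(s_logits, y)    # hard-label term on the small labeled set
        return alpha * kd + (1 - alpha) * ce

In this sketch the explainer is trained jointly with the student because its mask sits on the student's input path; the actual framework additionally ties the mask to the teacher's behavior through the information-theoretic objective described in the paper.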


