Explainable Knowledge Distillation for On-device Chest X-Ray Classification

05/10/2023
by Chakkrit Termritthikun, et al.

Automated multi-label chest X-ray (CXR) image classification has achieved substantial progress in clinical diagnosis through sophisticated deep learning approaches. However, most deep models have high computational demands, which makes them less feasible for deployment on compact devices with limited computational resources. To overcome this problem, we propose a knowledge distillation (KD) strategy to create a compact deep learning model for real-time multi-label CXR image classification. We study several CNN and Transformer alternatives as the teacher to distill knowledge to a smaller student. We then employ explainable artificial intelligence (XAI) to provide visual explanations for the model decisions improved by KD. Our results on three benchmark CXR datasets show that our KD strategy improves the performance of the compact student model, making it a feasible choice for many hardware-limited platforms. For instance, when using DenseNet161 as the teacher network, EEEA-Net-C2 achieved an AUC of 83.7 on the ChestX-ray14 dataset, with similarly improved results on the CheXpert and PadChest datasets, while using only 4.7 million parameters and a computational cost of 0.3 billion FLOPs.
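As a rough illustration of the kind of teacher-student objective such a KD strategy involves, the sketch below combines a hard binary cross-entropy term on the ground-truth labels with a soft term matching the teacher's per-label outputs. This is a minimal PyTorch sketch under assumed settings: the `kd_loss` function, the `alpha`/`T` hyperparameters, the 14-label setup, and the ResNet-18 stand-in for the EEEA-Net-C2 student are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F
from torchvision import models

def kd_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Illustrative multi-label KD objective (not the paper's exact loss).

    Combines a hard BCE term on ground-truth labels with a soft term
    that matches the teacher's temperature-smoothed sigmoid outputs.
    """
    # Hard-label term: standard multi-label BCE against ground truth.
    hard = F.binary_cross_entropy_with_logits(student_logits, labels)
    # Soft-label term: per-label sigmoid targets from the teacher
    # (sigmoid rather than softmax, since pathology labels are not exclusive).
    soft_targets = torch.sigmoid(teacher_logits / T)
    soft = F.binary_cross_entropy_with_logits(student_logits / T, soft_targets)
    # T*T rescales the soft-term gradients, as in standard KD practice.
    return alpha * hard + (1.0 - alpha) * (T * T) * soft

# Toy setup: DenseNet161 teacher as in the paper; ResNet-18 is a
# hypothetical stand-in for the EEEA-Net-C2 student.
num_labels = 14  # e.g., the ChestX-ray14 pathology labels
teacher = models.densenet161(num_classes=num_labels).eval()
student = models.resnet18(num_classes=num_labels)

images = torch.randn(2, 3, 224, 224)          # dummy CXR batch
labels = torch.randint(0, 2, (2, num_labels)).float()

with torch.no_grad():                          # teacher is frozen
    t_logits = teacher(images)
s_logits = student(images)
kd_loss(s_logits, t_logits, labels).backward()
```

The per-label sigmoid targets are the natural multi-label analogue of the softened softmax used in single-label distillation; the visual explanations mentioned in the abstract would typically be produced separately, for example by applying a CAM-style XAI method to the trained student.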
