Knowledge Distillation approach towards Melanoma Detection

10/14/2022
by Md Shakib Khan, et al.

Melanoma is regarded as the most threatening of all skin cancers, and there is a pressing need for systems that can aid in its early detection and enable timely treatment. Recent methods are geared towards machine-learning-based systems in which the task is posed as image classification: tagging dermoscopic images of skin lesions as melanoma or non-melanoma. Although these methods show promising accuracy, they are computationally expensive to train, which calls into question their deployability in clinical settings or on memory-constrained devices. To address this issue, we focus on building simple, performant models with few layers (fewer than ten, compared to hundreds) and fewer learnable parameters (0.26 million (M), compared to 42.5M), using knowledge distillation with the goal of detecting melanoma from dermoscopic images. First, we train a ResNet-50 teacher model to detect melanoma. Using the teacher, we then train a student model, the Distilled Student Network (DSNet), with around 0.26M parameters via knowledge distillation, achieving an accuracy of 91.7%. We compare DSNet against pre-trained models such as MobileNet, VGG-16, Inception-V3, EfficientNet-B0, ResNet-50 and ResNet-101. Our approach also fares well on inference runtime compared to other pre-trained models (2.57 seconds versus 14.55 seconds). We find that DSNet (0.26M parameters), which is 15 times smaller, consistently outperforms EfficientNet-B0 (4M parameters) on both melanoma and non-melanoma detection across Precision, Recall and F1 scores.
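Teacher-student distillation of the kind described above is typically trained on a weighted mix of two terms: a soft-target loss (KL divergence between the teacher's and student's temperature-scaled output distributions) and the usual hard-label cross-entropy. The following is a minimal NumPy sketch of such a loss; the temperature `T`, mixing weight `alpha`, and function names are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T yields softer probabilities.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style KD loss:
    alpha * T^2 * KL(teacher_T || student_T) + (1 - alpha) * CE(student, labels).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Per-example KL divergence between softened distributions.
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    # Standard cross-entropy against the hard (ground-truth) labels.
    p_hard = softmax(student_logits, 1.0)
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1.0 - alpha) * ce))
```

In training, the teacher's logits are precomputed (or computed with gradients disabled), and only the student's parameters are updated against this combined objective; when the student's predictions match the teacher's, the KL term vanishes and only the hard-label term remains.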


