Classification of Diabetic Retinopathy Using Unlabeled Data and Knowledge Distillation

09/01/2020
by Sajjad Abbasi, et al.

Knowledge distillation allows knowledge to be transferred from a pre-trained model to another. However, it suffers from limitations and constraints, such as the requirement that the two models be architecturally similar. Knowledge distillation addresses some of the shortcomings of transfer learning by generalizing a complex model to a lighter one, but some parts of the knowledge may not be distilled sufficiently. In this paper, a novel knowledge distillation approach using transfer learning is proposed. The proposed method transfers the entire knowledge of a model to a new, smaller one. To accomplish this, unlabeled data are used in an unsupervised manner to transfer as much knowledge as possible to the new, slimmer model. The proposed method is particularly beneficial in medical image analysis, where labeled data are typically scarce. The approach is evaluated on the classification of images for diagnosing Diabetic Retinopathy on two publicly available datasets, Messidor and EyePACS. Simulation results demonstrate that the approach is effective in transferring knowledge from a complex model to a lighter one. Furthermore, experimental results illustrate that the performance of several small models improves significantly when unlabeled data and knowledge distillation are used.
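The core mechanism the abstract describes, training a small student to mimic a large teacher on unlabeled images, can be illustrated with a short sketch. The snippet below is a minimal, generic example of response-based knowledge distillation in PyTorch, not the authors' actual training code; the `teacher`, `student`, and `unlabeled_loader` objects are assumed placeholders. Because the loss compares the two models' output distributions, no ground-truth labels are needed.

```python
import torch
import torch.nn.functional as F

def distill_on_unlabeled(teacher, student, unlabeled_loader,
                         optimizer, temperature=4.0, device="cpu"):
    """Train `student` to mimic `teacher` on unlabeled images.

    Minimal sketch of response-based distillation: the loss is the
    KL divergence between temperature-softened teacher and student
    logits, so no ground-truth labels are required.
    """
    teacher.eval()
    student.train()
    for images in unlabeled_loader:      # batches of images only, no labels
        images = images.to(device)
        with torch.no_grad():            # the teacher only provides soft targets
            t_logits = teacher(images)
        s_logits = student(images)
        # KL divergence between softened distributions; the T^2 factor keeps
        # gradient magnitudes comparable across temperature settings.
        loss = F.kl_div(
            F.log_softmax(s_logits / temperature, dim=1),
            F.softmax(t_logits / temperature, dim=1),
            reduction="batchmean",
        ) * temperature ** 2
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

A higher temperature spreads probability mass over the non-argmax classes, exposing the teacher's "dark knowledge" about class similarities, which is the signal the student learns from when labels are unavailable.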


Related research

02/09/2020
Unlabeled Data Deployment for Classification of Diabetic Retinopathy Images Using Knowledge Transfer
Convolutional neural networks (CNNs) are extensively beneficial for medi...

05/27/2023
FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition
Transfer learning is a promising technique for medical image classificat...

02/17/2023
Explicit and Implicit Knowledge Distillation via Unlabeled Data
Data-free knowledge distillation is a challenging model lightweight task...

11/11/2020
Real-Time Decentralized Knowledge Transfer at the Edge
Proliferation of edge networks creates islands of learning agents workin...

07/15/2023
SoccerKDNet: A Knowledge Distillation Framework for Action Recognition in Soccer Videos
Classifying player actions from soccer videos is a challenging problem, ...

09/14/2021
Building Accurate Simple Models with Multihop
Knowledge transfer from a complex high performing model to a simpler and...

03/19/2021
Variational Knowledge Distillation for Disease Classification in Chest X-Rays
Disease classification relying solely on imaging data attracts great int...
