FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition

05/27/2023
by   Marawan Elbatel, et al.

Transfer learning is a promising technique for medical image classification, particularly for long-tailed datasets. However, the scarcity of data in medical imaging domains often leads to overparameterization when fine-tuning large publicly available pre-trained models. Moreover, these large models are impractical to deploy in clinical settings due to their computational cost. To address these challenges, we propose FoPro-KD, a novel approach that unleashes the power of frequency patterns learned by frozen publicly available pre-trained models to enhance their transferability and compression. FoPro-KD comprises three modules: a Fourier prompt generator (FPG), effective knowledge distillation (EKD), and adversarial knowledge distillation (AKD). The FPG module learns to generate targeted perturbations conditioned on a target dataset, exploring the representations of a frozen pre-trained model trained on natural images. The EKD module exploits these generalizable representations through distillation to a smaller target model, while the AKD module further enhances the distillation process. Through these modules, FoPro-KD achieves significant improvements on long-tailed medical image classification benchmarks, demonstrating the potential of leveraging the frequency patterns learned by pre-trained models to improve transfer learning and to compress large pre-trained models for feasible deployment.
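The two core ingredients described above can be sketched roughly as follows: a Fourier-domain prompt that perturbs an input's amplitude spectrum to steer a frozen model, and a temperature-softened distillation loss (in the style of standard knowledge distillation) standing in for the EKD objective. The names `fourier_prompt`, `kd_loss`, `alpha`, and `T` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fourier_prompt(image, prompt, alpha=0.5):
    """Inject a frequency-domain prompt into an image (a sketch).

    The image's amplitude spectrum is scaled by a learnable `prompt`
    of the same spatial shape while the phase is preserved, and the
    result is transformed back to the spatial domain. `alpha` controls
    the perturbation strength.
    """
    spectrum = np.fft.fft2(image)
    amplitude, phase = np.abs(spectrum), np.angle(spectrum)
    amplitude = amplitude * (1.0 + alpha * prompt)   # targeted amplitude perturbation
    perturbed = amplitude * np.exp(1j * phase)       # recombine with original phase
    return np.real(np.fft.ifft2(perturbed))

def kd_loss(teacher_logits, student_logits, T=2.0):
    """Temperature-softened KL divergence between teacher and student
    predictions, as in classic knowledge distillation."""
    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)
    p = softmax(teacher_logits / T)  # soft teacher targets
    q = softmax(student_logits / T)  # soft student predictions
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)  # T^2 scaling keeps gradient magnitudes comparable
```

With an all-zero prompt the amplitude spectrum is unchanged, so the image is reconstructed exactly (up to floating-point error); a trained prompt would instead shift the input toward frequency patterns the frozen teacher responds to.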

