Resolution-Based Distillation for Efficient Histology Image Classification

01/11/2021
by Joseph DiPalma, et al.

Developing deep learning models to analyze histology images is computationally challenging, as the massive size of the images strains every part of the computing pipeline. This paper proposes a novel deep learning-based methodology for improving the computational efficiency of histology image classification. The proposed approach is robust when used with images at reduced input resolution and can be trained effectively with limited labeled data. Our method uses knowledge distillation (KD) to transfer knowledge from a teacher model pre-trained on the original high-resolution (HR) images to a student model trained on the same images at a much lower resolution. To address the lack of large-scale labeled histology image datasets, we perform KD in a self-supervised manner. We evaluate our approach on two histology image datasets associated with celiac disease (CD) and lung adenocarcinoma (LUAD). Our results show that a combination of KD and self-supervision allows the student model to approach, and in some cases surpass, the classification accuracy of the teacher while being much more efficient. Additionally, we observe an increase in student classification performance as the size of the unlabeled dataset increases, indicating that there is potential to scale further. For the CD data, our model outperforms the HR teacher model while requiring 4 times fewer computations. For the LUAD data, our student model results at 1.25x magnification are within 3% of the teacher model at 10x magnification, with a 64 times reduction in computational cost. Moreover, our CD outcomes benefit from performance scaling with the use of more unlabeled data: at 0.625x magnification, using unlabeled data improves accuracy by 4%. Thus, our approach improves the feasibility of deep learning solutions for digital pathology with standard computational hardware.
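The core mechanism described above, distilling a teacher trained on high-resolution (HR) patches into a student trained on the same patches at a much lower resolution without requiring labels, can be illustrated with a minimal sketch. The snippet below assumes a standard soft-label distillation loss and off-the-shelf ResNet-18 backbones; the paper's actual architectures, self-supervised objective, and hyperparameters (temperature, downsampling factor, optimizer) are not given in the abstract, so the values here are placeholders.

# Minimal sketch of resolution-based knowledge distillation.
# Models, temperature, and downsampling factor are illustrative assumptions,
# not the paper's exact configuration.
import torch
import torch.nn.functional as F
from torchvision import models

# Teacher: in practice this would load weights pre-trained on HR patches;
# random initialization here is only a placeholder.
teacher = models.resnet18(num_classes=2).eval()
# Student: trained on the same patches at a much lower resolution.
student = models.resnet18(num_classes=2)

optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
T = 4.0  # softmax temperature (assumed)

def kd_step(hr_patch):
    """One distillation step on a batch of (possibly unlabeled) HR patches."""
    # Student sees the same image downsampled to a much lower resolution
    # (factor 8 chosen arbitrarily for illustration).
    lr_patch = F.interpolate(hr_patch, scale_factor=1 / 8,
                             mode="bilinear", align_corners=False)
    with torch.no_grad():
        teacher_logits = teacher(hr_patch)
    student_logits = student(lr_patch)

    # Soft-label KD loss (Hinton et al.); no ground-truth labels are needed,
    # so the same step applies to unlabeled images.
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with a dummy batch of HR patches:
# loss = kd_step(torch.randn(8, 3, 224, 224))

Because the student only ever processes the downsampled input, its inference cost drops roughly with the square of the downsampling factor, which is the source of the computational savings reported in the abstract.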
