Teacher-Student chain for efficient semi-supervised histology image classification

03/17/2020
by   Shayne Shaw, et al.

Deep learning shows great potential for the domain of digital pathology. An automated digital pathology system could serve as a second reader, perform initial triage in large screening studies, or assist in reporting. However, it is expensive to exhaustively annotate large histology image databases, since medical specialists are a scarce resource. In this paper, we apply the semi-supervised teacher-student knowledge distillation technique proposed by Yalniz et al. (2019) to the task of quantifying prognostic features in colorectal cancer. We obtain accuracy improvements by extending this approach to a chain of students, where each student's predictions are used to train the next student, i.e. the student becomes the teacher. Using the chain approach, and only 0.5% labelled data (with the remainder as an unlabelled pool), we match the accuracy of training on 100% labelled data. At lower percentages of labelled data, similar gains in accuracy are seen, allowing some recovery of accuracy even from a poor initial choice of labelled training set. In conclusion, this approach shows promise for reducing the annotation burden, thus increasing the affordability of automated digital pathology systems.
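The chain described above can be sketched in a few lines. This is a minimal illustration with a toy nearest-centroid "model" on synthetic 2-D features, not the authors' pipeline (which trains deep networks on histology patches); the helper names `train_centroids` and `predict` are hypothetical. The loop captures the core idea: a teacher trained on a small labelled set pseudo-labels the unlabelled pool, each student is trained on labelled plus pseudo-labelled data, and the student then becomes the next teacher.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_centroids(X, y, n_classes):
    """Stand-in 'model': the per-class mean feature vector."""
    return np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])

def predict(centroids, X):
    """Assign each sample to its nearest class centroid."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Toy two-class data: a tiny labelled set and a large unlabelled pool.
n_classes = 2
X_lab = rng.normal(loc=[[0, 0]] * 5 + [[4, 4]] * 5, scale=1.0)
y_lab = np.array([0] * 5 + [1] * 5)
X_pool = np.concatenate([rng.normal([0, 0], 1.0, size=(200, 2)),
                         rng.normal([4, 4], 1.0, size=(200, 2))])

# Teacher trained on labelled data only, then a chain of students.
model = train_centroids(X_lab, y_lab, n_classes)
for step in range(3):
    # Teacher pseudo-labels the unlabelled pool.
    pseudo = predict(model, X_pool)
    # Student trains on labelled + pseudo-labelled data,
    # then becomes the teacher for the next iteration.
    X_train = np.concatenate([X_lab, X_pool])
    y_train = np.concatenate([y_lab, pseudo])
    model = train_centroids(X_train, y_train, n_classes)

y_true = np.array([0] * 200 + [1] * 200)
acc = (predict(model, X_pool) == y_true).mean()
```

Even with only 10 labelled points, each iteration re-estimates the model from the whole pool, which is how the chain can recover from a weak or unlucky initial labelled set.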

