Self-supervised driven consistency training for annotation efficient histopathology image analysis

02/07/2021
by Chetan L. Srinidhi, et al.

Training a neural network with a large labeled dataset is still the dominant paradigm in computational histopathology. However, obtaining such exhaustive manual annotations is often expensive, laborious, and prone to inter- and intra-observer variability. While recent self-supervised and semi-supervised methods can alleviate this need by learning unsupervised feature representations, they still struggle to generalize well to downstream tasks when the number of labeled instances is small. In this work, we overcome this challenge by leveraging both task-agnostic and task-specific unlabeled data based on two novel strategies: i) a self-supervised pretext task that harnesses the underlying multi-resolution contextual cues in histology whole-slide images to learn a powerful supervisory signal for unsupervised representation learning; ii) a new teacher-student semi-supervised consistency paradigm that learns to effectively transfer the pretrained representations to downstream tasks based on prediction consistency with the task-specific unlabeled data. We carry out extensive validation experiments on three histopathology benchmark datasets across two classification tasks and one regression task, i.e., tumor metastasis detection, tissue type classification, and tumor cellularity quantification. Under limited-label settings, the proposed method yields tangible improvements, approaching and in some cases outperforming state-of-the-art self-supervised and supervised baselines. Furthermore, we empirically show that bootstrapping from self-supervised pretrained features is an effective way to improve task-specific semi-supervised learning on standard benchmarks. Code and pretrained models will be made available at: https://github.com/srinidhiPY/SSL_CR_Histo
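The teacher-student consistency idea in strategy (ii) can be sketched in simplified form. In this style of semi-supervised training, a teacher network's weights track an exponential moving average (EMA) of the student's, and the student is penalized when its predictions on unlabeled data diverge from the teacher's. The function names, the MSE-on-softmax consistency objective, and the momentum value below are illustrative assumptions, not the authors' exact implementation:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def consistency_loss(student_logits, teacher_logits):
    """Mean squared error between student and teacher class probabilities,
    a common consistency objective on unlabeled examples."""
    p_s = softmax(student_logits)
    p_t = softmax(teacher_logits)
    return sum((a - b) ** 2 for a, b in zip(p_s, p_t)) / len(p_s)

def ema_update(teacher_params, student_params, momentum=0.99):
    """Teacher weights track an exponential moving average of the student:
    t <- momentum * t + (1 - momentum) * s."""
    return [momentum * t + (1 - momentum) * s
            for t, s in zip(teacher_params, student_params)]
```

When student and teacher agree, the consistency term vanishes, so labeled supervision dominates; as the teacher stabilizes via EMA, the consistency term regularizes the student on task-specific unlabeled data.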


Related research

Self-Path: Self-supervision for Classification of Pathology Images with Limited Annotations (08/12/2020)
While high-resolution pathology images lend themselves well to `data hun...

Semi-Supervised Learning with Self-Supervised Networks (06/25/2019)
Recent advances in semi-supervised learning have shown tremendous potent...

Improving Self-supervised Learning with Hardness-aware Dynamic Curriculum Learning: An Application to Digital Pathology (08/16/2021)
Self-supervised learning (SSL) has recently shown tremendous potential t...

DSPNet: Towards Slimmable Pretrained Networks based on Discriminative Self-supervised Learning (07/13/2022)
Self-supervised learning (SSL) has achieved promising downstream perform...

Policy-Induced Self-Supervision Improves Representation Finetuning in Visual RL (02/12/2023)
We study how to transfer representations pretrained on source tasks to t...

Okapi: Generalising Better by Making Statistical Matches Match (11/07/2022)
We propose Okapi, a simple, efficient, and general method for robust sem...

Deep Semi-supervised Learning with Double-Contrast of Features and Semantics (11/28/2022)
In recent years, the field of intelligent transportation systems (ITS) h...
