Can we Adopt Self-supervised Pretraining for Chest X-Rays?

11/23/2022
by Arsh Verma, et al.

Chest radiograph (or Chest X-Ray, CXR) is a popular medical imaging modality that is used by radiologists across the world to diagnose heart or lung conditions. Over the last decade, Convolutional Neural Networks (CNNs) have seen success in identifying pathologies in CXR images. Typically, these CNNs are pretrained on the standard ImageNet classification task, but this assumes the availability of large-scale annotated datasets. In this work, we analyze the utility of pretraining on unlabeled ImageNet or CXR datasets using various algorithms and in multiple settings. Some findings of our work include: (i) supervised training with labeled ImageNet learns strong representations that are hard to beat; (ii) self-supervised pretraining on ImageNet (~1M images) shows performance similar to self-supervised pretraining on a CXR dataset (~100K images); and (iii) the CNN trained on supervised ImageNet can be trained further with self-supervised CXR images, leading to improvements, especially when the downstream dataset is on the order of a few thousand images.
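For concreteness, here is a minimal PyTorch sketch of setting (iii): a ResNet-50 initialized from supervised ImageNet weights is pretrained further with a SimCLR-style contrastive objective on unlabeled CXR images before downstream finetuning. This is not the authors' code; the SimCLR objective, the ResNet-50 backbone, the dataset path, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Two random augmented "views" of each unlabeled CXR image, SimCLR-style.
augment = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.5, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(0.4, 0.4, 0.4),
    transforms.ToTensor(),
])

class TwoViews:
    """Return two independently augmented views of the same image."""
    def __init__(self, t):
        self.t = t
    def __call__(self, x):
        return self.t(x), self.t(x)

# Hypothetical unlabeled CXR folder; swap in your own dataset/loader.
cxr = datasets.ImageFolder("data/cxr_unlabeled", transform=TwoViews(augment))
loader = DataLoader(cxr, batch_size=256, shuffle=True, drop_last=True)

# Start from supervised ImageNet weights (finding i), then continue
# pretraining with a self-supervised objective on CXR images (finding iii).
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
backbone.fc = nn.Identity()                       # expose 2048-d features
projector = nn.Sequential(nn.Linear(2048, 512), nn.ReLU(), nn.Linear(512, 128))
model = nn.Sequential(backbone, projector).to(device)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

def nt_xent(z1, z2, tau=0.1):
    """SimCLR NT-Xent loss over a batch of paired embeddings."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)   # (2N, d), unit norm
    sim = z @ z.t() / tau                         # scaled cosine similarities
    n = z1.size(0)
    eye = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(eye, float("-inf"))          # drop self-similarity
    pos = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, pos)              # positive pair = other view

for (v1, v2), _ in loader:                        # folder labels are ignored
    loss = nt_xent(model(v1.to(device)), model(v2.to(device)))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Afterwards, drop the projector and finetune `backbone` on the labeled
# downstream CXR task (a few thousand images in the paper's setting).
```

Under the same assumptions, setting (i) corresponds to skipping the contrastive loop entirely, and setting (ii) to running the same self-supervised stage from a randomly initialized backbone on either ImageNet or CXR images.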

