A Survey of the Impact of Self-Supervised Pretraining for Diagnostic Tasks with Radiological Images

09/05/2023
by Blake VanBerlo et al.

Self-supervised pretraining has been observed to be effective at improving feature representations for transfer learning, leveraging large amounts of unlabelled data. This review summarizes recent research into its usage in X-ray, computed tomography, magnetic resonance, and ultrasound imaging, concentrating on studies that compare self-supervised pretraining to fully supervised learning for diagnostic tasks such as classification and segmentation. The most pertinent finding is that self-supervised pretraining generally improves downstream task performance compared to full supervision, most prominently when unlabelled examples greatly outnumber labelled examples. Based on the aggregate evidence, recommendations are provided for practitioners considering using self-supervised learning. Motivated by limitations identified in current research, directions and practices for future study are suggested, such as integrating clinical knowledge with theoretically justified self-supervised learning methods, evaluating on public datasets, growing the modest body of evidence for ultrasound, and characterizing the impact of self-supervised pretraining on generalization.

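To make the pretrain-then-fine-tune workflow described above concrete, the sketch below shows one common instantiation: SimCLR-style contrastive pretraining on unlabelled images, followed by supervised fine-tuning of the pretrained encoder on a smaller labelled set. This is an illustrative example only, not a method taken from the survey; the encoder choice (ResNet-18), the loss temperature, and the data-loader names (unlabelled_loader, labelled_loader) are assumptions for the sketch.

# Minimal sketch of the SSL-pretrain-then-fine-tune workflow, assuming a
# SimCLR-style contrastive objective and generic PyTorch data loaders.
# The loader names and hyperparameters are hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class SimCLRModel(nn.Module):
    """ResNet-18 encoder with a small projection head for contrastive pretraining."""

    def __init__(self, proj_dim: int = 128):
        super().__init__()
        backbone = models.resnet18(weights=None)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()  # keep the 512-d pooled features
        self.encoder = backbone
        self.projector = nn.Sequential(
            nn.Linear(feat_dim, 256), nn.ReLU(), nn.Linear(256, proj_dim)
        )

    def forward(self, x):
        return self.projector(self.encoder(x))


def nt_xent_loss(z1, z2, temperature: float = 0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) over two augmented views."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, d)
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))               # exclude self-similarity
    # The positive for view i is the other augmentation of the same image.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


def pretrain(model, unlabelled_loader, epochs: int = 10, device: str = "cpu"):
    """Self-supervised pretraining: each batch yields two augmented views, no labels."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    model.to(device).train()
    for _ in range(epochs):
        for view1, view2 in unlabelled_loader:
            loss = nt_xent_loss(model(view1.to(device)), model(view2.to(device)))
            opt.zero_grad()
            loss.backward()
            opt.step()


def finetune(model, labelled_loader, num_classes: int, epochs: int = 5, device: str = "cpu"):
    """Supervised fine-tuning of the pretrained encoder on the (smaller) labelled set."""
    classifier = nn.Linear(512, num_classes).to(device)  # 512 = ResNet-18 feature dim
    params = list(model.encoder.parameters()) + list(classifier.parameters())
    opt = torch.optim.Adam(params, lr=1e-4)
    model.encoder.to(device).train()
    for _ in range(epochs):
        for images, labels in labelled_loader:
            logits = classifier(model.encoder(images.to(device)))
            loss = F.cross_entropy(logits, labels.to(device))
            opt.zero_grad()
            loss.backward()
            opt.step()

In this setup, the contrastive stage exploits the large unlabelled pool, and only the final stage requires annotations, which is the regime in which the survey reports the clearest gains over training from scratch with full supervision.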

Related research

- Improving out-of-distribution generalization via multi-task self-supervised pretraining (03/30/2020)
- Exploring the Utility of Self-Supervised Pretraining Strategies for the Detection of Absent Lung Sliding in M-Mode Lung Ultrasound (04/05/2023)
- Self-Supervised Pretraining Improves Performance and Inference Efficiency in Multiple Lung Ultrasound Interpretation Tasks (09/05/2023)
- Adversarial Pretraining of Self-Supervised Deep Networks: Past, Present and Future (10/23/2022)
- SB-SSL: Slice-Based Self-Supervised Transformers for Knee Abnormality Classification from MRI (08/29/2022)
- Variance-Covariance Regularization Improves Representation Learning (06/23/2023)
- Improving Label-Deficient Keyword Spotting Using Self-Supervised Pretraining (10/04/2022)
