Self-Supervised Pretraining Improves Performance and Inference Efficiency in Multiple Lung Ultrasound Interpretation Tasks

09/05/2023
by Blake VanBerlo, et al.

In this study, we investigated whether self-supervised pretraining could produce a neural network feature extractor applicable to multiple classification tasks in B-mode lung ultrasound analysis. When fine-tuning on three lung ultrasound tasks, pretrained models improved the average across-task area under the receiver operating characteristic curve (AUC) by 0.032 and 0.061 on local and external test sets, respectively. Compact nonlinear classifiers trained on features output by a single pretrained model did not improve performance across all tasks; however, they did reduce inference time by 49%. When training with only 1% of the available labels, pretrained models outperformed fully supervised models, with a maximum observed test AUC increase of 0.396 for the task of view classification. Overall, the results indicate that self-supervised pretraining is useful for producing initial weights for lung ultrasound classifiers.

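The efficiency result in the abstract rests on sharing one self-supervised-pretrained feature extractor across several downstream classifiers, so each B-mode frame is encoded once and every task adds only a compact nonlinear head. The sketch below illustrates that arrangement; it is not the authors' implementation. The backbone choice (ResNet-50), the hidden width, and the task names other than view classification are placeholder assumptions in a PyTorch-style setup.

```python
# Minimal sketch (not the paper's code): one frozen, self-supervised-pretrained
# backbone shared by several lung ultrasound tasks, each served by a compact
# nonlinear (MLP) head. Features are computed once per image and reused.
import torch
import torch.nn as nn
import torchvision.models as models


class MultiTaskLUSClassifier(nn.Module):
    def __init__(self, num_classes_per_task, feature_dim=2048):
        super().__init__()
        # Hypothetical backbone; in practice its weights would come from a
        # self-supervised pretraining objective rather than random init.
        backbone = models.resnet50(weights=None)
        backbone.fc = nn.Identity()            # expose pooled 2048-d features
        self.backbone = backbone
        for p in self.backbone.parameters():   # freeze the shared extractor
            p.requires_grad = False
        # One compact nonlinear classifier per downstream task.
        self.heads = nn.ModuleDict({
            task: nn.Sequential(
                nn.Linear(feature_dim, 128),
                nn.ReLU(),
                nn.Linear(128, n_classes),
            )
            for task, n_classes in num_classes_per_task.items()
        })

    def forward(self, x):
        feats = self.backbone(x)               # single shared forward pass
        return {task: head(feats) for task, head in self.heads.items()}


# Example with three hypothetical binary tasks (only "view" is named in the abstract).
model = MultiTaskLUSClassifier({"view": 2, "task_b": 2, "task_c": 2})
model.eval()
with torch.no_grad():
    logits = model(torch.randn(4, 3, 224, 224))  # batch of 4 B-mode frames
```

Because the backbone runs once per frame and only the small heads differ across tasks, adding a task costs a few matrix multiplications rather than another full network pass, which is the kind of saving the abstract's inference-time comparison describes.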