Efficient Representation Learning for Healthcare with Cross-Architectural Self-Supervision

08/19/2023
by Pranav Singh, et al.

In healthcare and biomedical applications, extreme computational requirements pose a significant barrier to adopting representation learning. Representation learning can enhance the performance of deep learning architectures by learning useful priors from limited medical data. However, state-of-the-art self-supervised techniques suffer from reduced performance when using smaller batch sizes or shorter pretraining epochs, which are more practical in clinical settings. We present Cross Architectural - Self Supervision (CASS) in response to this challenge. This novel siamese self-supervised learning approach synergistically leverages a Transformer and a Convolutional Neural Network (CNN) for efficient learning. Our empirical evaluation demonstrates that CASS-trained CNNs and Transformers outperform existing self-supervised learning methods across four diverse healthcare datasets. With only 1% labeled data for finetuning, CASS achieves a 3.8% average gain in performance; with 10% labeled data, it registers a 5.9% enhancement. Notably, CASS reduces pretraining time by 69% compared to state-of-the-art methods, making it more amenable to clinical implementation. We also demonstrate that CASS is considerably more robust to variations in batch size and pretraining epochs, making it a suitable candidate for machine learning in healthcare applications.
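The core idea of a cross-architectural siamese objective can be illustrated with a minimal sketch: a CNN and a Transformer each embed a view of the same image, and a similarity-based loss pulls the two representations together. This is a conceptual illustration only, not the authors' implementation; the cosine-similarity loss, the function names, and the toy random "embeddings" below are all assumptions for illustration.

```python
import math
import random

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def cass_style_loss(cnn_embedding, transformer_embedding):
    # Hypothetical cross-architectural objective: pull the CNN's and the
    # Transformer's representations of the same image toward each other.
    # The loss lies in [0, 2] and shrinks as the embeddings align.
    return 1.0 - cosine_similarity(cnn_embedding, transformer_embedding)

# Toy stand-ins for the two backbones' outputs on one augmented image.
random.seed(0)
cnn_out = [random.gauss(0.0, 1.0) for _ in range(8)]
vit_out = [random.gauss(0.0, 1.0) for _ in range(8)]

loss = cass_style_loss(cnn_out, vit_out)
print(f"cross-architecture loss: {loss:.4f}")  # in [0, 2]; near 0 once the two networks agree
```

Because each backbone serves as the other's training signal, no negative pairs or momentum encoder are required, which is one reason such siamese schemes can tolerate smaller batch sizes than contrastive methods.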



01/27/2023
Cross-Architectural Positive Pairs improve the effectiveness of Self-Supervised Learning
Existing self-supervised techniques have extreme computational requireme...

07/24/2020
Real-World Multi-Domain Data Applications for Generalizations to Clinical Settings
With promising results of machine learning based models in computer visi...

06/08/2022
CASS: Cross Architectural Self-Supervision for Medical Image Analysis
Recent advances in Deep Learning and Computer Vision have alleviated man...

06/07/2022
TriBYOL: Triplet BYOL for Self-Supervised Representation Learning
This paper proposes a novel self-supervised learning method for learning...

10/13/2022
The Hidden Uniform Cluster Prior in Self-Supervised Learning
A successful paradigm in representation learning is to perform self-supe...

05/13/2023
How to Train Your CheXDragon: Training Chest X-Ray Models for Transfer to Novel Tasks and Healthcare Systems
Self-supervised learning (SSL) enables label efficient training for mach...

08/29/2022
SB-SSL: Slice-Based Self-Supervised Transformers for Knee Abnormality Classification from MRI
The availability of large scale data with high quality ground truth labe...
