Leveraging Time Irreversibility with Order-Contrastive Pre-training

11/04/2021
by Monica Agrawal, et al.

Label-scarce, high-dimensional domains such as healthcare present a challenge for modern machine learning techniques. To overcome the difficulties posed by a lack of labeled data, we explore an "order-contrastive" method for self-supervised pre-training on longitudinal data. We sample pairs of time segments, switch the order for half of them, and train a model to predict whether a given pair is in the correct order. Intuitively, the ordering task allows the model to attend to the least time-reversible features (for example, features that indicate progression of a chronic disease). The same features are often useful for downstream tasks of interest. To quantify this, we study a simple theoretical setting where we prove a finite-sample guarantee for the downstream error of a representation learned with order-contrastive pre-training. Empirically, in synthetic and longitudinal healthcare settings, we demonstrate the effectiveness of order-contrastive pre-training in the small-data regime over supervised learning and other self-supervised pre-training baselines. Our results indicate that pre-training methods designed for particular classes of distributions and downstream tasks can improve the performance of self-supervised learning.
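
The pre-training task described above is simple enough to sketch in code. Below is a minimal, illustrative PyTorch implementation, not the authors' exact setup: the GRU encoder, segment length, and pair-sampling scheme are assumptions made for the sketch. It samples two adjacent segments from each series, swaps the order for a random half, and trains a classifier to predict whether each pair is in the correct temporal order.

    import torch
    import torch.nn as nn

    def make_order_pairs(series, seg_len):
        """Sample two adjacent segments per sequence; swap order for half."""
        B, T, _ = series.shape
        starts = torch.randint(0, T - 2 * seg_len + 1, (B,))
        a = torch.stack([s[i:i + seg_len] for s, i in zip(series, starts)])
        b = torch.stack([s[i + seg_len:i + 2 * seg_len] for s, i in zip(series, starts)])
        labels = torch.randint(0, 2, (B,))            # 1 = correct temporal order
        flip = labels == 0
        a[flip], b[flip] = b[flip].clone(), a[flip].clone()
        return a, b, labels.float()

    class OrderClassifier(nn.Module):
        def __init__(self, dim, hidden=64):
            super().__init__()
            self.encoder = nn.GRU(dim, hidden, batch_first=True)
            self.head = nn.Linear(2 * hidden, 1)      # logit: "is (a, b) in order?"

        def forward(self, a, b):
            _, ha = self.encoder(a)                   # final hidden state per segment
            _, hb = self.encoder(b)
            return self.head(torch.cat([ha[-1], hb[-1]], dim=-1)).squeeze(-1)

    # One pre-training step on synthetic data of shape (batch, time, features).
    model = OrderClassifier(dim=8)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(32, 100, 8)
    a, b, y = make_order_pairs(x, seg_len=10)
    loss = nn.functional.binary_cross_entropy_with_logits(model(a, b), y)
    opt.zero_grad(); loss.backward(); opt.step()

After pre-training, the encoder's representations would be reused for the downstream task, for example by fitting a linear probe on the small labeled subset.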


