Self-supervised Pretraining and Transfer Learning Enable Flu and COVID-19 Predictions in Small Mobile Sensing Datasets

05/26/2022
by Mike A. Merrill, et al.

Detailed mobile sensing data from phones, watches, and fitness trackers offer an unparalleled opportunity to quantify and act upon previously unmeasurable behavioral changes in order to improve individual health and accelerate responses to emerging diseases. Unlike in natural language processing and computer vision, deep representation learning has yet to broadly impact this domain, in which the vast majority of research and clinical applications still rely on manually defined features and boosted tree models, or even forgo predictive modeling altogether due to insufficient accuracy. This is due to unique challenges in the behavioral health domain: very small datasets (~10^1 participants) that frequently contain missing data, long time series with critical long-range dependencies (length > 10^4), and extreme class imbalances (> 10^3:1). Here, we introduce a neural architecture for multivariate time series classification designed to address these unique domain challenges. Our proposed behavioral representation learning approach combines novel tasks for self-supervised pretraining and transfer learning to address data scarcity, and captures long-range dependencies across long-history time series through transformer self-attention following convolutional neural network-based dimensionality reduction. We propose an evaluation framework aimed at reflecting expected real-world performance in plausible deployment scenarios. Concretely, we demonstrate (1) performance improvements over baselines of up to 0.15 ROC AUC across five prediction tasks, (2) transfer learning-induced performance improvements in 16 scenarios, and (3) the potential of transfer learning in novel disease scenarios through an exploratory case study of zero-shot COVID-19 prediction on an independent dataset. Finally, we discuss potential implications for medical surveillance testing.
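The core architectural idea in the abstract (convolutional downsampling to shorten a very long sensor time series, then self-attention over the shortened sequence to capture long-range dependencies) can be illustrated with a highly simplified, single-head NumPy sketch. All shapes, weights, and the mean-pooling classifier head here are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def conv1d_downsample(x, w, stride):
    """Strided 1-D convolution with ReLU.
    x: (T, C_in) raw time series; w: (k, C_in, C_out) filter bank.
    A large stride shortens the sequence so attention becomes tractable."""
    k, c_in, c_out = w.shape
    out_len = (x.shape[0] - k) // stride + 1
    out = np.empty((out_len, c_out))
    for i in range(out_len):
        seg = x[i * stride : i * stride + k]              # (k, C_in)
        out[i] = np.tensordot(seg, w, axes=([0, 1], [0, 1]))
    return np.maximum(out, 0.0)                           # ReLU

def self_attention(h):
    """Single-head scaled dot-product self-attention with identity
    query/key/value projections (a toy stand-in for a transformer layer)."""
    d = h.shape[1]
    scores = h @ h.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)           # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ h

def classify(x, w_conv, stride, w_out):
    """CNN downsampling -> self-attention -> mean pool -> sigmoid."""
    h = conv1d_downsample(x, w_conv, stride)  # shorten the sequence
    h = self_attention(h)                     # mix information globally
    logit = h.mean(axis=0) @ w_out            # pool over time, linear head
    return 1.0 / (1.0 + np.exp(-logit))       # probability of positive class

# Example: a length-10^4 series (as in the abstract) with 8 sensor channels.
rng = np.random.default_rng(0)
x = rng.normal(size=(10_000, 8))
w_conv = rng.normal(size=(50, 8, 16)) * 0.1
w_out = rng.normal(size=(16,))
p = classify(x, w_conv, stride=25, w_out=w_out)   # a probability in (0, 1)
```

With kernel 50 and stride 25, the 10,000-step input is reduced to 399 attention positions, so the quadratic cost of self-attention applies to the shortened sequence rather than the raw one. The real model additionally uses self-supervised pretraining and transfer learning, which this sketch omits.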


Related research

- 07/09/2021: Transformer-Based Behavioral Representation Learning Enables Transfer Learning for Mobile Sensing in Small Datasets
- 10/04/2022: MTSMAE: Masked Autoencoders for Multivariate Time-Series Forecasting
- 03/26/2021: Multi-source Transfer Learning with Ensemble for Financial Time Series Forecasting
- 05/01/2023: Self-supervised Activity Representation Learning with Incremental Data: An Empirical Study
- 06/23/2023: Variance-Covariance Regularization Improves Representation Learning
- 08/26/2023: Multivariate Time Series Classification with Dual Attention Network
