Self-supervised Transformer for Multivariate Clinical Time-Series with Missing Values

07/29/2021
by Sindhu Tipirneni, et al.

Multivariate time-series (MVTS) data are frequently observed in critical care settings and are typically characterized by excessive missingness and irregular time intervals. Existing approaches for learning representations in this domain handle these issues by aggregating or imputing values, which suppresses fine-grained information and adds undesirable noise/overhead to the machine learning model. To tackle this challenge, we propose STraTS (Self-supervised Transformer for Time-Series), a model which bypasses these pitfalls by treating a time-series as a set of observation triplets instead of the traditional dense matrix representation. It employs a novel Continuous Value Embedding (CVE) technique to encode continuous times and variable values without discretization. Its Transformer component, built on multi-head attention layers, learns contextual triplet embeddings while avoiding the recurrence and vanishing-gradient problems of recurrent architectures. Since many healthcare datasets also suffer from limited availability of labeled data, STraTS leverages unlabeled data to learn better representations through a self-supervised time-series forecasting task. Experiments on real-world multivariate clinical time-series benchmark datasets show that STraTS outperforms state-of-the-art methods on mortality prediction, especially when labeled data is limited. Finally, we also present an interpretable version of STraTS which can identify important measurements in the time-series data.
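The triplet view described in the abstract can be sketched in a few lines. Below is a minimal, hypothetical NumPy illustration (not the authors' implementation): each observation is a (time, variable, value) triplet, the CVE is modeled as a small one-to-many feed-forward network mapping a scalar to a d-dimensional vector with no binning, and a triplet embedding sums the time, variable, and value embeddings. Weights are random for illustration only; in the actual model the time and value CVEs are separate trained networks.

```python
import numpy as np

# A clinical time-series as a set of (time, variable, value) triplets rather
# than a dense resampled matrix. Variable names here are illustrative.
triplets = [
    (0.5, "heart_rate", 82.0),
    (1.2, "sbp", 118.0),
    (4.7, "heart_rate", 90.0),  # irregular intervals need no resampling
]

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative choice)

# Continuous Value Embedding (CVE) as a one-to-many feed-forward net:
# scalar -> d-dim vector, avoiding discretization into bins.
# Random weights stand in for trained parameters.
W1, b1 = rng.normal(size=(d, 1)), rng.normal(size=(d,))
W2 = rng.normal(size=(d, d))

def cve(x: float) -> np.ndarray:
    """Embed a continuous scalar: tanh(W1 * x + b1) @ W2."""
    return np.tanh(W1 @ np.array([x]) + b1) @ W2

# Variable identities use a lookup table, like word embeddings.
var_emb = {name: rng.normal(size=(d,)) for name in {t[1] for t in triplets}}

def triplet_embedding(t: float, var: str, val: float) -> np.ndarray:
    # The triplet embedding sums its three component embeddings.
    return cve(t) + var_emb[var] + cve(val)

E = np.stack([triplet_embedding(*tr) for tr in triplets])
print(E.shape)  # (3, 8)
```

The resulting set of triplet embeddings is what a Transformer encoder with multi-head attention would then contextualize; because the set is unordered and each triplet carries its own time embedding, no imputation or fixed-grid resampling is required.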


Related research

- A Transformer-based Framework for Multivariate Time Series Representation Learning (10/06/2020)
- Multi-view Integration Learning for Irregularly-sampled Clinical Time Series (01/25/2021)
- Self-supervised predictive coding and multimodal fusion advance patient deterioration prediction in fine-grained time resolution (10/29/2022)
- Multi-resolution Networks For Flexible Irregular Time Series Modeling (Multi-FIT) (04/30/2019)
- Don't Pay Attention to the Noise: Learning Self-supervised Representations of Light Curves with a Denoising Time Series Transformer (07/06/2022)
