Tripletformer for Probabilistic Interpolation of Asynchronous Time Series

Asynchronous time series arise in many applications such as health care, astronomy, and climate science, and pose a significant challenge to standard deep learning architectures. Interpolating asynchronous time series is vital for many real-world tasks such as root cause analysis and medical diagnosis. In this paper, we propose a novel encoder-decoder architecture called Tripletformer for probabilistic interpolation of asynchronous time series. It operates on a set of observations in which each set element is a triple of time, channel, and value. Both the encoder and the decoder of the Tripletformer are built from attention layers and fully connected layers and are invariant to the order in which set elements are presented. We compare the proposed Tripletformer against a range of baselines on multiple real-world and synthetic asynchronous time series datasets, and the experimental results attest that it produces more accurate and more certain interpolations. Using the Tripletformer, we observe an improvement in negative log-likelihood error of up to 33% over the state-of-the-art model.
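To make the triple-based set representation concrete, the sketch below embeds each (time, channel, value) observation and pools the set with self-attention, so the resulting summary is invariant to the order of the observations. This is a minimal illustration under assumed design choices (sinusoidal time features, one-hot-style channel embeddings, linear value encoding, mean pooling); it is not the exact Tripletformer architecture, which the paper builds from stacked attention and fully connected layers.

```python
import numpy as np

def embed_triples(triples, num_channels, d=8):
    """Embed each (time, channel, value) observation into a d-dim vector.

    Hypothetical featurization for illustration: sinusoidal time features,
    a learned-style channel embedding, and a linear value projection.
    """
    rng = np.random.default_rng(0)          # fixed weights for the sketch
    W_ch = rng.standard_normal((num_channels, d))
    w_val = rng.standard_normal(d)
    feats = []
    for t, c, v in triples:
        time_feat = np.array([np.sin(t * 2.0 ** -k) for k in range(d)])
        feats.append(time_feat + W_ch[c] + v * w_val)
    return np.stack(feats)                  # shape: (n_observations, d)

def attention_pool(X):
    """Permutation-invariant set encoder: self-attention + mean pooling."""
    scores = X @ X.T / np.sqrt(X.shape[1])  # scaled dot-product scores
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)       # row-wise softmax
    attended = w @ X                        # attend over the set
    return attended.mean(axis=0)            # order-invariant summary

# Same observations presented in two different orders give the same code.
triples = [(0.1, 0, 1.5), (0.7, 2, -0.3), (1.3, 1, 0.9)]
z_fwd = attention_pool(embed_triples(triples, num_channels=3))
z_rev = attention_pool(embed_triples(triples[::-1], num_channels=3))
assert np.allclose(z_fwd, z_rev)
```

Because the attention scores are computed pairwise within the set and the pooling is a symmetric reduction, permuting the input triples permutes the attended rows consistently and leaves the pooled summary unchanged, which is the invariance property the abstract claims for both the encoder and the decoder.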

