CDSA: Cross-Dimensional Self-Attention for Multivariate, Geo-tagged Time Series Imputation

05/23/2019
by   Jiawei Ma, et al.

Many real-world applications involve multivariate, geo-tagged time series data: at each location, multiple sensors record corresponding measurements. For example, an air quality monitoring system records PM2.5, CO, and other pollutants at each station. The resulting time series often contain missing values due to device outages or communication errors. To impute these missing values, state-of-the-art methods are built on Recurrent Neural Networks (RNNs), which process each time stamp sequentially and therefore cannot directly model the relationship between distant time stamps. Recently, the self-attention mechanism has been proposed for sequence modeling tasks such as machine translation, significantly outperforming RNNs because the relationship between every pair of time stamps can be modeled explicitly. In this paper, we are the first to adapt the self-attention mechanism to multivariate, geo-tagged time series data. To jointly capture self-attention across multiple dimensions (time, location, and sensor measurements) while maintaining low computational complexity, we propose a novel approach called Cross-Dimensional Self-Attention (CDSA), which processes each dimension sequentially yet in an order-independent manner. Our extensive experiments on four real-world datasets, including three standard benchmarks and our newly collected NYC-traffic dataset, demonstrate that our approach outperforms state-of-the-art imputation and forecasting methods. A detailed systematic analysis confirms the effectiveness of our design choices.
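The abstract describes applying self-attention along each dimension of a (time, location, measurement) tensor in turn. The following is a minimal NumPy sketch of that idea, not the paper's implementation: identity matrices stand in for the learned query/key/value projections, and the paper's specific mechanism for making the result order-independent is not reproduced here.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over the first axis of X.

    X has shape (n, d): n positions along the attended dimension,
    d features from flattening the remaining dimensions.
    Identity projections stand in for learned Q/K/V weights.
    """
    n, d = X.shape
    Q, K, V = X, X, X
    scores = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # (n, n) attention weights
    return scores @ V

def cross_dimensional_attention(T):
    """Apply self-attention along each axis of a 3-D tensor, one axis at a time.

    T has shape (time, location, measurement). For each dimension we
    move it to the front, flatten the rest, attend, and restore the shape.
    """
    out = T
    for axis in range(out.ndim):
        moved = np.moveaxis(out, axis, 0)            # attended dim first
        flat = moved.reshape(moved.shape[0], -1)     # (n, d)
        attended = self_attention(flat).reshape(moved.shape)
        out = np.moveaxis(attended, 0, axis)
    return out

# Toy tensor: 4 time stamps, 3 locations, 2 sensor measurements.
T = np.random.default_rng(0).normal(size=(4, 3, 2))
Y = cross_dimensional_attention(T)
assert Y.shape == T.shape  # attention preserves the tensor shape
```

Attending over one flattened dimension at a time keeps the attention matrices small (n × n per dimension) instead of attending over all time × location × measurement positions jointly, which is the computational-complexity motivation the abstract mentions.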

