Tabular Transformers for Modeling Multivariate Time Series

11/03/2020
by Inkit Padhi, et al.

Tabular datasets are ubiquitous in data science applications. Given their importance, it seems natural to apply state-of-the-art deep learning algorithms in order to fully unlock their potential. Here we propose neural network models for tabular time series that can optionally leverage their hierarchical structure. This results in two architectures: one for learning representations that is analogous to BERT and can be pre-trained end-to-end and used in downstream tasks, and one that is akin to GPT and can be used to generate realistic synthetic tabular sequences. We demonstrate our models on two datasets: a synthetic credit card transaction dataset, where the learned representations are used for fraud detection and synthetic data generation, and a real pollution dataset, where the learned encodings are used to predict atmospheric pollutant concentrations. Code and data are available at https://github.com/IBM/TabFormer.
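The hierarchical idea can be illustrated with a minimal sketch: a field-level transformer encodes the categorical fields within each table row, and a sequence-level (BERT-like) transformer then attends over the resulting row embeddings across time. The sketch below is not the authors' TabFormer code; the class names, pooling choice, and hyperparameters are illustrative assumptions in PyTorch.

```python
# Hypothetical sketch of a hierarchical tabular encoder (not the TabFormer code).
import torch
import torch.nn as nn


class HierarchicalTabularEncoder(nn.Module):
    def __init__(self, field_vocab_sizes, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # One embedding table per (quantized/categorical) tabular field.
        self.field_embeddings = nn.ModuleList(
            [nn.Embedding(v, d_model) for v in field_vocab_sizes]
        )
        field_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        # Field-level transformer: attends across the fields of a single row.
        self.field_encoder = nn.TransformerEncoder(field_layer, num_layers=1)
        seq_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        # Sequence-level transformer: attends across the rows of the time series.
        self.sequence_encoder = nn.TransformerEncoder(seq_layer, num_layers=n_layers)

    def forward(self, x):
        # x: (batch, seq_len, n_fields) integer field ids
        batch, seq_len, n_fields = x.shape
        fields = torch.stack(
            [emb(x[..., i]) for i, emb in enumerate(self.field_embeddings)], dim=2
        )  # (batch, seq_len, n_fields, d_model)
        fields = fields.view(batch * seq_len, n_fields, -1)
        row_repr = self.field_encoder(fields).mean(dim=1)  # pool fields -> row embedding
        row_repr = row_repr.view(batch, seq_len, -1)
        return self.sequence_encoder(row_repr)  # (batch, seq_len, d_model)


# Example: 8 sequences, each with 10 rows of 3 categorical fields.
model = HierarchicalTabularEncoder(field_vocab_sizes=[100, 50, 20])
rows = torch.randint(0, 20, (8, 10, 3))
print(model(rows).shape)  # torch.Size([8, 10, 64])
```

For BERT-style pre-training, masked field ids would be predicted from these sequence-level outputs; a GPT-style variant would instead decode rows autoregressively.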


