
One Transformer for All Time Series: Representing and Training with Time-Dependent Heterogeneous Tabular Data

by   Simone Luetto, et al.

There is growing interest in applying Deep Learning techniques to tabular data, in order to replicate the success of other Artificial Intelligence areas in this structured domain. Particularly interesting is the case in which tabular data have a time dependence, such as financial transactions. However, the heterogeneity of tabular values, in which categorical elements are mixed with numerical items, makes this adaptation difficult. In this paper we propose a Transformer architecture to represent heterogeneous time-dependent tabular data, in which numerical features are represented using a set of frequency functions and the whole network is uniformly trained with a unique loss function.
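The abstract's idea of representing a numerical feature "using a set of frequency functions" can be illustrated with a sinusoidal embedding, in the spirit of positional encodings. This is only a minimal sketch under assumed choices (geometrically spaced frequencies, sine/cosine pairs, the `frequency_embed` name); the paper's exact frequency set, embedding dimension, and training loss are not specified here.

```python
import numpy as np

def frequency_embed(values, dim=8, base=10000.0):
    """Map scalar numeric features to vectors of sinusoids at
    geometrically spaced frequencies (illustrative sketch only).

    values : array-like of shape (n,)
    dim    : even embedding dimension (dim/2 sin + dim/2 cos terms)
    """
    half = dim // 2
    # Frequencies base**0, base**(-1/half), ..., spanning several scales
    freqs = base ** (-np.arange(half) / half)
    # Outer product: each value multiplied by each frequency -> (n, half)
    angles = np.outer(np.atleast_1d(np.asarray(values, dtype=float)), freqs)
    # Concatenate sine and cosine components -> (n, dim)
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

emb = frequency_embed([0.5, 120.0], dim=8)
print(emb.shape)  # (2, 8)
```

Embeddings of this form let numerical values share the same vector space as learned categorical embeddings, so a single Transformer can attend over both feature types uniformly.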


Forecaster: A Graph Transformer for Forecasting Spatial and Time-Dependent Data

Spatial and time-dependent data is of interest in many applications. Thi...

Deep Transformer Networks for Time Series Classification: The NPP Safety Case

A challenging part of dynamic probabilistic risk assessment for nuclear ...

Generating virtual scenarios of multivariate financial data for quantitative trading applications

In this paper, we present a novel approach to the generation of virtual ...

A Spectral Enabled GAN for Time Series Data Generation

Time dependent data is a main source of information in today's data driv...

A Temporal Type-2 Fuzzy System for Time-dependent Explainable Artificial Intelligence

Explainable Artificial Intelligence (XAI) is a paradigm that delivers tr...

A fourth-order compact time-splitting method for the Dirac equation with time-dependent potentials

In this paper, we present an approach to deal with the dynamics of the D...

Towards a context-dependent numerical data quality evaluation framework

This paper focuses on numeric data, with emphasis on distinct characteri...