Gated Transformer Networks for Multivariate Time Series Classification

by Minghao Liu, et al.

Deep learning models (primarily convolutional networks and LSTMs) for time series classification have been studied broadly by the community, with wide applications in domains such as healthcare, finance, industrial engineering, and IoT. Meanwhile, Transformer networks have recently achieved frontier performance on various natural language processing and computer vision tasks. In this work, we explore a simple extension of the current Transformer networks with gating, named Gated Transformer Networks (GTN), for the multivariate time series classification problem. With a gating mechanism that merges two towers of Transformer, which model the channel-wise and step-wise correlations respectively, we show how GTN is naturally and effectively suited to the multivariate time series classification task. We conduct comprehensive experiments on thirteen datasets with a full ablation study. Our results show that GTN achieves results competitive with current state-of-the-art deep learning models. We also explore the attention maps for the natural interpretability of GTN on time series modeling. Our preliminary results provide a strong baseline for Transformer networks on the multivariate time series classification task and lay the foundation for future research.
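The gating described above can be sketched in a few lines: the two tower outputs are concatenated, a linear layer produces one logit per tower, a softmax turns those logits into gate weights, and each tower's features are scaled by its gate before the final concatenation. This is a minimal NumPy sketch under assumed shapes; the variable names (`h_step`, `h_chan`) and dimensions are illustrative, not taken from the paper's implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
batch, d = 4, 8  # hypothetical batch size and per-tower feature dimension

# Flattened encodings from the two Transformer towers (stand-ins here)
h_step = rng.standard_normal((batch, d))  # step-wise (temporal) tower
h_chan = rng.standard_normal((batch, d))  # channel-wise tower

# Gating: a linear layer on the concatenated features yields two logits,
# and softmax converts them into gate weights g = [g_step, g_chan]
W = rng.standard_normal((2 * d, 2))
b = np.zeros(2)
g = softmax(np.concatenate([h_step, h_chan], axis=-1) @ W + b)

# Merge: scale each tower's features by its gate, then concatenate
merged = np.concatenate([g[:, :1] * h_step, g[:, 1:] * h_chan], axis=-1)
print(merged.shape)  # (4, 16)
```

In practice the gate logits would be produced by a learned layer trained jointly with both towers; the softmax ensures the two gate weights sum to one, so the model learns a soft trade-off between temporal and channel-wise evidence per example.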




