Improving Position Encoding of Transformers for Multivariate Time Series Classification

05/26/2023
by   Navid Mohammadi Foumani, et al.

Transformers have demonstrated outstanding performance in many deep learning applications. When applied to time series data, transformers require effective position encoding to capture the ordering of the series. The efficacy of position encoding in time series analysis is not well studied and remains controversial, e.g., whether it is better to inject absolute position encoding, relative position encoding, or a combination of the two. To clarify this, we first review existing absolute and relative position encoding methods as applied to time series classification. We then propose a new absolute position encoding method dedicated to time series data, called time Absolute Position Encoding (tAPE). Our method incorporates the series length and input embedding dimension into the absolute position encoding. Additionally, we propose a computationally efficient implementation of Relative Position Encoding (eRPE) to improve generalisability for time series. Combining tAPE/eRPE with convolution-based input encoding, we then propose a novel multivariate time series classification (MTSC) model named ConvTran to improve the position and data embedding of time series. The proposed absolute and relative position encoding methods are simple and efficient; they can be easily integrated into transformer blocks and used for downstream tasks such as forecasting, extrinsic regression, and anomaly detection. Extensive experiments on 32 multivariate time series datasets show that our model is significantly more accurate than state-of-the-art convolution- and transformer-based models. Code and models are open-sourced at <https://github.com/Navidfoumani/ConvTran>.
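As a reading aid, the PyTorch sketch below illustrates one plausible realisation of the two mechanisms named in the abstract. It is a minimal sketch based solely on the description above: the class names, the `d_model / L` frequency rescaling in tAPE, and the post-softmax per-offset scalar bias in eRPE are assumptions, not a copy of the authors' code; consult the linked repository for the actual implementation.

```python
# Hedged sketch of the two position-encoding ideas described in the abstract.
# All specifics below are assumptions inferred from the abstract's wording.

import math
import torch
import torch.nn as nn


class TAPE(nn.Module):
    """time Absolute Position Encoding (tAPE), sketched.

    Assumption: tAPE rescales the vanilla sinusoidal frequencies by
    d_model / L so the encoding adapts to series length and embedding size.
    Assumes an even d_model.
    """

    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)
        )
        # Length/dimension-aware rescaling (the assumed tAPE modification).
        scale = d_model / max_len
        pe[:, 0::2] = torch.sin(position * div_term * scale)
        pe[:, 1::2] = torch.cos(position * div_term * scale)
        self.register_buffer("pe", pe.unsqueeze(0))  # (1, max_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        return x + self.pe[:, : x.size(1)]


class ERPEAttention(nn.Module):
    """Single-head attention with an efficient relative position bias (eRPE), sketched.

    Assumption: a learnable scalar per relative offset (2*max_len - 1 values)
    is gathered by indexing and added to the attention weights, avoiding the
    cost of materialising per-pair relative-position embeddings.
    """

    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.rel_bias = nn.Parameter(torch.zeros(2 * max_len - 1))
        self.max_len = max_len
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, L, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        # Gather a bias for every (i, j) pair from the 1-D table by offset.
        idx = torch.arange(L, device=x.device)
        rel = idx[:, None] - idx[None, :] + self.max_len - 1  # (L, L) offsets
        attn = attn + self.rel_bias[rel]  # assumed post-softmax bias placement
        return attn @ v


# Example usage with hypothetical shapes (batch=8, length=100, d_model=64):
# x = torch.randn(8, 100, 64)
# x = TAPE(d_model=64, max_len=100)(x)
# out = ERPEAttention(d_model=64, max_len=100)(x)
```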


