FormerTime: Hierarchical Multi-Scale Representations for Multivariate Time Series Classification

02/20/2023
by   Mingyue Cheng, et al.

Deep learning-based algorithms, e.g., convolutional networks, have significantly advanced the multivariate time series classification (MTSC) task. Nevertheless, they are limited in modeling long-range dependencies due to the nature of convolution operations. Recent advances have shown the potential of transformers to capture long-range dependencies. However, directly applying transformers to the MTSC task incurs severe issues, such as fixed-scale representations, lack of temporal invariance, and quadratic time complexity, because of the distinct properties of time series data. To tackle these issues, we propose FormerTime, a hierarchical representation model for improving the classification capacity on the MTSC task. In the proposed FormerTime, we employ a hierarchical network architecture to produce multi-scale feature maps. Besides, a novel transformer encoder is designed, in which an efficient temporal reduction attention layer and a well-informed contextual positional encoding strategy are developed. To sum up, FormerTime exhibits three merits: (1) learning hierarchical multi-scale representations from time series data, (2) inheriting the strengths of both transformers and convolutional networks, and (3) tackling the efficiency challenges incurred by the self-attention mechanism. Extensive experiments performed on 10 publicly available datasets from the UEA archive verify the superiority of FormerTime compared to previous competitive baselines.
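The efficiency gain of a temporal reduction attention layer can be illustrated with a minimal sketch: queries keep the full sequence length, while keys and values are downsampled in time by a reduction factor, shrinking the attention matrix from (L, L) to (L, L/r). The function name, pooling choice, and parameters below are illustrative assumptions, not the paper's exact layer.

```python
import numpy as np

def temporal_reduction_attention(x, r=4):
    """Single-head self-attention with temporally reduced keys/values.

    x : (L, d) input sequence.
    r : temporal reduction factor (assumed here to use average pooling
        over non-overlapping windows; the paper's layer may differ).

    Queries attend over L/r reduced positions instead of L, so the
    score matrix costs O(L^2 / r) rather than O(L^2).
    """
    L, d = x.shape
    q = x                                                       # (L, d) full-length queries
    kv = x[: (L // r) * r].reshape(L // r, r, d).mean(axis=1)   # (L/r, d) pooled keys/values

    scores = q @ kv.T / np.sqrt(d)                              # (L, L/r) scaled dot products
    scores -= scores.max(axis=-1, keepdims=True)                # numerically stable softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ kv                                            # (L, d) attended output

out = temporal_reduction_attention(np.random.randn(128, 16), r=4)
print(out.shape)  # (128, 16)
```

Stacking such layers stage by stage, each operating on a progressively shorter sequence, is one way a hierarchical encoder can yield multi-scale feature maps while keeping attention affordable.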
