MTSMAE: Masked Autoencoders for Multivariate Time-Series Forecasting

10/04/2022
by   Peiwang Tang, et al.

Large-scale self-supervised pre-training of Transformer architectures has significantly boosted performance on various tasks in natural language processing (NLP) and computer vision (CV). However, there is little research on processing multivariate time-series with pre-trained Transformers, and in particular, masking strategies for self-supervised learning on time-series remain largely unexplored. Unlike language and images, the information density of time-series makes this research more difficult, and the challenge is compounded by the fact that previous patch embedding and masking methods do not transfer directly. In this paper, guided by the data characteristics of multivariate time-series, we propose a patch embedding method and present a self-supervised pre-training approach based on Masked Autoencoders (MAE), called MTSMAE, which significantly improves performance over supervised learning without pre-training. We evaluate our method on several common multivariate time-series datasets from different fields and with different characteristics; the experimental results demonstrate that our method significantly outperforms the current state of the art.
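The core ideas in the abstract, splitting a multivariate series into patches and masking most of them before encoding, can be sketched as follows. This is a minimal illustration in NumPy, not the paper's actual MTSMAE architecture: the patch length, mask ratio, and the flattening of variables into each patch token are assumptions chosen for clarity.

```python
import numpy as np

def patch_embed(series, patch_len):
    # series: (time_steps, num_vars) multivariate time-series.
    # Split the time axis into non-overlapping patches of length patch_len
    # and flatten each patch into one token (illustrative choice).
    t, c = series.shape
    n = t // patch_len
    return series[: n * patch_len].reshape(n, patch_len * c)

def random_mask(patches, mask_ratio, rng):
    # MAE-style masking: keep a random subset of patch tokens and hide
    # the rest; only the visible tokens would be fed to the encoder.
    n = patches.shape[0]
    n_keep = int(n * (1 - mask_ratio))
    keep_idx = np.sort(rng.permutation(n)[:n_keep])
    return patches[keep_idx], keep_idx

rng = np.random.default_rng(0)
x = rng.normal(size=(96, 7))         # e.g. 96 steps, 7 variables
p = patch_embed(x, patch_len=12)     # -> 8 patches of size 12 * 7 = 84
visible, keep_idx = random_mask(p, mask_ratio=0.75, rng=rng)
```

With a 75% mask ratio, only 2 of the 8 patch tokens remain visible; during pre-training, a decoder would be asked to reconstruct the masked patches from them.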

