
Time Series Generation with Masked Autoencoder

01/14/2022
by Mengyue Zha, et al.
The Hong Kong University of Science and Technology

This paper shows that masked autoencoders with interpolators (InterpoMAE) are scalable self-supervised generators for time series. InterpoMAE masks random patches of the input time series and restores the missing patches in the latent space with an interpolator. The core design choice is that InterpoMAE uses an interpolator rather than mask tokens to restore the latent representations of the missing patches, which enables more efficient and effective capture of temporal dynamics with bidirectional information. InterpoMAE also allows explicit control over the diversity of synthetic data by changing the size and number of masked patches. Our approach consistently and significantly outperforms state-of-the-art (SoTA) benchmarks for unsupervised time series generation on several real-world datasets. The synthetic data produced show promising scaling behavior in various downstream tasks such as data augmentation, imputation and denoising.
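As a rough illustration of the mechanism described above, the minimal PyTorch sketch below masks random patches of a time series, encodes only the visible patches, and fills in the latent codes of the masked positions by linear interpolation over patch indices instead of using learned mask tokens. All module names, dimensions, and the choice of a simple linear interpolator are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical sketch of the InterpoMAE idea from the abstract: mask random
# patches, encode only the visible ones, and reconstruct the latent codes of
# the masked patches with an interpolator (here, plain linear interpolation
# over patch positions) rather than learned mask tokens.

import torch
import torch.nn as nn


class InterpoMAESketch(nn.Module):
    def __init__(self, patch_len=16, d_model=64, n_channels=1):
        super().__init__()
        self.patch_len = patch_len
        self.encoder = nn.Linear(patch_len * n_channels, d_model)  # stand-in encoder
        self.decoder = nn.Linear(d_model, patch_len * n_channels)  # stand-in decoder

    def forward(self, x, mask_ratio=0.5):
        # x: (batch, n_patches, patch_len * n_channels)
        b, n, _ = x.shape
        n_masked = int(n * mask_ratio)
        perm = torch.randperm(n)
        visible_idx = perm[n_masked:].sort()[0]
        masked_idx = perm[:n_masked].sort()[0]

        # Encode only the visible patches.
        z_visible = self.encoder(x[:, visible_idx])  # (b, n_visible, d_model)

        # Restore latent codes at masked positions by interpolating between
        # the nearest visible patches on each side (per latent dimension).
        z_full = torch.zeros(b, n, z_visible.size(-1), device=x.device)
        z_full[:, visible_idx] = z_visible
        for i in masked_idx.tolist():
            left = visible_idx[visible_idx < i]
            right = visible_idx[visible_idx > i]
            if len(left) and len(right):
                l, r = left[-1].item(), right[0].item()
                w = (i - l) / (r - l)
                z_full[:, i] = (1 - w) * z_full[:, l] + w * z_full[:, r]
            elif len(left):                      # no visible patch to the right
                z_full[:, i] = z_full[:, left[-1].item()]
            else:                                # no visible patch to the left
                z_full[:, i] = z_full[:, right[0].item()]

        # Decode every patch; a training loss would compare the reconstructed
        # masked patches against the corresponding patches of the input.
        recon = self.decoder(z_full)             # (b, n_patches, patch_len * n_channels)
        return recon, masked_idx
```

In this sketch the diversity of generated data can be varied, as the abstract describes, by changing `mask_ratio` and `patch_len`; the real model would use a learned interpolator and transformer encoder/decoder blocks rather than the single linear layers used here for brevity.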

