Learning Robust and Consistent Time Series Representations: A Dilated Inception-Based Approach

06/11/2023
by Anh Duy Nguyen, et al.

Representation learning for time series has been an important research area for decades. With the emergence of foundation models, contrastive self-supervised learning has attracted considerable attention for this topic as a way to address a wide range of downstream tasks. However, contrastive time series processing still faces several challenges. First, no existing work considers noise, one of the critical factors affecting the efficacy of time series tasks. Second, there is a lack of efficient yet lightweight encoder architectures that learn representations robust across diverse downstream tasks. To fill these gaps, we introduce a novel sampling strategy that promotes consistent representation learning in the presence of noise in natural time series. In addition, we propose an encoder architecture that applies dilated convolution within Inception blocks, yielding a scalable and robust network with a wide receptive field. Experiments demonstrate that our method consistently outperforms state-of-the-art methods in forecasting, classification, and anomaly detection tasks, e.g., ranking first on more than two-thirds of the UCR classification datasets while using only 40% of the parameters of the second-best approach. The source code for the CoInception framework is available at https://github.com/anhduy0911/CoInception.
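To make the encoder idea concrete, below is a minimal sketch of an Inception-style block whose parallel branches use dilated 1-D convolutions, so the receptive field widens without adding parameters. This is an illustration of the general technique, not the authors' exact CoInception implementation: the branch width, dilation rates, activation, and "same"-padding scheme are assumptions made for the example.

```python
import torch
import torch.nn as nn


class DilatedInceptionBlock(nn.Module):
    """Illustrative Inception-style block with dilated 1-D convolution branches."""

    def __init__(self, in_channels: int, branch_channels: int = 32,
                 kernel_size: int = 3, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList()
        for d in dilations:
            # Padding chosen so the temporal length is preserved for odd kernels.
            pad = (kernel_size - 1) * d // 2
            self.branches.append(nn.Sequential(
                nn.Conv1d(in_channels, branch_channels, kernel_size,
                          padding=pad, dilation=d),
                nn.GELU(),
            ))
        # 1x1 convolution fuses the concatenated branch outputs.
        self.fuse = nn.Conv1d(branch_channels * len(dilations),
                              branch_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        out = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.fuse(out)


if __name__ == "__main__":
    block = DilatedInceptionBlock(in_channels=8)
    x = torch.randn(4, 8, 128)   # 4 series, 8 variables, 128 time steps
    print(block(x).shape)        # torch.Size([4, 32, 128])
```

Stacking such blocks with growing dilation rates is a common way to cover long temporal contexts cheaply; the paper's claimed parameter efficiency relative to the second-best approach would come from this kind of design, though the exact configuration should be taken from the released repository.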
