Time Series Contrastive Learning with Information-Aware Augmentations

03/21/2023
by Dongsheng Luo, et al.

Various contrastive learning approaches have been proposed in recent years and have achieved significant empirical success. While effective and prevalent, contrastive learning has been less explored for time series data. A key component of contrastive learning is selecting appropriate augmentations that impose some prior to construct feasible positive samples, such that an encoder can be trained to learn robust and discriminative representations. Unlike the image and language domains, where “desired” augmented samples can be generated with rules of thumb guided by prefabricated human priors, the ad-hoc manual selection of time series augmentations is hindered by their diverse and human-unrecognizable temporal structures. How to find augmentations of time series data that are meaningful for a given contrastive learning task and dataset remains an open question. In this work, we address the problem by encouraging both high fidelity and variety based on information theory. A theoretical analysis leads to criteria for selecting feasible data augmentations. On top of that, we propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning. Experiments on various datasets show highly competitive performance, with up to a 12.0% reduction in MSE on forecasting tasks and up to a 3.7% relative improvement in accuracy on classification tasks over the leading baselines.
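Since the abstract stops at the high-level idea, the following minimal PyTorch sketch may help fix intuition. It is illustrative only and not the authors' implementation: learnable Gumbel-softmax weights mix a small pool of candidate time-series augmentations, and the two resulting views are trained with a standard InfoNCE contrastive loss. The augmentation names (`jitter`, `scaling`, `time_mask`), the `AdaptiveAugment` module, and the toy encoder are all assumptions made for illustration.

```python
# Illustrative sketch only (not the InfoTS code): adaptive selection over
# candidate time-series augmentations with a standard InfoNCE contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

def jitter(x, sigma=0.03):        # additive Gaussian noise
    return x + sigma * torch.randn_like(x)

def scaling(x, sigma=0.1):        # per-series magnitude scaling
    factor = 1.0 + sigma * torch.randn(x.size(0), 1, x.size(2), device=x.device)
    return x * factor

def time_mask(x, ratio=0.1):      # zero out a random contiguous window
    T = x.size(1)
    w = max(1, int(ratio * T))
    start = torch.randint(0, T - w + 1, (1,)).item()
    x = x.clone()
    x[:, start:start + w, :] = 0.0
    return x

CANDIDATES = [jitter, scaling, time_mask]   # hypothetical candidate pool

class AdaptiveAugment(nn.Module):
    """Mixes candidate augmentations with learnable Gumbel-softmax weights."""
    def __init__(self, n_cands):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(n_cands))

    def forward(self, x):
        views = torch.stack([aug(x) for aug in CANDIDATES], dim=0)   # (K, B, T, C)
        w = F.gumbel_softmax(self.logits, tau=1.0, hard=False)       # (K,)
        return (w.view(-1, 1, 1, 1) * views).sum(dim=0)              # soft mixture

def info_nce(z1, z2, temperature=0.5):
    """Standard NT-Xent / InfoNCE loss between two batches of representations."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                               # (B, B)
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

# Toy usage: a small 1D-conv encoder over (batch, time, channels) series.
encoder = nn.Sequential(nn.Conv1d(1, 32, 7, padding=3), nn.ReLU(),
                        nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, 64))
augmenter = AdaptiveAugment(len(CANDIDATES))
opt = torch.optim.Adam(list(encoder.parameters()) + list(augmenter.parameters()), lr=1e-3)

x = torch.randn(16, 128, 1)                  # batch of univariate series
v1, v2 = augmenter(x), augmenter(x)          # two stochastic augmented views
z1 = encoder(v1.transpose(1, 2))             # Conv1d expects (B, C, T)
z2 = encoder(v2.transpose(1, 2))
loss = info_nce(z1, z2)
loss.backward()
opt.step()
```

In the paper the selection is guided by information-theoretic fidelity and variety criteria rather than the plain contrastive loss used above; the sketch only shows how augmentation choice can be made learnable alongside the encoder.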


