Self-supervised Contrastive Representation Learning for Semi-supervised Time-Series Classification

08/13/2022
by Emadeldeen Eldele, et al.

Learning time-series representations when only unlabeled data or few labeled samples are available can be a challenging task. Recently, contrastive self-supervised learning has shown great improvement in extracting useful representations from unlabeled data by contrasting different augmented views of the data. In this work, we propose a novel Time-Series representation learning framework via Temporal and Contextual Contrasting (TS-TCC) that learns representations from unlabeled data with contrastive learning. Specifically, we propose time-series-specific weak and strong augmentations and use their views to learn robust temporal relations in the proposed temporal contrasting module, in addition to learning discriminative representations in the proposed contextual contrasting module. We also conduct a systematic study of time-series data augmentation selection, which is a key part of contrastive learning. Furthermore, we extend TS-TCC to the semi-supervised learning setting and propose Class-Aware TS-TCC (CA-TCC), which benefits from the few available labeled samples to further improve the representations learned by TS-TCC. Specifically, we leverage the robust pseudo labels produced by TS-TCC to realize a class-aware contrastive loss. Extensive experiments show that linear evaluation of the features learned by our framework performs comparably with fully supervised training. Our framework also shows high efficiency in few-labeled-data and transfer learning scenarios. The code is publicly available at <https://github.com/emadeldeen24/TS-TCC>.
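To make two of the ideas above concrete, the sketch below illustrates (i) weak (jitter-and-scale) and strong (permutation-and-jitter) time-series augmentations of the kind the abstract refers to, and (ii) a class-aware contrastive loss that treats all batch samples sharing a pseudo label as positives. This is a minimal PyTorch sketch, not the authors' implementation: the function names, augmentation parameters, and the exact supervised-contrastive formulation are illustrative assumptions; see the linked repository for the official code.

```python
import torch
import torch.nn.functional as F


def weak_augment(x, jitter_sigma=0.05, scale_sigma=0.1):
    """Jitter-and-scale view: add small noise and rescale each channel."""
    # x: (batch, channels, time)
    noise = jitter_sigma * torch.randn_like(x)
    scale = 1.0 + scale_sigma * torch.randn(x.size(0), x.size(1), 1, device=x.device)
    return (x + noise) * scale


def strong_augment(x, n_segments=5, jitter_sigma=0.1):
    """Permutation-and-jitter view: shuffle temporal segments, then add noise."""
    batch, _, length = x.shape
    out = torch.empty_like(x)
    for i in range(batch):
        segments = torch.tensor_split(torch.arange(length, device=x.device), n_segments)
        order = torch.randperm(n_segments).tolist()
        idx = torch.cat([segments[j] for j in order])
        out[i] = x[i, :, idx]
    return out + jitter_sigma * torch.randn_like(x)


def class_aware_contrastive_loss(features, pseudo_labels, temperature=0.2):
    """Supervised-contrastive-style loss: batch samples sharing a (pseudo) label
    act as positives; the remaining samples act as negatives."""
    z = F.normalize(features, dim=1)                       # (batch, dim)
    sim = z @ z.t() / temperature                          # pairwise similarities
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, -1e9)                 # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    positives = (pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)) & ~self_mask
    pos_counts = positives.sum(dim=1).clamp(min=1)         # avoid division by zero
    return -(log_prob * positives.float()).sum(dim=1).div(pos_counts).mean()
```

In the paper's pipeline, the weak and strong views are encoded and fed to the temporal contrasting module (a cross-view prediction task over future timesteps) and the contextual contrasting module (an NT-Xent-style loss over context vectors); CA-TCC then adds a class-aware loss of roughly the above form, driven by pseudo labels that TS-TCC helps produce.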

research
06/26/2021
Time-Series Representation Learning via Temporal and Contextual Contrasting
Learning decent representations from unlabeled time-series data with tem...

research
03/02/2023
Multi-Task Self-Supervised Time-Series Representation Learning
Time-series representation learning can extract representations from dat...

research
10/13/2022
LEAVES: Learning Views for Time-Series Data in Contrastive Learning
Contrastive learning, a self-supervised learning method that can learn r...

research
03/17/2022
Mixing Up Contrastive Learning: Self-Supervised Representation Learning for Time Series
The lack of labeled data is a key challenge for learning useful represen...

research
06/19/2021
Learning Timestamp-Level Representations for Time Series with Hierarchical Contrastive Loss
This paper presents TS2Vec, a universal framework for learning timestamp...

research
11/27/2020
Self-Supervised Time Series Representation Learning by Inter-Intra Relational Reasoning
Self-supervised learning achieves superior performance in many domains b...

research
04/09/2023
Embarrassingly Simple MixUp for Time-series
Labeling time series data is an expensive task because of domain experti...
