MHCCL: Masked Hierarchical Cluster-wise Contrastive Learning for Multivariate Time Series

12/02/2022, by Qianwen Meng, et al.

Learning semantic-rich representations from raw unlabeled time series data is critical for downstream tasks such as classification and forecasting. Contrastive learning has recently shown promising representation learning capability in the absence of expert annotations. However, existing contrastive approaches generally treat each instance independently, which leads to false negative pairs between instances that actually share the same semantics. To tackle this problem, we propose MHCCL, a Masked Hierarchical Cluster-wise Contrastive Learning model, which exploits semantic information obtained from a hierarchical structure consisting of multiple latent partitions of multivariate time series. Motivated by the observation that fine-grained clustering preserves higher purity while coarse-grained clustering reflects higher-level semantics, we propose a novel downward masking strategy that filters out fake negatives and supplements positives by incorporating the multi-granularity information from the clustering hierarchy. In addition, a novel upward masking strategy is designed in MHCCL to remove outliers from the clusters at each partition and refine the prototypes, which speeds up the hierarchical clustering process and improves the clustering quality. We conduct experimental evaluations on seven widely used multivariate time series datasets. The results demonstrate the superiority of MHCCL over state-of-the-art approaches for unsupervised time series representation learning.
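
To make the cluster-wise contrastive objective and the downward masking idea more concrete, below is a minimal PyTorch-style sketch, not the authors' implementation: instances are contrasted against fine-grained cluster prototypes, and prototypes whose members mostly fall in the anchor's coarse-grained cluster are masked out of the negative set. All names and choices here (the temperature, the majority-vote mapping from fine to coarse clusters) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def cluster_wise_contrastive_loss(z, fine_labels, coarse_labels,
                                  fine_prototypes, temperature=0.1):
    """z: (B, D) instance embeddings; fine_labels / coarse_labels: (B,) cluster
    assignments from two levels of the clustering hierarchy; fine_prototypes:
    (K_fine, D) centroids of the fine-grained partition."""
    z = F.normalize(z, dim=1)
    protos = F.normalize(fine_prototypes, dim=1)

    # Similarity of every instance to every fine-grained prototype.
    logits = z @ protos.t() / temperature                        # (B, K_fine)

    # Map each fine-grained cluster to the majority coarse cluster of its members.
    k_fine = protos.size(0)
    proto_coarse = torch.full((k_fine,), -1, dtype=torch.long)
    for k in range(k_fine):
        members = coarse_labels[fine_labels == k]
        if members.numel() > 0:
            proto_coarse[k] = members.mode().values

    # Downward masking (illustrative): prototypes lying in the same coarse
    # cluster as the anchor are treated as likely fake negatives and removed
    # from the denominator; the anchor's own fine-grained prototype is kept
    # as the positive.
    fake_negative = proto_coarse.unsqueeze(0) == coarse_labels.unsqueeze(1)
    fake_negative[torch.arange(z.size(0)), fine_labels] = False
    logits = logits.masked_fill(fake_negative, float("-inf"))

    # Prototype-level InfoNCE: classify each instance into its fine-grained
    # cluster among the remaining (non-masked) prototypes.
    return F.cross_entropy(logits, fine_labels)


if __name__ == "__main__":
    B, D, K_fine, K_coarse = 32, 64, 8, 3
    z = torch.randn(B, D)
    fine = torch.randint(0, K_fine, (B,))
    coarse = torch.randint(0, K_coarse, (B,))
    prototypes = torch.randn(K_fine, D)
    print(cluster_wise_contrastive_loss(z, fine, coarse, prototypes).item())
```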


Related research

12/29/2022 · Deep Temporal Contrastive Clustering
Recently the deep learning has shown its advantage in representation lea...

05/30/2023 · Contrastive Shapelet Learning for Unsupervised Multivariate Time Series Representation Learning
Recent studies have shown great promise in unsupervised representation l...

02/01/2022 · HCSC: Hierarchical Contrastive Selective Coding
Hierarchical semantic structures naturally exist in an image dataset, in...

06/13/2022 · Contrastive Learning for Unsupervised Domain Adaptation of Time Series
Unsupervised domain adaptation (UDA) aims at learning a machine learning...

04/23/2023 · Capturing Fine-grained Semantics in Contrastive Graph Representation Learning
Graph contrastive learning defines a contrastive task to pull similar in...

02/08/2022 · Unsupervised Time-Series Representation Learning with Iterative Bilinear Temporal-Spectral Fusion
Unsupervised/self-supervised time series representation learning is a ch...

06/17/2019 · Nested partitions from hierarchical clustering statistical validation
We develop a greedy algorithm that is fast and scalable in the detection...
