Hodge-Aware Contrastive Learning

09/14/2023
by Alexander Möllers, et al.

Simplicial complexes prove effective in modeling data with multiway dependencies, such as data defined along the edges of networks or within other higher-order structures. Their spectrum can be decomposed into three interpretable subspaces via the Hodge decomposition, a result that has proven foundational in numerous applications. We leverage this decomposition to develop a contrastive self-supervised learning approach for processing simplicial data and generating embeddings that encapsulate specific spectral information. Specifically, we encode the pertinent data invariances through simplicial neural networks and devise augmentations that yield positive contrastive examples with suitable spectral properties for downstream tasks. Additionally, we reweight the significance of negative examples in the contrastive loss according to the similarity of their Hodge components to the anchor. By encouraging a stronger separation among less similar instances, we obtain an embedding space that reflects the spectral properties of the data. Numerical results on two standard edge-flow classification tasks show superior performance, even when compared with supervised learning techniques. Our findings underscore the importance of adopting a spectral perspective for contrastive learning on higher-order data.
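The Hodge decomposition the abstract relies on splits any edge flow into mutually orthogonal gradient, curl, and harmonic components, computed from the incidence matrices of the simplicial complex. Below is a minimal NumPy sketch of this decomposition; the toy complex, its orientations, and the `hodge_decompose` helper are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

# Toy complex: nodes {0,1,2,3}, oriented edges
# (0,1), (0,2), (1,2), (1,3), (2,3), and one filled triangle (0,1,2).
# B1 is the node-to-edge incidence matrix, B2 the edge-to-triangle one.
B1 = np.array([
    [-1, -1,  0,  0,  0],
    [ 1,  0, -1, -1,  0],
    [ 0,  1,  1,  0, -1],
    [ 0,  0,  0,  1,  1],
], dtype=float)
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)  # boundary of (0,1,2)

def hodge_decompose(flow, B1, B2):
    """Project an edge flow onto im(B1^T) (gradient), im(B2) (curl),
    and the remaining harmonic subspace ker(L1)."""
    grad = B1.T @ np.linalg.lstsq(B1 @ B1.T, B1 @ flow, rcond=None)[0]
    curl = B2 @ np.linalg.lstsq(B2.T @ B2, B2.T @ flow, rcond=None)[0]
    harm = flow - grad - curl
    return grad, curl, harm

flow = np.array([1.0, 0.5, -0.2, 0.3, 0.7])
g, c, h = hodge_decompose(flow, B1, B2)

# The three components are pairwise orthogonal and sum back to the flow.
assert np.allclose(g + c + h, flow)
assert abs(g @ c) < 1e-9 and abs(g @ h) < 1e-9 and abs(c @ h) < 1e-9
```

Since `B1 @ B2 = 0` by construction, the gradient and curl subspaces are automatically orthogonal; the harmonic remainder lies in the kernel of the Hodge Laplacian `L1 = B1.T @ B1 + B2 @ B2.T`, which is the structure the paper's augmentations and negative-example weighting are designed around.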


