Can Single-Pass Contrastive Learning Work for Both Homophilic and Heterophilic Graphs?

11/20/2022
by Haonan Wang, et al.

Existing graph contrastive learning (GCL) methods typically require two forward passes per instance to construct the contrastive loss. Despite their remarkable success, it is unclear whether such a dual-pass design is (theoretically) necessary. Moreover, empirical results have so far been limited to homophilic graph benchmarks. A natural question therefore arises: can we design a method that works for both homophilic and heterophilic graphs with a performance guarantee? To answer this, we analyze the concentration property of features obtained by neighborhood aggregation on both homophilic and heterophilic graphs, introduce a single-pass graph contrastive learning loss based on this property, and provide performance guarantees for the minimizer of the loss on downstream tasks. As a direct consequence of our analysis, we implement the Single-Pass Graph Contrastive Learning method (SP-GCL). Empirically, on 14 benchmark datasets with varying degrees of heterophily, the features learned by SP-GCL match or outperform existing strong baselines with significantly less computational overhead, verifying the usefulness of our findings in real-world cases.
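The single-pass idea, computing a contrastive loss from one forward pass rather than two augmented views, can be sketched as follows. This is an illustrative sketch, not the authors' SP-GCL implementation: the mean-aggregation step, the top-k-similarity rule for selecting positives, and the temperature value are all assumptions made here for illustration.

```python
import numpy as np

def aggregate(adj, feats):
    """Mean neighborhood aggregation: H = D^{-1} A X."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    return adj @ feats / deg

def single_pass_contrastive_loss(emb, k=2, tau=0.5):
    """Schematic single-pass contrastive loss: positives are the k most
    similar embeddings taken from the SAME (single) forward pass, so no
    second augmented view is needed."""
    z = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = z @ z.T / tau
    np.fill_diagonal(sim, -np.inf)          # exclude self-similarity
    # log-softmax over all other nodes (these act as the negatives)
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # top-k nearest nodes in embedding space serve as positives
    pos_idx = np.argsort(-sim, axis=1)[:, :k]
    pos_log_prob = np.take_along_axis(log_prob, pos_idx, axis=1)
    return -pos_log_prob.mean()

# Tiny example: a 3-node path graph with random features
adj = np.array([[0., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
X = np.random.RandomState(0).randn(3, 4)
loss = single_pass_contrastive_loss(aggregate(adj, X), k=1)
```

The key design point is that both positives and negatives come from one embedding matrix; the concentration of aggregated features is what makes nearby embeddings plausible positives.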


