Augmentation-Free Graph Contrastive Learning

04/11/2022
by   Haonan Wang, et al.

Graph contrastive learning (GCL) is the most representative and prevalent self-supervised learning approach for graph-structured data. Despite its remarkable success, existing GCL methods rely heavily on an augmentation scheme to learn representations that are invariant across different augmented views. In this work, we revisit this convention in GCL by examining the effect of augmentation techniques on graph data through the lens of spectral theory. We find that graph augmentations preserve the low-frequency components of the graph while perturbing its middle- and high-frequency components; this contributes to the success of GCL algorithms on homophilic graphs but hinders their application to heterophilic graphs, which favor high-frequency information. Motivated by this observation, we propose a novel, theoretically principled, and augmentation-free GCL method, named AF-GCL, that (1) leverages the features aggregated by a graph neural network, rather than augmentations, to construct the self-supervision signal and therefore (2) is less sensitive to the degree of graph homophily. Theoretically, we present a performance guarantee for AF-GCL as well as an analysis explaining its efficacy. Extensive experiments on 14 benchmark datasets with varying degrees of heterophily show that AF-GCL achieves competitive or better performance on homophilic graphs and outperforms all existing state-of-the-art GCL methods on heterophilic graphs, with significantly less computational overhead.
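The core idea the abstract describes, replacing augmented views with GNN-aggregated features as the source of positive pairs, can be illustrated with a minimal sketch. This is not the authors' exact algorithm: the toy graph, the parameter-free feature propagation, the top-k similarity rule for selecting positives, and the InfoNCE-style loss are all simplifying assumptions made for illustration.

```python
import numpy as np

# Toy graph: 6 nodes in two triangles (0-2 and 3-5) joined by one edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}.
A_hat = A + np.eye(n)
d = A_hat.sum(axis=1)
A_norm = A_hat / np.sqrt(np.outer(d, d))

rng = np.random.default_rng(0)
X = rng.normal(size=(n, 4))  # random node features for the sketch

# GNN-style aggregation (two propagation steps, no learned weights here).
H = A_norm @ A_norm @ X

# Cosine similarity of aggregated features; self-similarity is masked out.
Hn = H / np.linalg.norm(H, axis=1, keepdims=True)
S = Hn @ Hn.T
np.fill_diagonal(S, -np.inf)

# Each node's top-k most similar nodes serve as positives -- no augmentation.
k = 2
pos = np.argsort(-S, axis=1)[:, :k]

# InfoNCE-style contrastive loss over the similarity matrix.
tau = 0.5
logits = np.where(np.isneginf(S), -1e9, S) / tau
logZ = np.log(np.exp(logits).sum(axis=1))
loss = float(np.mean([logZ[i] - logits[i, pos[i]].mean() for i in range(n)]))
print(pos.shape, round(loss, 4))
```

Because the self-supervision signal comes from the aggregated features themselves, no augmented view of the graph is ever generated, which is also why this style of method avoids the double forward pass that augmentation-based GCL requires.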


