HomoGCL: Rethinking Homophily in Graph Contrastive Learning

06/16/2023
by Wen-Zhi Li et al.

Contrastive learning (CL) has become the de facto paradigm for self-supervised learning on graphs, generally following the "augmenting-contrasting" scheme. However, we observe that, unlike CL in the computer vision domain, CL in the graph domain performs decently even without augmentation. We conduct a systematic analysis of this phenomenon and argue that homophily, i.e., the principle that "like attracts like", plays a key role in the success of graph CL. Motivated to leverage this property explicitly, we propose HomoGCL, a model-agnostic framework that expands the positive set with neighbor nodes weighted by neighbor-specific significances. Theoretically, HomoGCL introduces a stricter lower bound on the mutual information between raw node features and node embeddings in the augmented views. Furthermore, HomoGCL can be combined with existing graph CL models in a plug-and-play manner with little extra computational overhead. Extensive experiments demonstrate that HomoGCL yields multiple state-of-the-art results across six public datasets and consistently brings notable performance improvements when applied to various graph CL methods. Code is available at https://github.com/wenzhilics/HomoGCL.
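For intuition, the sketch below shows how a positive set can be expanded with weighted neighbors inside an InfoNCE-style objective. This is a minimal illustration, not the authors' implementation (see the linked repository for that): the function name and the `neighbor_weight` tensor are hypothetical stand-ins for however the abstract's neighbor-specific significances are actually estimated, and a dense adjacency matrix is assumed for clarity.

```python
import torch
import torch.nn.functional as F


def homophily_weighted_nce(z1, z2, adj, neighbor_weight, tau=0.5):
    """InfoNCE-style loss with a neighbor-expanded positive set (sketch).

    z1, z2          -- (N, d) node embeddings from two augmented views
    adj             -- (N, N) dense 0/1 adjacency matrix (small graphs only)
    neighbor_weight -- (N, N) per-edge significance scores in [0, 1]
                       (placeholder for a model-derived estimate)
    tau             -- softmax temperature
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)

    # Cross-view similarity of every node pair, scaled by temperature.
    sim = torch.exp(z1 @ z2.t() / tau)  # (N, N)

    # Standard positive: the anchor's own counterpart in the other view.
    pos = sim.diagonal()

    # Expanded positives: the anchor's neighbors in the other view,
    # each contributing according to its significance weight.
    pos = pos + (adj * neighbor_weight * sim).sum(dim=1)

    # All cross-view pairs serve as the contrastive denominator.
    return -torch.log(pos / sim.sum(dim=1)).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    n, d = 8, 16
    z1, z2 = torch.randn(n, d), torch.randn(n, d)
    adj = (torch.rand(n, n) < 0.3).float()
    adj.fill_diagonal_(0)
    weights = torch.rand(n, n)  # placeholder significances
    print(homophily_weighted_nce(z1, z2, adj, weights).item())
```

A practical implementation would typically also symmetrize the loss across the two views and may include within-view pairs in the denominator; those choices are orthogonal to the positive-set expansion illustrated here.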


