Unifying Graph Contrastive Learning with Flexible Contextual Scopes

10/17/2022
by Yizhen Zheng, et al.

Graph contrastive learning (GCL) has recently emerged as an effective learning paradigm that alleviates the reliance on labelling information for graph representation learning. The core of GCL is to maximise the mutual information between the representation of a node and its contextual representation (i.e., the corresponding instance with similar semantic information) summarised from a contextual scope (e.g., the whole graph or the 1-hop neighbourhood). This scheme distils valuable self-supervision signals for GCL training. However, existing GCL methods still suffer from limitations, such as the difficulty of choosing a suitable contextual scope for different datasets and the construction of biased contrastiveness. To address the aforementioned problems, we present a simple self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short). Our algorithm builds flexible contextual representations with tunable contextual scopes by controlling the power of an adjacency matrix. Additionally, our method ensures that contrastiveness is built within connected components, reducing the bias of contextual representations. Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning. Essentially, the architecture of UGCL can be considered a general framework that unifies existing GCL methods. We have conducted extensive experiments and achieved new state-of-the-art performance on six out of eight benchmark datasets compared with self-supervised graph representation learning baselines. Our code has been open-sourced.
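The abstract's central idea, tuning the contextual scope by controlling the power of an adjacency matrix, can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the authors' implementation: it symmetrically normalises the adjacency matrix and propagates node features through k hops, so k = 1 yields a near-local (1-hop) summary while a large k approaches a whole-graph summary.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalize adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def contextual_representation(A, X, k):
    """Contextual summary at scope k: propagate features k times, i.e. A_norm^k @ X.

    Small k -> local neighbourhood scope; large k -> graph-level scope.
    (Illustrative only; UGCL's exact propagation may differ.)
    """
    A_norm = normalized_adjacency(A)
    H = X
    for _ in range(k):
        H = A_norm @ H
    return H

# Toy example: a path graph 0-1-2-3 with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.RandomState(0).randn(4, 2)

local = contextual_representation(A, X, k=1)   # near-local scope
broad = contextual_representation(A, X, k=8)   # broader scope
```

Because propagation never crosses between disconnected parts of the graph, a contextual summary built this way is naturally confined to a node's connected component, which matches the paper's stated goal of reducing bias in contextual representations.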

Related research

- 09/22/2020: Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning
  Graph representation learning has attracted lots of attention recently. ...
- 02/09/2023: Self-Supervised Node Representation Learning via Node-to-Neighbourhood Alignment
  Self-supervised node representation learning aims to learn node represen...
- 03/23/2022: Node Representation Learning in Graph via Node-to-Neighbourhood Mutual Information Maximization
  The key towards learning informative node representations in graphs lies...
- 08/31/2023: Contrastive Representation Learning Based on Multiple Node-centered Subgraphs
  As the basic element of graph-structured data, node has been recognized ...
- 06/03/2022: Rethinking and Scaling Up Graph Contrastive Learning: An Extremely Efficient Approach with Group Discrimination
  Graph contrastive learning (GCL) alleviates the heavy reliance on label ...
- 03/09/2023: Distortion-Disentangled Contrastive Learning
  Self-supervised learning is well known for its remarkable performance in...
- 06/17/2022: MET: Masked Encoding for Tabular Data
  We consider the task of self-supervised representation learning (SSL) fo...
