Enhancing Graph Contrastive Learning with Node Similarity

08/13/2022
by Hongliang Chi, et al.

Graph Neural Networks (GNNs) have achieved great success in learning graph representations and thus facilitate various graph-related tasks. However, most GNN methods adopt a supervised learning setting, which is not always feasible in real-world applications because labeled data can be difficult to obtain. Hence, graph self-supervised learning has been attracting increasing attention. Graph contrastive learning (GCL) is a representative framework for self-supervised learning. In general, GCL learns node representations by contrasting semantically similar nodes (positive samples) and dissimilar nodes (negative samples) against anchor nodes. Without access to labels, positive samples are typically generated by data augmentation, and negative samples are sampled uniformly from the entire graph, which leads to a sub-optimal objective. Specifically, data augmentation naturally limits the number of positive samples involved in the process (typically only one positive sample is adopted). On the other hand, random sampling inevitably selects false-negative samples (samples that share the same semantics as the anchor). These issues limit the learning capability of GCL. In this work, we propose an enhanced objective that addresses them. We first introduce an unachievable ideal objective that contains all positive samples and no false-negative samples. We then transform this ideal objective into a probabilistic form based on the distributions for sampling positive and negative samples, model these distributions with node similarity, and derive the enhanced objective. Comprehensive experiments on various datasets demonstrate the effectiveness of the proposed enhanced objective under different settings.
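The idea of weighting positives and negatives by node similarity can be sketched in a few lines of NumPy. The snippet below is a minimal illustration of an InfoNCE-style contrastive objective in which the sampling distributions are modeled with a node-similarity matrix `S`; the function names, the cosine-similarity choice, and the exact weighting scheme are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def cosine_sim(Z):
    """Pairwise cosine similarity between node embeddings Z (n x d)."""
    Zn = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    return Zn @ Zn.T

def similarity_weighted_infonce(Z, S, tau=0.5):
    """InfoNCE-style loss where a node-similarity matrix S (values in
    [0, 1], e.g. from graph structure or features) defines the positive
    and negative sampling distributions. Sketch only, not the paper's
    exact objective."""
    sim = np.exp(cosine_sim(Z) / tau)
    np.fill_diagonal(sim, 0.0)                 # exclude self-contrast

    S = S.copy()
    np.fill_diagonal(S, 0.0)
    pos_w = S / S.sum(axis=1, keepdims=True)   # similar nodes act as positives
    neg_w = 1.0 - S                            # dissimilar nodes as negatives
    np.fill_diagonal(neg_w, 0.0)
    neg_w = neg_w / neg_w.sum(axis=1, keepdims=True)

    pos = (pos_w * sim).sum(axis=1)            # expected positive score per anchor
    neg = (neg_w * sim).sum(axis=1)            # expected negative score per anchor
    return float(np.mean(-np.log(pos / (pos + neg))))
```

With uniform `S` this reduces to the standard setting of one augmentation-based positive against uniformly sampled negatives; skewing `S` toward semantically similar nodes is what lets the objective use many positives and down-weight likely false negatives.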

Related research:

- Synthetic Hard Negative Samples for Contrastive Learning (04/06/2023)
  Contrastive learning has emerged as an essential approach for self-super...

- GraFN: Semi-Supervised Node Classification on Graph with Few Labels via Non-Parametric Distribution Assignment (04/04/2022)
  Despite the success of Graph Neural Networks (GNNs) on various applicati...

- Hierarchical Contrastive Learning Enhanced Heterogeneous Graph Neural Network (04/24/2023)
  Heterogeneous graph neural networks (HGNNs) as an emerging technique hav...

- Non-contrastive approaches to similarity learning: positive examples are all you need (09/28/2022)
  The similarity learning problem in the oil & gas industry aims to constr...

- Evaluating Modules in Graph Contrastive Learning (06/15/2021)
  The recent emergence of contrastive learning approaches facilitates the ...

- Learning by Sorting: Self-supervised Learning with Group Ordering Constraints (01/05/2023)
  Contrastive learning has become a prominent ingredient in learning repre...

- Pseudo Contrastive Learning for Graph-based Semi-supervised Learning (02/19/2023)
  Pseudo Labeling is a technique used to improve the performance of semi-s...
