Evaluating Modules in Graph Contrastive Learning

06/15/2021
by Ganqu Cui, et al.

The recent emergence of contrastive learning approaches has facilitated research on graph representation learning (GRL), introducing graph contrastive learning (GCL) into the literature. These methods contrast semantically similar and dissimilar sample pairs to encode the semantics into node or graph embeddings. However, most existing works perform only model-level evaluation and do not explore the combination space of modules, which would enable more comprehensive and systematic studies. For effective module-level evaluation, we propose a framework that decomposes GCL models into four modules: (1) a sampler to generate anchor, positive, and negative data samples (nodes or graphs); (2) an encoder and a readout function to compute sample embeddings; (3) a discriminator to score each sample pair (anchor-positive and anchor-negative); and (4) an estimator to define the loss function. Based on this framework, we conduct controlled experiments over a wide range of architectural designs and hyperparameter settings on node and graph classification tasks. Specifically, we quantify the impact of a single module, investigate the interaction between modules, and compare the overall performance with current model architectures. Our key findings include a set of module-level guidelines for GCL, e.g., simple samplers from LINE and DeepWalk are strong and robust, and an MLP encoder combined with Sum readout can achieve competitive performance on graph classification. Finally, we release our implementations and results as OpenGCL, a modularized toolkit that allows convenient reproduction, standard model and module evaluation, and easy extension.
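The four-module decomposition can be illustrated with a minimal sketch. This is not the OpenGCL API; all function names, the toy graph, and the choice of a single linear encoder, cosine-similarity discriminator, and InfoNCE-style estimator are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sampler(adj, anchor):
    # Module 1 (sampler): for a given anchor node, pick one neighbor as the
    # positive sample and all non-neighbors as negatives (assumes the anchor
    # has at least one neighbor in this toy graph).
    positives = np.flatnonzero(adj[anchor])
    negatives = np.flatnonzero(adj[anchor] == 0)
    negatives = negatives[negatives != anchor]
    return positives[0], negatives

def encoder(features, weights):
    # Module 2 (encoder): a single linear layer standing in for an MLP/GNN.
    # For node-level tasks no readout is needed; a graph-level variant would
    # add e.g. a Sum readout over node embeddings.
    return features @ weights

def discriminator(z1, z2):
    # Module 3 (discriminator): cosine similarity between two embeddings.
    return float(z1 @ z2 / (np.linalg.norm(z1) * np.linalg.norm(z2) + 1e-8))

def estimator(pos_score, neg_scores, tau=0.5):
    # Module 4 (estimator): InfoNCE-style loss over one positive score
    # and several negative scores, with temperature tau.
    logits = np.array([pos_score] + list(neg_scores)) / tau
    logits -= logits.max()  # numerical stability
    return float(-np.log(np.exp(logits[0]) / np.exp(logits).sum()))

# Toy usage: a 4-node graph with two disjoint edges, one-hot node features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 0, 0],
                [0, 0, 0, 1],
                [0, 0, 1, 0]])
features = np.eye(4)
weights = np.eye(4)  # identity "encoder" weights for the sketch

anchor = 0
pos, negs = sampler(adj, anchor)
z = encoder(features, weights)
pos_score = discriminator(z[anchor], z[pos])
neg_scores = [discriminator(z[anchor], z[n]) for n in negs]
loss = estimator(pos_score, neg_scores)
```

Swapping any one of the four functions (e.g. a random-walk sampler in the style of DeepWalk, or a bilinear discriminator) while holding the others fixed is exactly the kind of controlled module-level comparison the framework is built for.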


Related research

- ARIEL: Adversarial Graph Contrastive Learning (08/15/2022): Contrastive learning is an effective unsupervised method in graph repres...
- Graph-MVP: Multi-View Prototypical Contrastive Learning for Multiplex Graphs (09/08/2021): Contrastive Learning (CL) is one of the most popular self-supervised lea...
- GraphCoCo: Graph Complementary Contrastive Learning (03/24/2022): Graph Contrastive Learning (GCL) has shown promising performance in grap...
- Enhancing Graph Contrastive Learning with Node Similarity (08/13/2022): Graph Neural Networks (GNNs) have achieved great success in learning gra...
- Positive-Negative Equal Contrastive Loss for Semantic Segmentation (07/04/2022): The contextual information is critical for various computer vision tasks...
- Entropy Neural Estimation for Graph Contrastive Learning (07/26/2023): Contrastive learning on graphs aims at extracting distinguishable high-l...
- Contrastive Disentangled Learning on Graph for Node Classification (06/20/2023): Contrastive learning methods have attracted considerable attention due t...
