MA-GCL: Model Augmentation Tricks for Graph Contrastive Learning

12/14/2022
by Xumeng Gong, et al.

Contrastive learning (CL), which extracts the information shared between different contrastive views, has become a popular paradigm for vision representation learning. Inspired by its success in computer vision, recent work has introduced CL into graph modeling, dubbed graph contrastive learning (GCL). However, generating contrastive views for graphs is more challenging than for images, since we have little prior knowledge about how to significantly augment a graph without changing its labels. We argue that typical data augmentation techniques in GCL (e.g., edge dropping) cannot generate sufficiently diverse contrastive views to filter out noise. Moreover, previous GCL methods employ two view encoders with exactly the same neural architecture and tied parameters, which further harms the diversity of augmented views. To address these limitations, we propose a novel paradigm named model-augmented GCL (MA-GCL), which focuses on manipulating the architectures of the view encoders instead of perturbing the graph inputs. Specifically, we present three easy-to-implement model augmentation tricks for GCL, namely asymmetric, random, and shuffling, which respectively help alleviate high-frequency noise, enrich training instances, and provide safer augmentations. All three tricks are compatible with typical data augmentations. Experimental results show that MA-GCL achieves state-of-the-art performance on node classification benchmarks by applying the three tricks to a simple base model. Extensive studies also validate our motivation and the effectiveness of each trick. (Code, data and appendix are available at https://github.com/GXM1141/MA-GCL.)
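The core idea of model augmentation can be illustrated with a toy sketch. Below is a minimal, hedged illustration (not the authors' implementation) of the three tricks, assuming each view encoder is represented as a sequence of propagation (`'P'`) and transformation (`'T'`) operators, as in GNNs that decouple neighbor aggregation from feature transformation; all function and parameter names here are hypothetical.

```python
import random

def asymmetric(depth1, depth2, n_trans=1):
    """Asymmetric: the two view encoders use different propagation depths."""
    return (["P"] * depth1 + ["T"] * n_trans,
            ["P"] * depth2 + ["T"] * n_trans)

def randomized(max_depth, n_trans=1, rng=random):
    """Random: re-sample each encoder's propagation depth, e.g. once per epoch,
    so every epoch effectively trains a different encoder pair."""
    return (["P"] * rng.randint(1, max_depth) + ["T"] * n_trans,
            ["P"] * rng.randint(1, max_depth) + ["T"] * n_trans)

def shuffled(depth, n_trans=1, rng=random):
    """Shuffling: permute the operator order of one encoder, perturbing the
    architecture while keeping the set of operators (and parameters) fixed."""
    ops = ["P"] * depth + ["T"] * n_trans
    view1, view2 = list(ops), list(ops)
    rng.shuffle(view2)
    return view1, view2
```

For example, `asymmetric(2, 3)` yields the operator sequences `['P','P','T']` and `['P','P','P','T']` for the two views; because only the architecture is manipulated, all three tricks can be freely combined with input-level augmentations such as edge dropping.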

