Contrastive Learning under Heterophily

03/11/2023
by Wenhan Yang, et al.

Graph Neural Networks are powerful tools for learning node representations when task-specific node labels are available. However, obtaining labels is expensive in many applications, particularly for large graphs. To address this, a body of work has studied learning node representations in a self-supervised manner, without labels. Contrastive learning (CL) has been a particularly popular approach. In general, CL methods work by maximizing the similarity between representations of augmented views of the same example and minimizing the similarity between augmented views of different examples. However, existing graph CL methods cannot learn high-quality representations under heterophily, where connected nodes tend to belong to different classes: under heterophily, augmentations of the same example may not be similar to each other. In this work, we address this problem by proposing HLCL, the first graph CL method for learning node representations under heterophily. HLCL uses a high-pass and a low-pass graph filter to generate different views of the same node, then contrasts the two filtered views to learn the final node representations. Effectively, the high-pass filter captures the dissimilarity between nodes in a neighborhood, while the low-pass filter captures the similarity between neighboring nodes. Contrasting the two filtered views allows HLCL to learn rich node representations for graphs under both heterophily and homophily. Empirically, HLCL outperforms state-of-the-art graph CL methods on benchmark heterophily datasets and large-scale real-world datasets by up to 10
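The low-pass/high-pass filtering idea described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function name `graph_filters` and the symmetric GCN-style normalization are assumptions chosen for clarity. A low-pass filter averages each node's features with its neighbors' (emphasizing similarity), while the complementary high-pass filter keeps the difference between a node and its neighborhood average (emphasizing dissimilarity).

```python
import numpy as np

def graph_filters(A, X):
    """Return low-pass and high-pass filtered node features.

    A: dense adjacency matrix, shape (n, n)
    X: node feature matrix, shape (n, d)
    Hypothetical helper illustrating the filtering step; not HLCL's exact filters.
    """
    n = A.shape[0]
    A_hat = A + np.eye(n)                        # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt     # symmetric normalization
    low = A_norm @ X                             # low-pass: neighborhood smoothing
    high = X - A_norm @ X                        # high-pass: (I - A_norm) X, neighborhood difference
    return low, high
```

Note that with this choice the two views are exact complements (`low + high == X`); in practice the two filtered views would each be passed through an encoder before the contrastive loss is applied.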

