Self-Supervised Node Representation Learning via Node-to-Neighbourhood Alignment

02/09/2023
by Wei Dong, et al.

Self-supervised node representation learning aims to learn node representations from unlabelled graphs that rival their supervised counterparts. The key to learning informative node representations lies in how to effectively gain contextual information from the graph structure. In this work, we present a simple yet effective self-supervised approach that aligns the hidden representations of nodes with those of their neighbourhood. Our first idea achieves this node-to-neighbourhood alignment by directly maximizing the mutual information between the two representations, which, as we prove theoretically, plays the role of graph smoothing. The framework is optimized via a surrogate contrastive loss, and a Topology-Aware Positive Sampling (TAPS) strategy is proposed to sample positives according to the structural dependencies between nodes, enabling offline positive selection. Given the excessive memory overhead of contrastive learning, we further propose a negative-free solution whose main contribution is a Graph Signal Decorrelation (GSD) constraint that avoids representation collapse and over-smoothing. The GSD constraint unifies several existing constraints and can be used to derive new implementations that combat representation collapse. Applying our methods on top of simple MLP-based node representation encoders, we learn node representations that achieve promising node classification performance on graph-structured datasets ranging from small to large scale.
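To make the two mechanisms described in the abstract concrete, the snippet below is a minimal, self-contained sketch rather than the authors' released code: a node-to-neighbourhood contrastive alignment loss (InfoNCE-style, with a node's mean-aggregated neighbourhood as its positive) and one possible covariance-decorrelation penalty in the spirit of the negative-free GSD constraint. The MLP encoder, the mean-neighbourhood aggregation, the temperature value, and the exact form of the decorrelation term are assumptions made for illustration.

```python
# Minimal sketch of node-to-neighbourhood alignment (assumptions noted above).
import torch
import torch.nn.functional as F


def neighbourhood_mean(h, adj):
    """Aggregate neighbour representations with a degree-normalized adjacency matrix."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    return (adj @ h) / deg


def node_to_neighbourhood_nce(h, adj, temperature=0.5):
    """Contrastive surrogate: each node is pulled towards its own neighbourhood
    summary (positive) and pushed away from other nodes' summaries (negatives)."""
    z = F.normalize(h, dim=1)                             # node representations
    n = F.normalize(neighbourhood_mean(h, adj), dim=1)    # neighbourhood representations
    logits = z @ n.t() / temperature                      # pairwise similarities
    labels = torch.arange(z.size(0), device=z.device)     # positive index = own neighbourhood
    return F.cross_entropy(logits, labels)


def covariance_decorrelation(h):
    """One possible negative-free regularizer (an assumption, not necessarily the paper's
    exact GSD constraint): penalize off-diagonal entries of the feature covariance so that
    representation dimensions stay decorrelated and do not collapse."""
    z = h - h.mean(dim=0)
    cov = (z.t() @ z) / (z.size(0) - 1)
    off_diag = cov - torch.diag(torch.diag(cov))
    return (off_diag ** 2).sum() / h.size(1)


# Usage on a toy graph with a simple 2-layer MLP encoder.
num_nodes, feat_dim, hid_dim = 8, 16, 32
x = torch.randn(num_nodes, feat_dim)
adj = (torch.rand(num_nodes, num_nodes) > 0.7).float()
adj = ((adj + adj.t()) > 0).float()                       # symmetrize the toy adjacency
encoder = torch.nn.Sequential(
    torch.nn.Linear(feat_dim, hid_dim), torch.nn.ReLU(), torch.nn.Linear(hid_dim, hid_dim)
)
h = encoder(x)
loss = node_to_neighbourhood_nce(h, adj) + 0.1 * covariance_decorrelation(h)
loss.backward()
```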

Related research

03/23/2022
Node Representation Learning in Graph via Node-to-Neighbourhood Mutual Information Maximization
The key towards learning informative node representations in graphs lies...

12/16/2021
Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast
Self-supervised learning on graphs has recently drawn a lot of attention...

10/17/2022
Unifying Graph Contrastive Learning with Flexible Contextual Scopes
Graph contrastive learning (GCL) has recently emerged as an effective le...

06/03/2022
Rethinking and Scaling Up Graph Contrastive Learning: An Extremely Efficient Approach with Group Discrimination
Graph contrastive learning (GCL) alleviates the heavy reliance on label ...

03/03/2020
Self-Supervised Graph Representation Learning via Global Context Prediction
To take full advantage of fast-growing unlabeled networked data, this pa...

12/08/2022
Alleviating neighbor bias: augmenting graph self-supervise learning with structural equivalent positive samples
In recent years, using a self-supervised learning framework to learn the...

06/09/2023
Intensity Profile Projection: A Framework for Continuous-Time Representation Learning for Dynamic Networks
We present a new algorithmic framework, Intensity Profile Projection, fo...