Self-Supervised Representation Learning via Latent Graph Prediction

02/16/2022
by Yaochen Xie, et al.

Self-supervised learning (SSL) of graph neural networks is emerging as a promising way of leveraging unlabeled data. Currently, most methods are based on contrastive learning adapted from the image domain, which requires view generation and a sufficient number of negative samples. In contrast, existing predictive models do not require negative sampling but lack theoretical guidance on the design of pretext training tasks. In this work, we propose LaGraph, a theoretically grounded predictive SSL framework based on latent graph prediction. The learning objectives of LaGraph are derived as self-supervised upper bounds on the objectives for predicting unobserved latent graphs. In addition to its improved performance, LaGraph provides explanations for the recent successes of predictive models that include invariance-based objectives. We provide theoretical analysis comparing LaGraph to related methods in different domains. Our experimental results demonstrate the superiority of LaGraph in performance and its robustness to decreasing training sample sizes on both graph-level and node-level tasks.
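To make the predictive setup described above concrete, below is a minimal sketch in plain PyTorch of a latent-graph-prediction-style objective: node features are reconstructed from the observed graph, and the representations of masked nodes are encouraged to stay invariant between the clean view and a feature-masked view. All names here (SimpleGCN, predictive_ssl_loss, mask_rate, lam) are illustrative assumptions, not the paper's reference implementation, whose derivation and architecture differ in detail.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGCN(nn.Module):
    """One-layer graph convolution: H = act(A_hat X W)."""

    def __init__(self, in_dim, out_dim, activation=True):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.activation = activation

    def forward(self, a_hat, x):
        h = a_hat @ self.lin(x)
        return F.relu(h) if self.activation else h


def normalized_adj(adj):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)


def predictive_ssl_loss(encoder, decoder, adj, x, mask_rate=0.3, lam=1.0):
    """Feature reconstruction on the observed graph plus an invariance term
    between representations of the original and a feature-masked view,
    evaluated on the masked nodes only (an illustrative assumption)."""
    a_hat = normalized_adj(adj)

    # Encode the observed graph and reconstruct its node features.
    h = encoder(a_hat, x)
    rec_loss = F.mse_loss(decoder(a_hat, h), x)

    # Build a masked view by zeroing the features of a random subset of nodes.
    mask = torch.rand(x.size(0)) < mask_rate
    x_masked = x.clone()
    x_masked[mask] = 0.0
    h_masked = encoder(a_hat, x_masked)

    # Invariance: masked-node representations should match those of the clean view.
    if mask.any():
        inv_loss = F.mse_loss(h_masked[mask], h[mask].detach())
    else:
        inv_loss = x.new_zeros(())

    return rec_loss + lam * inv_loss


if __name__ == "__main__":
    num_nodes, feature_dim, hidden_dim = 6, 8, 16
    adj = (torch.rand(num_nodes, num_nodes) > 0.6).float()
    adj = ((adj + adj.t()) > 0).float()  # symmetrize the toy graph

    x = torch.randn(num_nodes, feature_dim)
    encoder = SimpleGCN(feature_dim, hidden_dim)
    decoder = SimpleGCN(hidden_dim, feature_dim, activation=False)

    loss = predictive_ssl_loss(encoder, decoder, adj, x)
    loss.backward()
    print(f"toy predictive SSL loss: {loss.item():.4f}")
```

The invariance term restricted to masked nodes illustrates the kind of component the abstract refers to when it attributes the recent successes of predictive models to invariance-based objectives; it requires no negative samples or view generation beyond the masking itself.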

Related research

06/26/2022 · Latent Augmentation For Better Graph Self-Supervised Learning
Graph self-supervised learning has been vastly employed to learn represe...

06/07/2022 · Decoupled Self-supervised Learning for Non-Homophilous Graphs
In this paper, we study the problem of conducting self-supervised learni...

09/17/2020 · AAG: Self-Supervised Representation Learning by Auxiliary Augmentation with GNT-Xent Loss
Self-supervised representation learning is an emerging research topic fo...

07/18/2023 · MOCA: Self-supervised Representation Learning by Predicting Masked Online Codebook Assignments
Self-supervised learning can be used for mitigating the greedy needs of ...

11/15/2018 · SGR: Self-Supervised Spectral Graph Representation Learning
Representing a graph as a vector is a challenging task; ideally, the rep...

05/30/2023 · A Graph is Worth 1-bit Spikes: When Graph Contrastive Learning Meets Spiking Neural Networks
While contrastive self-supervised learning has become the de-facto learn...

08/30/2022 · A Self-supervised Riemannian GNN with Time Varying Curvature for Temporal Graph Learning
Representation learning on temporal graphs has drawn considerable resear...
