Bootstrapped Representation Learning on Graphs

02/12/2021
by   Shantanu Thakoor, et al.

Current state-of-the-art self-supervised learning methods for graph neural networks (GNNs) are based on contrastive learning. As such, they heavily depend on the construction of augmentations and negative examples. For example, on the standard PPI benchmark, increasing the number of negative pairs improves performance, thereby requiring computation and memory cost quadratic in the number of nodes to achieve peak performance. Inspired by BYOL, a recently introduced method for self-supervised learning that does not require negative pairs, we present Bootstrapped Graph Latents (BGRL), a self-supervised graph representation method that gets rid of this potentially quadratic bottleneck. BGRL outperforms or matches the previous unsupervised state-of-the-art results on several established benchmark datasets. Moreover, it enables the effective usage of graph attention network (GAT) encoders, allowing us to further improve the state of the art. In particular, on the PPI dataset, using GAT as an encoder we achieve a state-of-the-art score of 70.49%. On all other datasets under consideration, our model is competitive with the equivalent supervised GNN results, often exceeding them.
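The core idea can be illustrated with a minimal NumPy sketch of the BYOL-style bootstrapping objective the abstract describes: an online network's prediction is pulled toward a target network's embedding of another view of the same graph, and the target's parameters track the online ones via an exponential moving average rather than gradient descent, so no negative pairs (and no quadratic cost in the number of nodes) are required. The function names and the simplified per-node loss below are illustrative, not the paper's exact implementation:

```python
import numpy as np

def bgrl_loss(online_pred, target_emb):
    """BYOL/BGRL-style loss: negative cosine similarity between the online
    network's prediction and the target network's embedding, averaged over
    nodes. Equals 0 when the (normalized) representations coincide."""
    p = online_pred / np.linalg.norm(online_pred, axis=1, keepdims=True)
    z = target_emb / np.linalg.norm(target_emb, axis=1, keepdims=True)
    return 2.0 - 2.0 * (p * z).sum(axis=1).mean()

def ema_update(target_params, online_params, tau=0.99):
    """Target parameters are an exponential moving average of the online
    parameters -- they receive no gradients, which is what removes the
    need for negative examples."""
    return [tau * t + (1.0 - tau) * o
            for t, o in zip(target_params, online_params)]
```

Note that only the online network (and its predictor head) is trained by backpropagation; the cost of the loss is linear in the number of nodes, in contrast to contrastive objectives that compare all node pairs.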


