From Canonical Correlation Analysis to Self-supervised Graph Neural Networks

06/23/2021
by Hengrui Zhang et al.

We introduce a conceptually simple yet effective model for self-supervised representation learning on graph data. Like previous methods, it generates two views of an input graph through data augmentation. However, unlike contrastive methods that focus on instance-level discrimination, we optimize an innovative feature-level objective inspired by classical Canonical Correlation Analysis. Compared with other works, our approach requires no parameterized mutual information estimator, no additional projector, no asymmetric architecture, and, most importantly, no negative samples, which can be costly to obtain. We show that the new objective essentially 1) discards augmentation-variant information by learning invariant representations, and 2) prevents degenerate solutions by decorrelating features across dimensions. Our theoretical analysis further shows that the objective can be viewed as an instantiation of the Information Bottleneck principle in the self-supervised setting. Despite its simplicity, our method performs competitively on seven public graph datasets.
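The two components the abstract describes, an invariance term that aligns the two augmented views and a decorrelation term that pushes each view's feature covariance toward the identity, can be sketched as a single loss. This is an illustrative reconstruction under assumed conventions (batch-standardized embeddings, a weighting factor `lam`), not the authors' exact implementation:

```python
import numpy as np

def cca_ssg_loss(z_a, z_b, lam=1e-3):
    """Sketch of a feature-level, CCA-style self-supervised objective.

    z_a, z_b: (N, D) arrays of embeddings from two augmented views.
    lam: assumed trade-off weight between the two terms.
    """
    n, d = z_a.shape
    # Standardize each feature dimension across the batch.
    z_a = (z_a - z_a.mean(axis=0)) / z_a.std(axis=0)
    z_b = (z_b - z_b.mean(axis=0)) / z_b.std(axis=0)
    # Invariance term: pull the two views' representations together,
    # discarding augmentation-variant information.
    inv = np.sum((z_a - z_b) ** 2)
    # Decorrelation term: drive each view's feature covariance toward
    # identity, preventing degenerate (collapsed) solutions.
    eye = np.eye(d)
    dec = (np.sum(((z_a.T @ z_a) / n - eye) ** 2)
           + np.sum(((z_b.T @ z_b) / n - eye) ** 2))
    return inv + lam * dec
```

Because only feature dimensions are compared, no negative samples, projector head, or mutual information estimator appears anywhere in the objective.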


Related research

03/15/2023 · RGI: Regularized Graph Infomax for self-supervised learning on graphs
Self-supervised learning is gaining considerable attention as a solution...

01/11/2022 · Bootstrapping Informative Graph Augmentation via A Meta Learning Approach
Recent works explore learning graph representations in a self-supervised...

11/10/2020 · Self-supervised Graph Representation Learning via Bootstrapping
Graph neural networks (GNNs) apply deep learning techniques to graph-str...

10/15/2022 · Augmentation-Free Graph Contrastive Learning of Invariant-Discriminative Representations
The pretasks are mainly built on mutual information estimation, which re...

05/22/2022 · GraphMAE: Self-Supervised Masked Graph Autoencoders
Self-supervised learning (SSL) has been extensively explored in recent y...

09/01/2022 · Self-supervised Representation Learning on Electronic Health Records with Graph Kernel Infomax
Learning Electronic Health Records (EHRs) representation is a preeminent...

06/03/2022 · Rethinking and Scaling Up Graph Contrastive Learning: An Extremely Efficient Approach with Group Discrimination
Graph contrastive learning (GCL) alleviates the heavy reliance on label ...
