Automated Self-Supervised Learning for Graphs

06/10/2021
by Wei Jin, et al.

Graph self-supervised learning has gained increasing attention due to its capacity to learn expressive node representations. Many pretext tasks, or loss functions, have been designed from distinct perspectives. However, we observe that different pretext tasks affect downstream tasks differently across datasets, which suggests that searching over pretext tasks is crucial for graph self-supervised learning. Unlike existing works that focus on designing individual pretext tasks, this work investigates how to automatically leverage multiple pretext tasks effectively. Nevertheless, evaluating representations derived from multiple pretext tasks without direct access to ground-truth labels makes this problem challenging. To address this obstacle, we make use of a key principle of many real-world graphs, i.e., homophily, or the principle that “like attracts like,” as guidance to effectively search over various self-supervised pretext tasks. We provide theoretical understanding and empirical evidence to justify the flexibility of homophily in this search task. We then propose the AutoSSL framework, which can automatically search over combinations of various self-supervised tasks. Evaluating the framework on 7 real-world datasets, we show that AutoSSL can significantly boost performance on downstream tasks, including node clustering and node classification, compared with training under individual tasks. Code will be released at https://github.com/ChandlerBang/AutoSSL.
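At a high level, the approach described in the abstract combines the losses of several pretext tasks with task weights and uses pseudo-homophily, i.e., the homophily of the graph computed over cluster assignments of the learned embeddings, as a label-free quality signal for the search. The sketch below illustrates this idea under stated assumptions: the function names (`pseudo_homophily`, `combined_pretext_loss`, `search_task_weights`), the k-means clustering step, the Dirichlet-sampled random search, and the `train_fn`/`embed_fn` training hooks are illustrative choices, not the paper's exact implementation (the paper explores more sophisticated search strategies such as evolutionary and gradient-based optimization).

```python
import numpy as np
from sklearn.cluster import KMeans

def pseudo_homophily(embeddings, edge_index, n_clusters=8):
    """Cluster node embeddings into pseudo-labels with k-means, then
    return the fraction of edges whose endpoints share a pseudo-label.
    Higher values indicate embeddings that better respect homophily."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(embeddings)
    src, dst = edge_index  # edge_index: (2, num_edges) array of node ids
    return float(np.mean(labels[src] == labels[dst]))

def combined_pretext_loss(task_losses, weights):
    """Weighted sum of the individual pretext-task losses."""
    return sum(w * l for w, l in zip(weights, task_losses))

def search_task_weights(train_fn, embed_fn, edge_index, n_tasks,
                        n_trials=20, seed=0):
    """Random search over task-weight combinations (illustrative only):
    sample weights on the simplex, train with each combination, and keep
    the weights whose embeddings score highest on pseudo-homophily."""
    rng = np.random.default_rng(seed)
    best_w, best_h = None, -1.0
    for _ in range(n_trials):
        w = rng.dirichlet(np.ones(n_tasks))  # candidate task weights
        model = train_fn(w)                  # hypothetical training hook
        h = pseudo_homophily(embed_fn(model), edge_index)
        if h > best_h:
            best_w, best_h = w, h
    return best_w, best_h

if __name__ == "__main__":
    # Smoke test of the metric on synthetic data.
    emb = np.random.randn(200, 16)
    edges = np.random.randint(0, 200, size=(2, 800))
    print(f"pseudo-homophily: {pseudo_homophily(emb, edges):.3f}")
```

In a full pipeline, `train_fn` would optimize a shared GNN encoder under the weighted loss from `combined_pretext_loss`, and `embed_fn` would return that encoder's node embeddings for evaluation.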
