Self-supervised Heterogeneous Graph Pre-training Based on Structural Clustering

10/19/2022
by   Yaming Yang, et al.

Recent self-supervised pre-training methods on Heterogeneous Information Networks (HINs) have shown promising competitiveness over traditional semi-supervised Heterogeneous Graph Neural Networks (HGNNs). Unfortunately, their performance depends heavily on the careful customization of various strategies for generating high-quality positive and negative examples, which notably limits their flexibility and generalization ability. In this work, we present SHGP, a novel Self-supervised Heterogeneous Graph Pre-training approach, which does not need to generate any positive or negative examples. It consists of two modules that share the same attention-aggregation scheme. In each iteration, the Att-LPA module produces pseudo-labels through structural clustering, which serve as self-supervision signals to guide the Att-HGNN module in learning object embeddings and attention coefficients. The two modules effectively utilize and enhance each other, enabling the model to learn discriminative embeddings. Extensive experiments on four real-world datasets demonstrate the superior effectiveness of SHGP against state-of-the-art unsupervised baselines and even semi-supervised baselines. Our source code is available at: https://github.com/kepsail/SHGP.
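To make the alternating scheme concrete, below is a minimal, hypothetical PyTorch sketch of the training loop the abstract describes. It is not the authors' implementation (that lives in the linked repository): the names (`AttAggregator`, `att_lpa`, `segment_softmax`), the random toy graph, and the simplification to a homogeneous edge list are all assumptions made for illustration, whereas SHGP operates on heterogeneous graphs with multiple object types.

```python
# Hypothetical, simplified sketch of SHGP's two-module loop; illustrative only.
# The authors' actual code is at https://github.com/kepsail/SHGP.
import torch
import torch.nn as nn
import torch.nn.functional as F

def segment_softmax(scores, dst, num_nodes):
    """Softmax of edge scores over each destination node's incoming edges."""
    scores = scores - scores.max()  # numerical stability
    exp = scores.exp()
    denom = torch.zeros(num_nodes, device=scores.device).index_add_(0, dst, exp)
    return exp / denom[dst].clamp(min=1e-12)

class AttAggregator(nn.Module):
    """Shared attention-aggregation scheme used by both Att-HGNN and Att-LPA."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.att = nn.Parameter(torch.randn(2 * out_dim) * 0.1)

    def forward(self, x, edge_index):
        src, dst = edge_index                                # (2, E) COO edges
        h = self.lin(x)
        e = F.leaky_relu((torch.cat([h[src], h[dst]], -1) * self.att).sum(-1))
        alpha = segment_softmax(e, dst, x.size(0))           # attention coefficients
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return out, alpha

def att_lpa(alpha, edge_index, num_nodes, n_clusters, steps=5):
    """Att-LPA stand-in: attention-weighted label propagation.
    Hard re-assignment at each step yields a structural clustering,
    whose cluster ids serve as pseudo-labels."""
    src, dst = edge_index
    y = F.one_hot(torch.randint(n_clusters, (num_nodes,)), n_clusters).float()
    for _ in range(steps):
        agg = torch.zeros_like(y).index_add_(0, dst, alpha.unsqueeze(-1) * y[src])
        y = F.one_hot(agg.argmax(-1), n_clusters).float()
    return y.argmax(-1)

# Toy training loop on a random graph (all sizes are assumptions).
N, E, D, C = 200, 1000, 32, 8
x = torch.randn(N, D)
edge_index = torch.randint(N, (2, E))

encoder = AttAggregator(D, 64)
head = nn.Linear(64, C)
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=5e-3)

for epoch in range(20):
    z, alpha = encoder(x, edge_index)        # Att-HGNN: embeddings + attention
    with torch.no_grad():                    # Att-LPA: pseudo-labels, no gradients
        pseudo = att_lpa(alpha, edge_index, N, C)
    loss = F.cross_entropy(head(z), pseudo)  # pseudo-labels as self-supervision
    opt.zero_grad(); loss.backward(); opt.step()
```

Note how the attention coefficients produced by the shared aggregator drive both the embedding step and the label-propagation step; this is the sense in which the two modules "share the same attention-aggregation scheme" and can mutually reinforce each other across iterations.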


