MentorGNN: Deriving Curriculum for Pre-Training GNNs

08/21/2022
by Dawei Zhou, et al.

Graph pre-training strategies have attracted a surge of attention in the graph mining community because of their ability to parameterize graph neural networks (GNNs) without any label information. The key idea is to encode valuable information into the backbone GNNs by predicting masked graph signals extracted from the input graphs. To balance the importance of diverse graph signals (e.g., nodes, edges, subgraphs), existing approaches are mostly hand-engineered, introducing hyperparameters that re-weight the importance of each signal. However, human intervention with sub-optimal hyperparameters often injects additional bias and deteriorates generalization performance in downstream applications. This paper addresses these limitations from a new perspective: deriving a curriculum for pre-training GNNs. We propose an end-to-end model named MentorGNN that supervises the pre-training process of GNNs across graphs with diverse structures and disparate feature spaces. To comprehend heterogeneous graph signals at different granularities, we propose a curriculum learning paradigm that automatically re-weights graph signals to ensure good generalization in the target domain. Moreover, we shed new light on domain adaptation for relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs. Extensive experiments on a wealth of real graphs validate the effectiveness of MentorGNN.
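To make the curriculum idea concrete, here is a minimal sketch of how per-signal losses from masked-signal pre-training could be re-weighted automatically over the course of training, emphasizing easier (lower-loss) signals early on and converging toward uniform weighting. This is an illustrative schedule of our own devising, not the actual MentorGNN algorithm; the function names and the softmax-based schedule are assumptions for exposition.

```python
import numpy as np

def curriculum_weights(losses, step, total_steps, temperature=1.0):
    """Hypothetical curriculum re-weighting for pre-training losses.

    Early in training (step near 0), a sharp softmax over negative losses
    concentrates weight on easy (low-loss) graph signals; as training
    progresses, the distribution anneals toward uniform weighting.
    """
    losses = np.asarray(losses, dtype=float)
    progress = step / total_steps            # 0.0 at start, 1.0 at end
    sharpness = (1.0 - progress) / temperature
    logits = -sharpness * losses             # lower loss -> higher logit
    logits -= logits.max()                   # numerical stability
    w = np.exp(logits)
    return w / w.sum()

def weighted_pretrain_loss(losses, step, total_steps):
    """Scalar pre-training objective: curriculum-weighted sum of signal losses."""
    w = curriculum_weights(losses, step, total_steps)
    return float(np.dot(w, losses))
```

For example, with per-signal losses `[0.1, 2.0]` the easy signal dominates at `step=0`, while at `step=total_steps` both signals receive equal weight, mimicking a hand-off from easy to all signals without manually tuned per-signal hyperparameters.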


