Search to Fine-tune Pre-trained Graph Neural Networks for Graph-level Tasks

08/14/2023
by Zhili Wang, et al.

Recently, graph neural networks (GNNs) have shown unprecedented success in many graph-related tasks. However, like other neural networks, GNNs suffer from label scarcity. Recent efforts therefore pre-train GNNs on large-scale unlabeled graphs and transfer the learned knowledge to the target downstream task. This adaptation is generally achieved by fine-tuning the pre-trained GNN on a limited amount of labeled data. Despite the importance of fine-tuning, current GNN pre-training works often neglect the design of a good fine-tuning strategy that better leverages the transferred knowledge and improves downstream performance. Only a few works have begun to investigate better fine-tuning strategies for pre-trained GNNs, and their designs either rely on strong assumptions or overlook the data-aware nature of diverse downstream datasets. In this paper, we therefore aim to design a better fine-tuning strategy for pre-trained GNNs. Given a pre-trained GNN, we propose to search to fine-tune pre-trained graph neural networks for graph-level tasks (S2PGNN), which adaptively designs a suitable fine-tuning framework for the labeled data of the downstream task. To ensure that searching over fine-tuning strategies yields an improvement, we carefully summarize a search space of fine-tuning frameworks that is suitable for GNNs. Empirical studies show that S2PGNN can be implemented on top of 10 well-known pre-trained GNNs and consistently improves their performance. Moreover, S2PGNN outperforms existing fine-tuning strategies both within and outside the GNN area. Our code is publicly available at <https://anonymous.4open.science/r/code_icde2024-A9CB/>.
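
To make the core idea concrete, below is a minimal, self-contained sketch of searching over fine-tuning strategies for a pre-trained GNN and keeping the one that scores best on validation data. Everything here is an illustrative assumption, not the paper's actual method: the `ToyPretrainedGNN` stand-in, the tiny design space (readout choice, number of frozen layers, learning rate), and the brute-force enumeration are placeholders for S2PGNN's real search space and search algorithm.

```python
# Hypothetical sketch: pick a fine-tuning strategy for a pre-trained GNN by
# searching a small design space and selecting on validation accuracy.
# All names and the search space itself are illustrative, not S2PGNN's.
import itertools
import torch
import torch.nn as nn

torch.manual_seed(0)

class ToyPretrainedGNN(nn.Module):
    """Stand-in for a pre-trained GNN: dense message passing + graph readout."""
    def __init__(self, dim=16, num_layers=3, num_classes=2):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_layers))
        self.head = nn.Linear(dim, num_classes)
        self.readout = "mean"  # overwritten per candidate strategy

    def forward(self, x, adj):
        for layer in self.layers:
            x = torch.relu(layer(adj @ x))  # simple neighborhood aggregation
        g = x.mean(0) if self.readout == "mean" else x.max(0).values
        return self.head(g)

def fine_tune(strategy, train_graphs, val_graphs, steps=50):
    """Fine-tune under one candidate strategy; return validation accuracy."""
    model = ToyPretrainedGNN()  # in practice, load pre-trained weights here
    model.readout = strategy["readout"]
    for i, layer in enumerate(model.layers):  # optionally freeze lower layers
        layer.requires_grad_(i >= strategy["freeze_below"])
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad),
                           lr=strategy["lr"])
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        for x, adj, y in train_graphs:
            opt.zero_grad()
            loss_fn(model(x, adj).unsqueeze(0), y).backward()
            opt.step()
    correct = sum((model(x, adj).argmax() == y).item() for x, adj, y in val_graphs)
    return correct / len(val_graphs)

# Tiny illustrative design space; a real system would search a much larger
# space with something smarter than exhaustive enumeration.
space = {"readout": ["mean", "max"], "freeze_below": [0, 2], "lr": [1e-2, 1e-3]}
graphs = [(torch.randn(5, 16), torch.eye(5), torch.tensor([i % 2]))
          for i in range(8)]
best = max((dict(zip(space, vals)) for vals in itertools.product(*space.values())),
           key=lambda s: fine_tune(s, graphs[:6], graphs[6:]))
print("best strategy found:", best)
```

In a realistic setting the strategy space is far larger and is explored with a learned search procedure rather than enumeration, but the selection criterion sketched here, validation performance of the fine-tuned model, is the same.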

Related research

04/19/2023 · AdapterGNN: Efficient Delta Tuning Improves Generalization Ability in Graph Neural Networks
Fine-tuning pre-trained models has recently yielded remarkable performan...

09/30/2022 · Prompt Tuning for Graph Neural Networks
In recent years, prompt tuning has set off a research boom in the adapta...

07/07/2020 · Exploring Heterogeneous Information Networks via Pre-Training
To explore heterogeneous information networks (HINs), network representa...

08/19/2023 · Voucher Abuse Detection with Prompt-based Fine-tuning on Graph Neural Networks
Voucher abuse detection is an important anomaly detection problem in E-c...

03/20/2022 · Fine-Tuning Graph Neural Networks via Graph Topology induced Optimal Transport
Recently, the pretrain-finetuning paradigm has attracted tons of attenti...

12/20/2022 · MolCPT: Molecule Continuous Prompt Tuning to Generalize Molecular Representation Learning
Molecular representation learning is crucial for the problem of molecula...

03/29/2023 · When to Pre-Train Graph Neural Networks? An Answer from Data Generation Perspective!
Recently, graph pre-training has attracted wide research attention, whic...
