GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks

02/16/2023
by Zemin Liu, et al.

Graphs can model complex relationships between objects, enabling a myriad of Web applications such as online page/article classification and social recommendation. While graph neural networks (GNNs) have emerged as a powerful tool for graph representation learning, in an end-to-end supervised setting their performance relies heavily on a large amount of task-specific supervision. To reduce labeling requirements, the "pre-train, fine-tune" and "pre-train, prompt" paradigms have become increasingly common. In particular, prompting is a popular alternative to fine-tuning in natural language processing, designed to narrow the gap between pre-training and downstream objectives in a task-specific manner. However, existing studies of prompting on graphs remain limited, lacking a universal treatment that appeals to different downstream tasks. In this paper, we propose GraphPrompt, a novel pre-training and prompting framework on graphs. GraphPrompt not only unifies pre-training and downstream tasks into a common task template, but also employs a learnable prompt to assist a downstream task in locating the most relevant knowledge from the pre-trained model in a task-specific manner. Finally, we conduct extensive experiments on five public datasets to evaluate and analyze GraphPrompt.
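To make the "learnable prompt" idea concrete, here is a minimal sketch of a prompt-based readout of the kind the abstract describes: a frozen pre-trained GNN produces node embeddings, and a small task-specific prompt vector re-weights the embedding dimensions before aggregation, so the downstream task can emphasize the features most relevant to it. This is an illustrative simplification, not the paper's exact implementation; the function names (`prompt_readout`, `classify`) and the cosine-similarity prototype classifier are assumptions for the sketch.

```python
import numpy as np

def prompt_readout(node_embeddings, prompt):
    """Aggregate node embeddings into a single (sub)graph embedding.

    The learnable prompt vector re-weights each embedding dimension
    element-wise before the sum readout, steering the aggregation
    toward features relevant to the downstream task.
    """
    return (node_embeddings * prompt).sum(axis=0)

def classify(graph_embedding, class_prototypes):
    """Predict the class whose prototype embedding is most similar
    (by cosine similarity) to the prompted graph embedding."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    sims = [cos(graph_embedding, proto) for proto in class_prototypes]
    return int(np.argmax(sims))

# Toy example: two nodes with 2-d embeddings from a (hypothetical) frozen GNN.
node_embeddings = np.array([[1.0, 0.0],
                            [1.0, 0.0]])
prompt = np.ones(2)              # in practice, learned per task
prototypes = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
graph_emb = prompt_readout(node_embeddings, prompt)
predicted = classify(graph_emb, prototypes)
```

In a full training loop, only the prompt vector would be updated on the few labeled downstream examples, while the pre-trained GNN stays frozen; this is what keeps the labeling requirement low.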


Related research

08/19/2023
Voucher Abuse Detection with Prompt-based Fine-tuning on Graph Neural Networks
Voucher abuse detection is an important anomaly detection problem in E-c...

06/17/2020
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
Graph representation learning has emerged as a powerful technique for re...

06/05/2023
Graph-Aware Language Model Pre-Training on a Large Graph Corpus Can Help Multiple Graph Applications
Model pre-training on large text corpora has been demonstrated effective...

10/08/2021
RPT: Toward Transferable Model on Heterogeneous Researcher Data via Pre-Training
With the growth of the academic engines, the mining and analysis acquisi...

07/07/2020
Exploring Heterogeneous Information Networks via Pre-Training
To explore heterogeneous information networks (HINs), network representa...

07/04/2023
All in One: Multi-task Prompting for Graph Neural Networks
Recently, "pre-training and fine-tuning" has been adopted as a standard ...

05/03/2023
Learngene: Inheriting Condensed Knowledge from the Ancestry Model to Descendant Models
During the continuous evolution of one organism's ancestry, its genes ac...
