Data-Free Adversarial Knowledge Distillation for Graph Neural Networks

05/08/2022
by Yuanxin Zhuang, et al.

Graph neural networks (GNNs) have been widely used to model graph-structured data, owing to their impressive performance in a wide range of practical applications. Recently, knowledge distillation (KD) for GNNs has enabled remarkable progress in graph model compression and knowledge transfer. However, most existing KD methods require a large volume of real data, which is not readily available in practice; this precludes their applicability in scenarios where the teacher model is trained on rare or hard-to-acquire datasets. To address this problem, we propose the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data (DFAD-GNN). Specifically, DFAD-GNN employs a generative adversarial network consisting of three components: a pre-trained teacher model and a student model, which act as two discriminators, and a generator that produces training graphs used to distill knowledge from the teacher into the student. Extensive experiments on various benchmark models and six representative datasets demonstrate that DFAD-GNN significantly surpasses state-of-the-art data-free baselines on the graph classification task.
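For intuition, here is a minimal PyTorch-style sketch of the kind of adversarial distillation loop the abstract describes: a generator proposes graphs, the student tries to match the frozen teacher on them, and the generator is pushed in the opposite direction. Everything below is an illustrative assumption rather than the paper's actual method: the toy ToyGCN and GraphGenerator modules, the dense-adjacency graph representation, the model sizes, and the L1 discrepancy are hypothetical stand-ins.

```python
# Minimal sketch of data-free adversarial KD in the spirit of DFAD-GNN.
# Assumptions (NOT from the paper): dense-adjacency toy graphs, a one-layer
# GCN stand-in, and mean-absolute-error as the teacher-student discrepancy.
import torch
import torch.nn as nn
import torch.nn.functional as F

N, F_DIM, HID, CLASSES, Z_DIM = 16, 8, 32, 4, 64  # toy sizes

class ToyGCN(nn.Module):
    """Stand-in GNN: one dense graph-conv layer + mean pooling to graph logits."""
    def __init__(self):
        super().__init__()
        self.lin = nn.Linear(F_DIM, HID)
        self.out = nn.Linear(HID, CLASSES)

    def forward(self, adj, x):          # adj: (B, N, N), x: (B, N, F_DIM)
        h = F.relu(self.lin(adj @ x))   # propagate over edges, then transform
        return self.out(h.mean(dim=1))  # graph-level logits

class GraphGenerator(nn.Module):
    """Maps noise z to a soft symmetric adjacency matrix and node features."""
    def __init__(self):
        super().__init__()
        self.feat = nn.Linear(Z_DIM, N * F_DIM)
        self.edge = nn.Linear(Z_DIM, N * N)

    def forward(self, z):
        x = self.feat(z).view(-1, N, F_DIM)
        a = torch.sigmoid(self.edge(z)).view(-1, N, N)
        return 0.5 * (a + a.transpose(1, 2)), x  # symmetrize edges

teacher, student, gen = ToyGCN(), ToyGCN(), GraphGenerator()
teacher.eval()  # pre-trained and frozen in the real setting
for p in teacher.parameters():
    p.requires_grad_(False)
opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)

for step in range(200):
    # Student step: imitate the teacher on generated graphs.
    with torch.no_grad():
        adj, x = gen(torch.randn(32, Z_DIM))
        t_logits = teacher(adj, x)
    loss_s = F.l1_loss(student(adj, x), t_logits)  # discrepancy to minimize
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()

    # Generator step: adversarially seek graphs where teacher and student disagree.
    adj, x = gen(torch.randn(32, Z_DIM))
    loss_g = -F.l1_loss(student(adj, x), teacher(adj, x))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```

In this setup the student minimizes its disagreement with the frozen teacher on generated graphs, while the generator maximizes it; that adversarial pressure keeps surfacing informative training graphs without any access to real data.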


Related research

10/24/2022 · Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks
We study a new paradigm of knowledge transfer that aims at encoding grap...

11/04/2020 · On Self-Distilling Graph Neural Network
Recently, the teacher-student knowledge distillation framework has demon...

10/25/2022 · Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision
Graph neural networks (GNNs) have become one of the most popular researc...

06/26/2023 · Accelerating Molecular Graph Neural Networks via Knowledge Distillation
Recent advances in graph neural networks (GNNs) have allowed molecular s...

04/20/2023 · Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs
How can we learn effective node representations on textual graphs? Graph...

08/05/2022 · PGX: A Multi-level GNN Explanation Framework Based on Separate Knowledge Distillation Processes
Graph Neural Networks (GNNs) are widely adopted in advanced AI systems d...

11/07/2020 · Robustness and Diversity Seeking Data-Free Knowledge Distillation
Knowledge distillation (KD) has enabled remarkable progress in model com...
