Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks

10/24/2022

by Chenxiao Yang et al.

We study a new paradigm of knowledge transfer that aims to encode graph topological information into graph neural networks (GNNs) by distilling knowledge from a teacher GNN trained on a complete graph to a student GNN operating on a smaller or sparser graph. To this end, we revisit the connection between thermodynamics and the behavior of GNNs, based on which we propose the Neural Heat Kernel (NHK) to encapsulate the geometric properties of the underlying manifold with respect to the GNN architecture. A fundamental and principled solution is derived by aligning the NHKs of the teacher and student models, dubbed Geometric Knowledge Distillation. We develop non-parametric and parametric instantiations and demonstrate their efficacy in various experimental settings for knowledge distillation, covering different types of privileged topological information and teacher-student schemes.
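To make the alignment idea concrete, here is a minimal sketch of the classical (non-neural) analogue: compute a graph heat kernel from each graph's normalized Laplacian and penalize the discrepancy between the teacher's and student's kernels. The function names `heat_kernel` and `kernel_alignment_loss` are illustrative, and the classical kernel exp(-tL) is a stand-in for the paper's architecture-dependent NHK, so this is not the authors' implementation.

```python
import numpy as np

def heat_kernel(adj, t=1.0):
    # Classical graph heat kernel H_t = exp(-t * L), where L is the
    # symmetric normalized Laplacian of the adjacency matrix `adj`.
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt
    # Matrix exponential via the eigendecomposition of the (symmetric) Laplacian.
    evals, evecs = np.linalg.eigh(lap)
    return evecs @ np.diag(np.exp(-t * evals)) @ evecs.T

def kernel_alignment_loss(k_teacher, k_student):
    # Mean squared (Frobenius-style) discrepancy between the two kernels,
    # a simple stand-in for the paper's NHK alignment objective.
    return np.mean((k_teacher - k_student) ** 2)

# Teacher sees the complete 3-node graph; student sees a sparser path graph
# over the same node set (illustrative toy data).
adj_full = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
adj_sparse = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
loss = kernel_alignment_loss(heat_kernel(adj_full), heat_kernel(adj_sparse))
```

Minimizing such a loss with respect to the student's parameters (in the neural setting, where the kernel depends on the learned feature propagation) is what pushes the student to mimic the teacher's geometry rather than only its output logits.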


Related research

- RELIANT: Fair Knowledge Distillation for Graph Neural Networks (01/03/2023)
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks (05/08/2022)
- Graph-Free Knowledge Distillation for Graph Neural Networks (05/16/2021)
- Boosting Graph Neural Networks via Adaptive Knowledge Distillation (10/12/2022)
- On Representation Knowledge Distillation for Graph Neural Networks (11/09/2021)
- Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs (04/20/2023)
- Graph-based Knowledge Distillation: A survey and experimental evaluation (02/27/2023)
