Graph-based Knowledge Distillation: A survey and experimental evaluation

02/27/2023
by Jing Liu, et al.

Graphs, such as citation networks, social networks, and transportation networks, are prevalent in the real world. Graph Neural Networks (GNNs) have gained widespread attention for their robust expressiveness and exceptional performance in various graph applications. However, the efficacy of GNNs relies heavily on sufficient data labels and complex network models, and the former are hard to obtain while the latter are computationally costly. To address the scarcity of labeled data and the high complexity of GNNs, Knowledge Distillation (KD) has been introduced to enhance existing GNNs. This technique transfers the soft-label supervision of a large teacher model to a small student model while maintaining prediction performance. This survey offers a comprehensive overview of Graph-based Knowledge Distillation methods, systematically categorizing and summarizing them while discussing their limitations and future directions. The paper first introduces the background of graphs and KD. It then provides a comprehensive summary of three types of Graph-based Knowledge Distillation methods: Graph-based Knowledge Distillation for deep neural networks (DKD), Graph-based Knowledge Distillation for GNNs (GKD), and Self-Knowledge Distillation based Graph-based Knowledge Distillation (SKD). Each type is further divided into knowledge distillation methods based on the output layer, the middle layer, and the constructed graph. The ideas behind the various algorithms are then analyzed and compared, and the advantages and disadvantages of each algorithm are summarized with supporting experimental results. In addition, applications of graph-based knowledge distillation in CV, NLP, RS, and other fields are listed. Finally, graph-based knowledge distillation is summarized and its prospects are discussed. Related resources are released at https://github.com/liujing1023/Graph-based-Knowledge-Distillation.
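To make the soft-label transfer concrete, below is a minimal PyTorch sketch of the standard soft-target distillation objective (Hinton et al.'s formulation, which the graph-based methods surveyed here build on). This is the generic loss, not a specific method from the survey; the temperature `T` and mixing weight `alpha` are illustrative hyperparameters.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-label knowledge distillation loss (generic sketch).

    Mixes the KL divergence between temperature-softened teacher and
    student distributions with the usual cross-entropy on hard labels.
    T and alpha are illustrative values, not taken from the survey.
    """
    # Soft targets: teacher class probabilities at temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    # Student log-probabilities at the same temperature.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    # T^2 rescales the soft-target gradients so they stay comparable
    # to the hard-label term as T grows.
    distill = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * T * T
    # Standard supervised loss on the (often scarce) labeled examples.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce
```

In a graph setting, `student_logits` and `teacher_logits` would typically be per-node predictions, so the same loss applies with nodes playing the role of examples.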


Related research

Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks (10/24/2022)
We study a new paradigm of knowledge transfer that aims at encoding grap...

Accelerating Molecular Graph Neural Networks via Knowledge Distillation (06/26/2023)
Recent advances in graph neural networks (GNNs) have allowed molecular s...

Knowledge Distillation on Graphs: A Survey (02/01/2023)
Graph Neural Networks (GNNs) have attracted tremendous attention by demo...
Distilling Holistic Knowledge with Graph Neural Networks (08/12/2021)
Knowledge Distillation (KD) aims at transferring knowledge from a larger...

Graph-based Knowledge Distillation by Multi-head Attention Network (07/04/2019)
Knowledge distillation (KD) is a technique to derive optimal performance...

Self-supervised Smoothing Graph Neural Networks (09/02/2020)
This paper studies learning node representations with GNNs for unsupervi...
