Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision

10/25/2022
by Jiongyu Guo, et al.

Graph neural networks (GNNs) have become one of the most popular research topics in both academia and industry for their strong ability to handle irregular graph data. However, large-scale datasets pose great challenges for deploying GNNs on edge devices with limited resources, so model compression techniques have drawn considerable research attention. Existing model compression techniques such as knowledge distillation (KD) mainly focus on convolutional neural networks (CNNs); only limited attempts have been made recently to distill knowledge from GNNs, and these operate in an offline manner. Because the performance of a teacher GNN does not necessarily improve as the number of layers increases, selecting an appropriate teacher model requires substantial effort. To address these challenges, we propose a novel online knowledge distillation framework called Alignahead++. Alignahead++ transfers structure and feature information from a student layer to the previous layer of another, simultaneously trained student model in an alternating training procedure. Meanwhile, to avoid the over-smoothing problem in GNNs, Alignahead++ employs deep supervision, adding an auxiliary classifier to each intermediate layer to prevent the collapse of the node feature embeddings. Experimental results on four datasets (PPI, Cora, PubMed, and CiteSeer) demonstrate that student performance is consistently boosted by our collaborative training framework without the supervision of a pre-trained teacher model, and that its effectiveness generally improves as the number of students increases.
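To make the two training signals in the abstract concrete, below is a minimal PyTorch sketch of a cross-layer alignment loss between two students and a deep-supervision loss with per-layer auxiliary classifiers. The function names, the shared hidden size, and the MSE alignment criterion are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch, assuming two students with equal hidden dimensions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def alignahead_loss(feats_a, feats_b):
    """Cross-layer alignment between two simultaneously trained students:
    layer l of each student is pulled toward layer l-1 of its peer.
    feats_* are per-layer node embeddings, each of shape [N, D]."""
    loss = 0.0
    for l in range(1, len(feats_a)):
        # detach() keeps each term from back-propagating into the peer,
        # loosely mimicking the alternating update of the two students.
        loss = loss + F.mse_loss(feats_a[l], feats_b[l - 1].detach())
        loss = loss + F.mse_loss(feats_b[l], feats_a[l - 1].detach())
    return loss

def deep_supervision_loss(feats, aux_heads, labels):
    """Auxiliary classifier on every intermediate layer so that node
    embeddings stay discriminative, countering over-smoothing."""
    return sum(F.cross_entropy(head(h), labels)
               for head, h in zip(aux_heads, feats))

# Toy usage: two 3-layer students, 5 nodes, hidden size 16, 4 classes.
feats_a = [torch.randn(5, 16) for _ in range(3)]
feats_b = [torch.randn(5, 16) for _ in range(3)]
heads = nn.ModuleList(nn.Linear(16, 4) for _ in range(3))
labels = torch.randint(0, 4, (5,))
total = alignahead_loss(feats_a, feats_b) + deep_supervision_loss(feats_a, heads, labels)
```

In a full training loop, each student would also carry its usual task loss on the final layer; the alignment and auxiliary terms are added on top, with no pre-trained teacher involved.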

Related research

05/05/2022
Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks
Existing knowledge distillation methods on graph neural networks (GNNs) ...

05/08/2022
Data-Free Adversarial Knowledge Distillation for Graph Neural Networks
Graph neural networks (GNNs) have been widely used in modeling graph str...

09/02/2020
Self-supervised Smoothing Graph Neural Networks
This paper studies learning node representations with GNNs for unsupervi...

02/01/2023
Knowledge Distillation on Graphs: A Survey
Graph Neural Networks (GNNs) have attracted tremendous attention by demo...

12/28/2021
Online Adversarial Distillation for Graph Neural Networks
Knowledge distillation has recently become a popular technique to improv...

06/26/2023
Accelerating Molecular Graph Neural Networks via Knowledge Distillation
Recent advances in graph neural networks (GNNs) have allowed molecular s...

06/10/2022
Transformer-Graph Neural Network with Global-Local Attention for Multimodal Rumour Detection with Knowledge Distillation
Misinformation spreading becomes a critical issue in online conversation...
