HIRE: Distilling High-order Relational Knowledge From Heterogeneous Graph Neural Networks

07/25/2022
by   Jing Liu, et al.

Researchers have recently proposed many heterogeneous graph neural networks (HGNNs), owing to the ubiquity of heterogeneous graphs in both academic and industrial settings. Rather than pursuing a more powerful HGNN model, in this paper we devise a versatile plug-and-play module that distills relational knowledge from pre-trained HGNNs. To the best of our knowledge, we are the first to propose a HIgh-order RElational (HIRE) knowledge distillation framework on heterogeneous graphs, which can significantly boost prediction performance regardless of the HGNN's architecture. Concretely, the HIRE framework first performs first-order node-level knowledge distillation, which encodes the semantics of the teacher HGNN through its prediction logits. Second-order relation-level knowledge distillation then imitates the relational correlation between the embeddings of different node types generated by the teacher HGNN. Extensive experiments on several popular HGNN models and three real-world heterogeneous graphs show that our method yields consistent and considerable performance gains, demonstrating its effectiveness and generalization ability.
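The abstract describes two complementary distillation losses: a first-order term that matches the teacher's softened prediction logits, and a second-order term that matches cross-type relational correlations between node embeddings. The PyTorch sketch below illustrates one plausible form of these losses; the function names, the temperature value, and the use of cosine similarities over per-type mean embeddings are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of the two HIRE-style distillation losses, assuming PyTorch.
# All names and the cross-type correlation construction are illustrative
# assumptions, not taken from the paper.

import torch
import torch.nn.functional as F


def node_level_kd_loss(student_logits, teacher_logits, temperature=2.0):
    """First-order KD: student matches the teacher's softened prediction logits."""
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_student = F.log_softmax(student_logits / t, dim=-1)
    # Standard soft-label distillation: KL divergence scaled by t^2.
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * (t * t)


def relation_level_kd_loss(student_embs, teacher_embs):
    """Second-order KD: student imitates the teacher's relational correlation
    between node embeddings of different types.

    student_embs / teacher_embs: dicts mapping node type -> (N_type, d) tensor,
    with a shared embedding dimension d across types (an assumption here).
    """
    def type_correlation(embs):
        # Summarize each node type by its mean embedding, then take pairwise
        # cosine similarities as the cross-type relational signature.
        means = torch.stack([e.mean(dim=0) for e in embs.values()])  # (T, d)
        means = F.normalize(means, dim=-1)
        return means @ means.t()                                     # (T, T)

    return F.mse_loss(type_correlation(student_embs),
                      type_correlation(teacher_embs))
```

In training, these terms would typically be added to the task loss with tunable weights, e.g. loss = task_loss + alpha * node_kd + beta * relation_kd, where alpha and beta are hypothetical hyperparameter names introduced here for illustration.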

