Distilling Holistic Knowledge with Graph Neural Networks

08/12/2021
by Sheng Zhou, et al.

Knowledge Distillation (KD) aims at transferring knowledge from a larger, well-optimized teacher network to a smaller, learnable student network. Existing KD methods have mainly considered two types of knowledge, namely individual knowledge and relational knowledge. However, these two types of knowledge are usually modeled independently, and the inherent correlations between them are largely ignored. Integrating both individual and relational knowledge while preserving their inherent correlations is critical for sufficient student network learning. In this paper, we propose to distill novel holistic knowledge based on an attributed graph constructed among instances. The holistic knowledge is represented as a unified graph-based embedding obtained by aggregating individual knowledge from relational neighborhood samples with graph neural networks; the student network is then learned by distilling the holistic knowledge in a contrastive manner. Extensive experiments and ablation studies on benchmark datasets demonstrate the effectiveness of the proposed method. The code is available at https://github.com/wyc-ruiker/HKD
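The pipeline described in the abstract (build an attributed graph over instances, aggregate neighborhood features with a GNN, and distill the resulting holistic embeddings contrastively) can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' implementation (see the linked repository for the official code): the kNN graph construction, the single mean-aggregation layer standing in for the paper's GNN, the InfoNCE-style loss, and all names (knn_graph, holistic_embed, contrastive_kd_loss, k, tau) are assumptions made for exposition.

```python
# Minimal sketch of holistic knowledge distillation. NOT the authors' code;
# see https://github.com/wyc-ruiker/HKD for the official implementation.
# Assumptions: features are L2-normalized, the instance graph is a kNN graph
# built from teacher features, and one mean-aggregation step stands in for
# the paper's graph neural network.
import torch
import torch.nn.functional as F

def knn_graph(feats: torch.Tensor, k: int = 4) -> torch.Tensor:
    """Row-normalized dense adjacency of a kNN graph over a batch of features."""
    sim = feats @ feats.t()                    # cosine similarity (feats normalized)
    sim.fill_diagonal_(-float("inf"))          # exclude self from top-k neighbors
    idx = sim.topk(k, dim=1).indices
    adj = torch.zeros_like(sim)
    adj.scatter_(1, idx, 1.0)
    adj = ((adj + adj.t()) > 0).float()        # symmetrize the neighborhood relation
    adj = adj + torch.eye(adj.size(0))         # self-loops keep individual knowledge
    return adj / adj.sum(dim=1, keepdim=True)  # row-normalize for mean aggregation

def holistic_embed(feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """Fuse individual knowledge (node features) with relational knowledge
    (graph structure) via one step of neighborhood aggregation."""
    return F.normalize(adj @ feats, dim=1)

def contrastive_kd_loss(h_s: torch.Tensor, h_t: torch.Tensor,
                        tau: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: the student's holistic embedding of each instance
    should match the teacher's, against in-batch negatives."""
    logits = (h_s @ h_t.t()) / tau
    targets = torch.arange(h_s.size(0))        # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Usage on one batch: the graph topology comes from the frozen teacher's view.
f_t = F.normalize(torch.randn(32, 128), dim=1)  # teacher features (frozen)
f_s = F.normalize(torch.randn(32, 128), dim=1)  # student features (learnable)
adj = knn_graph(f_t, k=4)
loss = contrastive_kd_loss(holistic_embed(f_s, adj), holistic_embed(f_t, adj))
```

In practice this distillation term would be added to the student's task loss, and the paper's actual graph construction, GNN architecture, and contrastive objective differ in detail from this sketch.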


Related research

Relational Knowledge Distillation (04/10/2019)
Knowledge distillation aims at transferring knowledge acquired in one mo...

Graph-based Knowledge Distillation: A survey and experimental evaluation (02/27/2023)
Graph, such as citation networks, social networks, and transportation ne...

Cross-Image Relational Knowledge Distillation for Semantic Segmentation (04/14/2022)
Current Knowledge Distillation (KD) methods for semantic segmentation of...

Boosting Graph Neural Networks via Adaptive Knowledge Distillation (10/12/2022)
Graph neural networks (GNNs) have shown remarkable performance on divers...

Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework (03/04/2021)
Semi-supervised learning on graphs is an important problem in the machin...

HIRE: Distilling High-order Relational Knowledge From Heterogeneous Graph Neural Networks (07/25/2022)
Researchers have recently proposed plenty of heterogeneous graph neural ...

Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network (04/28/2021)
Knowledge distillation (KD) is one of the most useful techniques for lig...
