Distilling Knowledge from Graph Convolutional Networks

03/23/2020
by Yiding Yang, et al.

Existing knowledge distillation methods focus on convolutional neural networks (CNNs), whose input samples, such as images, lie on a grid domain, and have largely overlooked graph convolutional networks (GCNs), which handle non-grid data. In this paper, we propose, to the best of our knowledge, the first dedicated approach to distilling knowledge from a pre-trained GCN model. To enable knowledge transfer from the teacher GCN to the student, we propose a local structure preserving module that explicitly accounts for the topological semantics of the teacher. In this module, the local structure information of both the teacher and the student is extracted as distributions, so that minimizing the distance between these distributions enables topology-aware knowledge transfer from the teacher, yielding a compact yet high-performance student model. Moreover, the proposed approach is readily extendable to dynamic graph models, where the input graphs of the teacher and the student may differ. We evaluate the proposed method on two different datasets using GCN models of different architectures, and demonstrate that our method achieves state-of-the-art knowledge distillation performance for GCN models.
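The idea described above can be sketched in code: for each node, form a distribution over its neighbors from feature similarities, and train the student to match the teacher's distributions by minimizing a KL divergence. The NumPy sketch below is illustrative only; the function names and the squared-distance softmax kernel are assumptions, not the paper's exact formulation.

```python
import numpy as np

def local_structure_distribution(features, adjacency):
    """For each node, a softmax over negative squared distances to its
    neighbors: one probability distribution per node describing how the
    node relates to its local neighborhood (kernel choice is an assumption)."""
    dists = []
    for i in range(features.shape[0]):
        neighbors = np.flatnonzero(adjacency[i])
        d = -np.sum((features[neighbors] - features[i]) ** 2, axis=1)
        e = np.exp(d - d.max())          # numerically stable softmax
        dists.append(e / e.sum())
    return dists

def lsp_loss(teacher_feats, student_feats, adjacency):
    """Local-structure-preserving loss: mean KL divergence between the
    teacher's and the student's per-node neighborhood distributions."""
    t = local_structure_distribution(teacher_feats, adjacency)
    s = local_structure_distribution(student_feats, adjacency)
    kl = [np.sum(ti * np.log(ti / si)) for ti, si in zip(t, s)]
    return float(np.mean(kl))
```

When the student's embeddings reproduce the teacher's local geometry exactly, the loss is zero; any distortion of a node's neighborhood distribution increases it, which is what makes the transfer topology-aware.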

Related research

Graph-Free Knowledge Distillation for Graph Neural Networks (05/16/2021)
Knowledge distillation (KD) transfers knowledge from a teacher network t...

ORC: Network Group-based Knowledge Distillation using Online Role Change (06/01/2022)
In knowledge distillation, since a single, omnipotent teacher network ca...

Dark Reciprocal-Rank: Boosting Graph-Convolutional Self-Localization Network via Teacher-to-student Knowledge Transfer (11/01/2020)
In visual robot self-localization, graph-based scene representation and ...

Graph-Preserving Grid Layout: A Simple Graph Drawing Method for Graph Classification using CNNs (09/26/2019)
Graph convolutional networks (GCNs) suffer from the irregularity of grap...

Online Adversarial Distillation for Graph Neural Networks (12/28/2021)
Knowledge distillation has recently become a popular technique to improv...

HIRE: Distilling High-order Relational Knowledge From Heterogeneous Graph Neural Networks (07/25/2022)
Researchers have recently proposed plenty of heterogeneous graph neural ...
