Iterative Graph Self-Distillation

10/23/2020
by Hanlin Zhang, et al.

How to discriminatively vectorize graphs is a fundamental challenge that has attracted increasing attention in recent years. Inspired by the recent success of unsupervised contrastive learning, we aim to learn graph-level representations in an unsupervised manner. Specifically, we propose a novel unsupervised graph learning paradigm called Iterative Graph Self-Distillation (IGSD), which iteratively performs teacher-student distillation with graph augmentations. Unlike conventional knowledge distillation, IGSD constructs the teacher as an exponential moving average of the student model and thus distills knowledge from itself. The intuition behind IGSD is to have the student predict the teacher network's representation of graph pairs under different augmented views. As a natural extension, we also apply IGSD to semi-supervised scenarios by jointly regularizing the network with both supervised and unsupervised contrastive losses. Finally, we show that fine-tuning IGSD-trained models with self-training can further improve graph representation power. Empirically, we achieve significant and consistent performance gains on various graph datasets in both unsupervised and semi-supervised settings, which validates the superiority of IGSD.
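
The core mechanism described in the abstract — a teacher maintained as an exponential moving average (EMA) of the student, with the student trained to predict the teacher's representation of a differently augmented view — can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch-style sketch, not the authors' implementation: the encoder, predictor head, `augment` function, and momentum value are all illustrative assumptions.

```python
import copy
import torch
import torch.nn.functional as F

# Minimal sketch of an IGSD-style training step (illustrative only; the
# paper's exact encoder, augmentations, and loss details may differ).
# `student` and `teacher` are graph encoders producing graph-level
# embeddings; `predictor` is a small MLP head on the student side;
# `augment` is a hypothetical graph-augmentation function.

def ema_update(teacher, student, momentum=0.99):
    """Teacher parameters track an exponential moving average of the student."""
    with torch.no_grad():
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(momentum).add_(s, alpha=1.0 - momentum)

def distillation_loss(student, predictor, teacher, graph, augment):
    """Student predicts the teacher's embedding of another augmented view."""
    view_a, view_b = augment(graph), augment(graph)
    z_student = predictor(student(view_a))   # online branch (gets gradients)
    with torch.no_grad():
        z_teacher = teacher(view_b)          # target branch (no gradients)
    # Negative cosine similarity between the two graph-level embeddings.
    return -F.cosine_similarity(z_student, z_teacher, dim=-1).mean()

# Typical usage: the teacher starts as a frozen copy of the student and is
# updated only through the EMA rule, never by backpropagation.
# teacher = copy.deepcopy(student)
# loss = distillation_loss(student, predictor, teacher, batch, augment)
# loss.backward(); optimizer.step(); ema_update(teacher, student)
```

Note the asymmetry: the teacher receives no gradients and improves only by tracking the student, which is what makes the distillation both "self-" and iterative. In the semi-supervised extension, a supervised contrastive loss on labeled graphs would simply be added to this unsupervised objective.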
