Self-supervised Smoothing Graph Neural Networks

09/02/2020
by Lu Yu, et al.

This paper studies learning node representations with GNNs in unsupervised scenarios. We provide a theoretical analysis and an empirical demonstration of the unstable performance of GNNs across different graph datasets when the supervision signals are not appropriately defined. The performance of GNNs depends on both the smoothness of node features and the locality of the graph. To reconcile the discrepancy between node proximity as measured by graph topology and by node features, we propose KS2L, a novel graph Knowledge distillation regularized Self-Supervised Learning framework with two complementary regularization modules for intra- and cross-model graph knowledge distillation. Even with a single GCN layer, KS2L achieves consistently competitive or better performance on a variety of benchmark datasets.
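The abstract does not spell out the KS2L modules in detail, but the general idea of distillation-regularized representation learning on graphs can be illustrated with a small sketch. The code below is an assumption-laden toy example, not the paper's actual method: a one-layer GCN propagation produces node embeddings for a "teacher" and a "student" model, and a KL-divergence term aligns the node-proximity distributions induced by the two embeddings. All function names and the choice of similarity distribution are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def gcn_layer(A, X, W):
    # Standard GCN propagation with self-loops:
    # D^{-1/2} (A + I) D^{-1/2} X W
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return A_norm @ X @ W

def distill_loss(Z_teacher, Z_student):
    # Hypothetical distillation regularizer: KL divergence between
    # row-wise node-similarity distributions of the two embeddings.
    P = softmax(Z_teacher @ Z_teacher.T)
    Q = softmax(Z_student @ Z_student.T)
    kl = np.sum(P * (np.log(P + 1e-12) - np.log(Q + 1e-12)), axis=1)
    return float(kl.mean())
```

In a cross-model setting, `Z_teacher` and `Z_student` would come from two different encoders; in an intra-model setting, they could come from different layers of the same encoder. The loss is zero when the two similarity distributions coincide and grows as they diverge.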


Related research:

- 02/25/2022: Addressing Over-Smoothing in Graph Neural Networks via Deep Supervision
- 10/25/2022: Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision
- 10/17/2021: Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation
- 02/27/2023: Graph-based Knowledge Distillation: A survey and experimental evaluation
- 09/04/2023: Layer-wise training for self-supervised learning on graphs
- 10/29/2021: Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction
- 06/26/2023: Accelerating Molecular Graph Neural Networks via Knowledge Distillation
