VQGraph: Graph Vector-Quantization for Bridging GNNs and MLPs

08/04/2023 · by Ling Yang, et al.

Graph Neural Networks (GNNs) update node representations via message passing, aggregating features from each node's local neighbors. This message passing causes scalability issues in practical latency-constrained applications. To address this, recent methods adopt knowledge distillation (KD) to train computationally efficient multi-layer perceptrons (MLPs) that mimic GNN outputs. However, the existing GNN representation space may not be expressive enough to represent the diverse local structures of the underlying graph, which limits knowledge transfer from GNN to MLP. Here we present VQGraph, a novel framework that learns a powerful graph representation space for bridging GNNs and MLPs. We adopt the encoder of a variant of the vector-quantized variational autoencoder (VQ-VAE) as a structure-aware graph tokenizer, which explicitly represents nodes with diverse local structures as discrete tokens and constitutes a meaningful codebook. Equipped with the learned codebook, we propose a new token-based distillation objective built on soft token assignments to sufficiently transfer structural knowledge from GNN to MLP. Extensive experiments and analyses demonstrate the strong performance of VQGraph: it achieves new state-of-the-art results on GNN-to-MLP distillation in both transductive and inductive settings across seven graph datasets. VQGraph infers 828× faster than GNNs while improving accuracy over GNNs and stand-alone MLPs by 3.90% and 28.05% on average, respectively. Code: https://github.com/YangLing0818/VQGraph.
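The two ingredients the abstract describes — assigning node embeddings to discrete codebook tokens, and distilling the teacher's soft token assignments into the student — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Euclidean-distance codebook lookup, the temperature `tau`, and the KL-divergence objective are assumptions made here for exposition, with embeddings as plain Python lists.

```python
import math

def soft_assign(h, codebook, tau=1.0):
    """Soft token assignment: softmax over negative squared Euclidean
    distances from embedding h to each codebook vector (temperature tau)."""
    dists = [sum((hi - ei) ** 2 for hi, ei in zip(h, e)) for e in codebook]
    logits = [-d / tau for d in dists]
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return [x / z for x in exps]

def hard_token(h, codebook):
    """Hard tokenization: index of the nearest codebook vector."""
    dists = [sum((hi - ei) ** 2 for hi, ei in zip(h, e)) for e in codebook]
    return dists.index(min(dists))

def token_distill_loss(h_teacher, h_student, codebook, tau=1.0):
    """KL divergence between the teacher's (GNN) and student's (MLP)
    soft codebook assignments for a single node."""
    p = soft_assign(h_teacher, codebook, tau)
    q = soft_assign(h_student, codebook, tau)
    eps = 1e-12
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))
```

Minimizing this loss over all nodes pushes the MLP's embedding of each node toward the same region of the codebook that the GNN teacher occupies, which is how soft assignments carry structural information that plain logit matching would miss.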


Related research:
- Gradient Gating for Deep Multi-Rate Learning on Graphs (10/02/2022)
- Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (10/17/2021)
- Boosting Graph Neural Networks via Adaptive Knowledge Distillation (10/12/2022)
- Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs (04/20/2023)
- VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization (10/27/2021)
- A Class-Aware Representation Refinement Framework for Graph Classification (09/02/2022)
- Graph Entropy Minimization for Semi-supervised Node Classification (05/31/2023)
