
A^2Q: Aggregation-Aware Quantization for Graph Neural Networks

02/01/2023
by   Zeyu Zhu, et al.
Shanghai Jiao Tong University

As graph data sizes increase, the high latency and memory consumption of inference pose a significant challenge to the real-world deployment of Graph Neural Networks (GNNs). While quantization is a powerful approach to reducing GNN complexity, most previous work on GNN quantization fails to exploit the unique characteristics of GNNs and suffers from severe accuracy degradation. Through an in-depth analysis of graph topology, we observe that the topology leads to significant differences between nodes, and that most nodes in a graph have small aggregation values. Motivated by this, we propose Aggregation-Aware mixed-precision Quantization (A^2Q) for GNNs, in which an appropriate bitwidth is automatically learned and assigned to each node in the graph. To mitigate the vanishing-gradient problem caused by sparse connections between nodes, we propose a Local Gradient method that uses the quantization error of the node features as supervision during training. We also develop a Nearest Neighbor Strategy to generalize to unseen graphs. Extensive experiments on eight public node-level and graph-level datasets demonstrate the generality and robustness of our proposed method. Compared to FP32 models, our method achieves up to an 18.6x compression ratio (i.e., 1.70 bits) with negligible accuracy degradation. Moreover, compared to the state-of-the-art quantization method, our method achieves up to 11.4% and 9.5% accuracy improvements on node-level and graph-level tasks, respectively, and up to 2x speedup on a dedicated hardware accelerator.
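The core idea of assigning each node its own bitwidth can be illustrated with a minimal sketch of per-node uniform (simulated) quantization. This is not the paper's implementation: the function name, the min-max scaling, and the fixed example bitwidths are illustrative assumptions; in A^2Q the bitwidths are learned during training rather than fixed.

```python
import numpy as np

def quantize_per_node(features, bitwidths):
    """Uniformly quantize each node's feature vector to its own bitwidth.

    features:  (N, D) array of node features
    bitwidths: (N,) integer bitwidth assigned to each node
    Returns dequantized features (simulated quantization).
    """
    out = np.empty_like(features, dtype=np.float64)
    for i, b in enumerate(bitwidths):
        levels = 2 ** int(b) - 1                  # number of quantization steps
        lo, hi = features[i].min(), features[i].max()
        scale = (hi - lo) / levels if hi > lo else 1.0
        codes = np.round((features[i] - lo) / scale)  # integer codes in [0, levels]
        out[i] = codes * scale + lo               # dequantize for simulation
    return out

# Illustrative assignment: a node with a large aggregation value keeps
# more bits (8), a node with a small one is compressed harder (2 bits).
feats = np.array([[0.1, 0.9, 0.5],
                  [0.2, 0.8, 0.4]])
bits = np.array([8, 2])
deq = quantize_per_node(feats, bits)
```

The 8-bit node is reconstructed almost exactly, while the 2-bit node is snapped to one of only four levels, which is how an aggregation-aware bit assignment trades accuracy for compression on a per-node basis.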

