Binary Graph Convolutional Network with Capacity Exploration

10/24/2022
by Junfu Wang, et al.

The current success of Graph Neural Networks (GNNs) usually relies on loading the entire attributed graph for processing, which may not be feasible when memory resources are limited, especially when the attributed graph is large. This paper pioneers a Binary Graph Convolutional Network (Bi-GCN), which binarizes both the network parameters and the input node attributes and replaces floating-point matrix multiplications with binary operations for network compression and acceleration. We also propose a new gradient-approximation-based back-propagation method to properly train our Bi-GCN. According to the theoretical analysis, our Bi-GCN can reduce memory consumption by an average of ~31x for both the network parameters and the input data, and accelerate inference by an average of ~51x, on three citation networks, i.e., Cora, PubMed, and CiteSeer. In addition, we introduce a general approach to extend our binarization method to other GNN variants and achieve similar efficiency gains. Although the proposed Bi-GCN and Bi-GNNs are simple yet efficient, these compressed networks may suffer from a capacity problem, i.e., they may not have enough storage capacity to learn adequate representations for specific tasks. To tackle this capacity problem, we propose an Entropy Cover Hypothesis that predicts a lower bound on the width of Bi-GNN hidden layers. Extensive experiments demonstrate that our Bi-GCN and Bi-GNNs achieve performance comparable to the corresponding full-precision baselines on seven node classification datasets, and verify the effectiveness of our Entropy Cover Hypothesis for addressing the capacity problem.
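As a rough illustration of the core idea (not the authors' released implementation), the sketch below shows a PyTorch graph-convolution layer whose weights and input node features are binarized to {-1, +1} with a sign function and trained with a straight-through estimator, a common form of gradient approximation; the names BinarizeSTE and BiGraphConv are hypothetical. Since each {-1, +1} value needs one bit instead of a 32-bit float, binarization leaves roughly 32x of headroom for memory reduction, consistent with the reported average of ~31x.

```python
# Illustrative sketch only; BinarizeSTE and BiGraphConv are hypothetical names,
# not taken from the paper's code.
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Binarize to {-1, +1} in the forward pass; use a straight-through
    estimator (gradient passed only where |x| <= 1) in the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * (x.abs() <= 1).float()


class BiGraphConv(nn.Module):
    """GCN-style layer with binarized weights and binarized node features.
    At inference time the two {-1, +1} matrices could be bit-packed and
    multiplied with XNOR + popcount instead of floating-point matmul."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(in_dim, out_dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, x, adj_norm):
        alpha = self.weight.abs().mean()        # scalar scaling factor
        w_bin = BinarizeSTE.apply(self.weight)  # {-1, +1} weights
        x_bin = BinarizeSTE.apply(x)            # {-1, +1} node features
        support = (x_bin @ w_bin) * alpha       # binary-friendly product
        return adj_norm @ support               # neighborhood aggregation


# Tiny usage example with made-up sizes; a normalized adjacency matrix
# would replace the identity used here as a stand-in.
layer = BiGraphConv(in_dim=8, out_dim=4)
x = torch.randn(5, 8)
adj_norm = torch.eye(5)
out = torch.relu(layer(x, adj_norm))
print(out.shape)  # torch.Size([5, 4])
```

The scalar scaling factor alpha is one common way binarized networks recover some of the magnitude lost by the sign function; the paper's exact formulation may differ.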

Related research

- Bi-GCN: Binary Graph Convolutional Network (10/15/2020)
- Neighborhood Homophily-Guided Graph Convolutional Network (01/24/2023)
- Cached Operator Reordering: A Unified View for Fast GNN Training (08/23/2023)
- H-GCN: A Graph Convolutional Network Accelerator on Versal ACAP Architecture (06/28/2022)
- GResNet: Graph Residual Network for Reviving Deep GNNs from Suspended Animation (09/12/2019)
- Explainable, Stable, and Scalable Graph Convolutional Networks for Learning Graph Representation (09/22/2020)
- PairNorm: Tackling Oversmoothing in GNNs (09/26/2019)
