Sparsifying the Update Step in Graph Neural Networks

09/02/2021
by Johannes F. Lutzeyer, et al.

Message-Passing Neural Networks (MPNNs), the most prominent Graph Neural Network (GNN) framework, have achieved considerable success in the analysis of graph-structured data. Concurrently, the sparsification of neural network models is attracting substantial academic and industrial interest. In this paper, we conduct a structured study of the effect of sparsification on the trainable part of MPNNs, known as the Update step. To this end, we design a series of models that successively sparsify the linear transform in the Update step. Specifically, we propose the ExpanderGNN model, with a tuneable sparsification rate, and the Activation-Only GNN, which has no linear transform in the Update step at all. Following a growing trend in the literature, we change the sparsification paradigm by initialising sparse neural network architectures rather than expensively sparsifying already-trained ones. Our novel benchmark models enable a better understanding of the influence of the Update step on model performance and outperform existing simplified benchmark models such as the Simple Graph Convolution. The ExpanderGNNs, and in some cases the Activation-Only models, achieve performance on par with their vanilla counterparts on several downstream tasks while containing significantly fewer trainable parameters. In experiments with matched parameter counts, our benchmark models outperform state-of-the-art GNN models. Our code is publicly available at: https://github.com/ChangminWu/ExpanderGNN.
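To make the three Update-step variants concrete, here is a minimal pure-Python sketch of a single message-passing layer. It is an illustration under our own simplifying assumptions (mean aggregation, ReLU, a binary sparsity mask standing in for an expander structure), not the paper's implementation — that lives in the linked repository. All function names are invented for this example.

```python
def aggregate(adj, feats):
    # Aggregation step: mean over each node's neighbours plus a self-loop.
    out = []
    for i, nbrs in enumerate(adj):
        acc = [0.0] * len(feats[0])
        for j in nbrs + [i]:  # include the node itself
            for k, v in enumerate(feats[j]):
                acc[k] += v
        out.append([v / (len(nbrs) + 1) for v in acc])
    return out

def relu(x):
    return [max(0.0, v) for v in x]

def vanilla_update(h, W):
    # Vanilla Update step: dense linear transform followed by a nonlinearity.
    return [relu([sum(W[r][c] * hi[c] for c in range(len(hi)))
                  for r in range(len(W))]) for hi in h]

def expander_update(h, W, mask):
    # ExpanderGNN-style Update step: same shape, but only the weights
    # selected by the fixed binary mask are ever used (sparse by design,
    # from initialisation onwards -- no post-hoc pruning).
    return [relu([sum(W[r][c] * hi[c] for c in range(len(hi)) if mask[r][c])
                  for r in range(len(W))]) for hi in h]

def activation_only_update(h):
    # Activation-Only Update step: the linear transform is removed entirely;
    # only the nonlinearity remains, so this step has no trainable parameters.
    return [relu(hi) for hi in h]
```

For example, on a two-node graph `adj = [[1], [0]]` one layer of the Activation-Only model is just `activation_only_update(aggregate(adj, feats))`, while the ExpanderGNN variant interpolates between that and the vanilla layer as the mask density varies.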

