Fast and Effective GNN Training with Linearized Random Spanning Trees

06/07/2023
by Francesco Bonchi, et al.

We present a new, effective, and scalable framework for training GNNs on supervised node-classification tasks over graph-structured data. Our approach progressively refines the weight updates on a sequence of path graphs obtained by linearizing random spanning trees extracted from the input network. The path graphs are designed to retain essential topological and node information of the original graph. At the same time, their sparsity enables much lighter GNN training which, besides improving scalability, helps mitigate classical training issues such as over-squashing and over-smoothing. We carry out an extensive experimental investigation on a number of real-world graph benchmarks, applying our framework to graph convolutional networks and showing simultaneous improvements in both training speed and test accuracy over well-known baselines.
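The abstract does not spell out how a spanning tree is linearized, but the core transformation can be sketched concretely: sample a random spanning tree of the input graph, then connect consecutive nodes of a traversal order to form a path graph over the same node set. The sketch below is a minimal illustration under stated assumptions; the DFS-preorder ordering, the `linearized_spanning_tree` helper name, and the use of networkx are our choices for illustration, not the paper's exact construction.

```python
# Minimal sketch (assumptions noted above): sample a random spanning tree
# of G and "linearize" it into a path graph via a DFS-preorder ordering.
import networkx as nx

def linearized_spanning_tree(G, seed=None):
    """Sample a random spanning tree of G and linearize it into a path graph.

    Nodes keep their original identities (and hence their features/labels);
    consecutive nodes in a DFS traversal of the tree are joined by an edge,
    so the result is a path over all n nodes with n - 1 edges.
    """
    tree = nx.random_spanning_tree(G, seed=seed)   # requires networkx >= 2.6
    order = list(nx.dfs_preorder_nodes(tree))      # one plausible node ordering
    path = nx.Graph()
    path.add_nodes_from(order)
    path.add_edges_from(zip(order, order[1:]))
    return path

# Usage: build a sequence of sparse path graphs; a GCN's weight updates can
# then be run on these instead of on the full, denser input graph.
G = nx.karate_club_graph()
paths = [linearized_spanning_tree(G, seed=s) for s in range(5)]
for p in paths:
    assert p.number_of_edges() == G.number_of_nodes() - 1  # path sparsity
```

Because each path graph has only n - 1 edges, message passing over it is substantially cheaper than over the original graph, which is the source of the training-speed gains the abstract claims.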


