Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification

06/15/2020
by   Rakshith S Srinivasa, et al.

The attention mechanism has demonstrated superior performance for inference over nodes in graph neural networks (GNNs); however, it imposes a high computational burden during both training and inference. We propose FastGAT, a method that makes attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph. This yields a per-epoch time that is almost linear in the number of graph nodes, as opposed to quadratic. Further, we provide a reformulation of a specific attention-based GNN, the Graph Attention Network (GAT), that interprets it as a graph convolution method using the random-walk normalized graph Laplacian. Within this framework, we theoretically prove that spectral sparsification preserves the features computed by the GAT model, thereby justifying our FastGAT algorithm. We experimentally evaluate FastGAT on several large real-world graph datasets for node classification tasks; FastGAT dramatically reduces (by up to 10x) the computational time and memory requirements, enabling the use of attention-based GNNs on large graphs.
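The sparsification step the abstract refers to can be illustrated with a classic effective-resistance sampling scheme in the style of Spielman-Srivastava, which is the standard spectral-sparsification primitive: sample edges with probability proportional to weight times effective resistance, then reweight the kept edges so the sparsifier is unbiased. The sketch below is an illustrative assumption about the pruning step, not the paper's exact implementation; function names, the dense-pseudoinverse computation, and the sample count `q` are all ours.

```python
import numpy as np

def effective_resistances(edges, n):
    """Effective resistance R_e of each edge, via the Laplacian pseudoinverse.

    R_uv = (e_u - e_v)^T L^+ (e_u - e_v). The dense pinv is only viable for
    small graphs; large-scale methods approximate this step.
    """
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    Lp = np.linalg.pinv(L)
    return [Lp[u, u] + Lp[v, v] - 2 * Lp[u, v] for u, v, _ in edges]

def sparsify(edges, n, q, seed=0):
    """Sample q edges with probability ∝ w_e * R_e, reweighting to stay unbiased."""
    rng = np.random.default_rng(seed)
    R = effective_resistances(edges, n)
    p = np.array([w * r for (u, v, w), r in zip(edges, R)])
    p /= p.sum()
    idx = rng.choice(len(edges), size=q, p=p)
    out = {}
    for i in idx:
        u, v, w = edges[i]
        # each sampled copy contributes w / (q * p_i) to the kept edge's weight
        out[(u, v)] = out.get((u, v), 0.0) + w / (q * p[i])
    return [(u, v, w) for (u, v), w in out.items()]

# Toy graph: a 4-cycle plus one chord.
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 1.0)]
sparse = sparsify(edges, n=4, q=4)
print(len(sparse))  # at most q distinct edges survive
```

A useful sanity check on the resistances is Foster's theorem: for a connected graph on `n` nodes, the weighted resistances `w_e * R_e` sum to exactly `n - 1`. In the attention setting, the sampled graph replaces the dense input graph before attention coefficients are computed, which is where the per-epoch cost drops from quadratic to near-linear in the number of nodes.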


