How Attentive are Graph Attention Networks?

05/30/2021
by Shaked Brody, et al.

Graph Attention Networks (GATs) are one of the most popular GNN architectures and are considered the state-of-the-art architecture for representation learning with graphs. In GAT, every node attends to its neighbors given its own representation as the query. However, in this paper we show that GAT can only compute a restricted kind of attention, in which the ranking of attended nodes is unconditioned on the query node. We formally define this restricted kind of attention as static attention and distinguish it from a strictly more expressive dynamic attention. Because GATs use a static attention mechanism, there are simple graph problems that GAT cannot express: in a controlled problem, we show that static attention hinders GAT from even fitting the training data. To remove this limitation, we introduce a simple fix by modifying the order of operations and propose GATv2: a dynamic graph attention variant that is strictly more expressive than GAT. We perform an extensive evaluation and show that GATv2 outperforms GAT across 11 OGB and other benchmarks while matching their parametric costs. Our code is available at https://github.com/tech-srl/how_attentive_are_gats .
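
Since the abstract turns on the difference between GAT's and GATv2's per-edge scoring functions, the following is a minimal sketch contrasting the two, not the authors' reference implementation. The scoring formulas follow the paper (GAT: e(h_i, h_j) = LeakyReLU(a^T [W h_i || W h_j]); GATv2: e(h_i, h_j) = a^T LeakyReLU(W [h_i || h_j])), while the function names, single-head setting, tensor shapes, and toy usage below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def gat_score(h_i, h_j, W, a):
    """GAT (static attention): e(h_i, h_j) = LeakyReLU(a^T [W h_i || W h_j]).
    Since a^T [W h_i || W h_j] splits into a_1^T W h_i + a_2^T W h_j and
    LeakyReLU is monotone, the ranking of neighbors j is the same for every
    query node i."""
    z = torch.cat([W @ h_i, W @ h_j])
    return F.leaky_relu(a @ z, negative_slope=0.2)

def gatv2_score(h_i, h_j, W, a):
    """GATv2 (dynamic attention): e(h_i, h_j) = a^T LeakyReLU(W [h_i || h_j]).
    Applying the nonlinearity between W and a lets the neighbor ranking
    depend on the query node."""
    z = W @ torch.cat([h_i, h_j])
    return a @ F.leaky_relu(z, negative_slope=0.2)

# Toy usage (hypothetical shapes): attention weights of one query node over
# its four neighbors, normalized with a softmax as in both architectures.
d, d_out = 8, 16
h = torch.randn(5, d)                                   # node 0 plus four neighbors
W_v2, a_v2 = torch.randn(d_out, 2 * d), torch.randn(d_out)
scores = torch.stack([gatv2_score(h[0], h[j], W_v2, a_v2) for j in range(1, 5)])
alpha = torch.softmax(scores, dim=0)                    # attention coefficients sum to 1
```

As the sketch suggests, the fix is only a reordering of the same operations, which is why GATv2 can be strictly more expressive than GAT while matching its parametric cost.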

