NAGphormer: Neighborhood Aggregation Graph Transformer for Node Classification in Large Graphs

06/10/2022
by   Jinsong Chen, et al.

Graph Transformers have demonstrated superiority on various graph learning tasks in recent years. However, the complexity of existing Graph Transformers scales quadratically with the number of nodes, making it hard to scale even to graphs with thousands of nodes. To this end, we propose the Neighborhood Aggregation Graph Transformer (NAGphormer), which is scalable to large graphs with millions of nodes. Before feeding the node features into the Transformer model, NAGphormer constructs tokens for each node via a neighborhood aggregation module called Hop2Token. For each node, Hop2Token aggregates the neighborhood features of each hop into a single representation, thereby producing a sequence of token vectors. This sequence of multi-hop information then serves as the input to the Transformer model. Since each node corresponds to its own token sequence, NAGphormer can be trained in a mini-batch manner and thus scales to large graphs. NAGphormer further employs an attention-based readout function to learn the importance of each hop adaptively. We conduct extensive experiments on popular benchmarks, including six small datasets and three large datasets. The results demonstrate that NAGphormer consistently outperforms existing Graph Transformers and mainstream Graph Neural Networks.
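The Hop2Token idea described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: it propagates node features through a symmetrically normalized adjacency matrix for each hop and stacks the per-hop results into a token sequence per node; the function name `hop2token` and the normalization choice are hypothetical.

```python
import numpy as np

def hop2token(adj, feats, num_hops):
    """Sketch of a Hop2Token-style neighborhood aggregation.

    For hops k = 0..num_hops, propagate features through a
    symmetrically normalized adjacency (with self-loops) and keep
    one token per hop, yielding an array of shape
    (num_nodes, num_hops + 1, feat_dim): one token sequence per node.
    """
    n = adj.shape[0]
    a = adj + np.eye(n)                       # add self-loops
    deg = a.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^{-1/2}
    a_norm = d_inv_sqrt @ a @ d_inv_sqrt      # D^{-1/2} (A + I) D^{-1/2}

    tokens = [feats]                          # hop 0: raw node features
    h = feats
    for _ in range(num_hops):
        h = a_norm @ h                        # aggregate one more hop
        tokens.append(h)
    return np.stack(tokens, axis=1)

# Tiny path graph 0 - 1 - 2 with one-hot features:
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
seq = hop2token(adj, np.eye(3), num_hops=2)
```

Each row of `seq` is the token sequence for one node; in NAGphormer these sequences would be fed independently to a Transformer encoder, which is what makes mini-batch training possible.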

