Tokenized Graph Transformer with Neighborhood Augmentation for Node Classification in Large Graphs

05/22/2023
by   Jinsong Chen, et al.

Graph Transformers, an emerging architecture for graph representation learning, suffer from quadratic complexity in the number of nodes when handling large graphs. To address this, we propose the Neighborhood Aggregation Graph Transformer (NAGphormer), which treats each node as a sequence of tokens constructed by our proposed Hop2Token module. For each node, Hop2Token aggregates the neighborhood features from different hops into distinct representations, producing a sequence of token vectors as a single input. In this way, NAGphormer can be trained in a mini-batch manner and thus scales to large graphs. Moreover, we show mathematically that, compared to a category of advanced Graph Neural Networks (GNNs) known as decoupled Graph Convolutional Networks, NAGphormer can learn more informative node representations from multi-hop neighborhoods. In addition, we propose a new data augmentation method, Neighborhood Augmentation (NrAug), based on the output of Hop2Token; it simultaneously augments the neighborhood features from both global and local views to strengthen the training of NAGphormer. Extensive experiments on benchmark datasets ranging from small to large demonstrate the superiority of NAGphormer over existing graph Transformers and mainstream GNNs, and the effectiveness of NrAug in further boosting NAGphormer.
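The core idea of Hop2Token, as described above, is to turn each node's multi-hop neighborhood into a short token sequence, so the Transformer can consume nodes independently in mini-batches. The sketch below is an illustrative simplification, not the paper's implementation: it uses a dense NumPy adjacency matrix and a symmetrically normalized propagation step, and the function name `hop2token` is borrowed from the module name for clarity.

```python
import numpy as np

def hop2token(adj, features, num_hops):
    """Illustrative sketch of the Hop2Token idea: for each node v,
    build a token sequence [X_v, (AX)_v, (A^2 X)_v, ...] from its
    aggregated k-hop neighborhood features.

    adj: (n, n) dense adjacency matrix (simplification; large graphs
         would use a sparse matrix).
    features: (n, d) node feature matrix.
    Returns: (n, num_hops + 1, d) array, one token sequence per node.
    """
    n = adj.shape[0]
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    a_hat = adj + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    tokens = [features]        # hop 0: the node's own features
    h = features
    for _ in range(num_hops):
        h = a_norm @ h         # aggregate one hop further out
        tokens.append(h)

    # Stacking per node decouples nodes from each other, which is
    # what permits mini-batch training over nodes.
    return np.stack(tokens, axis=1)
```

Each row of the output is an independent sequence, so a standard Transformer encoder can attend over the hop tokens of one node without needing the rest of the graph in the batch.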

