AGFormer: Efficient Graph Representation with Anchor-Graph Transformer

05/12/2023
by   Bo Jiang, et al.

To alleviate the limited receptive field of GCNs, Transformers have been exploited to capture long-range dependencies among nodes for graph data representation and learning. However, existing graph Transformers generally employ a regular self-attention module for all node-to-node message passing, which requires learning the affinities/relationships between all node pairs and thus incurs high computational cost. They are also usually sensitive to graph noise. To overcome these issues, we propose a novel graph Transformer architecture, termed Anchor Graph Transformer (AGFormer), which leverages an anchor graph model. Specifically, AGFormer first obtains a set of representative anchors and then converts node-to-node message passing into an anchor-to-anchor and anchor-to-node message passing process. Thus, AGFormer is much more efficient and robust than regular node-to-node Transformers. Extensive experiments on several benchmark datasets demonstrate the effectiveness and benefits of the proposed AGFormer.
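The abstract's two-stage scheme (node-to-anchor, anchor-to-anchor, then anchor-to-node message passing) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the anchor selection here is simple random sampling, and all function and variable names are assumptions for the sketch. With m anchors and n nodes, each attention product costs O(nm) instead of the O(n^2) of full node-to-node self-attention.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def anchor_attention(X, m, rng=None):
    """Anchor-based attention sketch: nodes -> anchors -> nodes.

    X: (n, d) node feature matrix; m: number of anchors (m << n).
    Anchors are picked by random sampling purely for illustration;
    the paper's anchor selection strategy may differ.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = X.shape
    idx = rng.choice(n, size=m, replace=False)
    A = X[idx]                                # (m, d) initial anchor features

    # Node-to-anchor aggregation: each anchor attends over all nodes, O(nm).
    A = softmax(A @ X.T / np.sqrt(d)) @ X     # (m, d)

    # Anchor-to-anchor message passing among the m anchors, O(m^2).
    A = softmax(A @ A.T / np.sqrt(d)) @ A     # (m, d)

    # Anchor-to-node propagation: each node attends over the anchors, O(nm).
    return softmax(X @ A.T / np.sqrt(d)) @ A  # (n, d)
```

Because every node only exchanges messages with the small anchor set, noisy pairwise node affinities are never computed directly, which is consistent with the robustness claim above.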


research
02/22/2023

HINormer: Representation Learning On Heterogeneous Information Networks with Graph Transformer

Recent studies have highlighted the limitations of message-passing based...
research
05/30/2023

AMatFormer: Efficient Feature Matching via Anchor Matching Transformer

Learning based feature matching methods have been commonly studied in re...
research
05/27/2023

Graph Inductive Biases in Transformers without Message Passing

Transformers for graph data are increasingly widely studied and successf...
research
04/27/2022

GTNet: A Tree-Based Deep Graph Learning Architecture

We propose Graph Tree Networks (GTNets), a deep graph learning architect...
research
09/08/2023

Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning

Real-world graphs naturally exhibit hierarchical or cyclical structures ...
research
09/01/2023

Where Did the Gap Go? Reassessing the Long-Range Graph Benchmark

The recent Long-Range Graph Benchmark (LRGB, Dwivedi et al. 2022) introd...
research
03/25/2022

Lightweight Graph Convolutional Networks with Topologically Consistent Magnitude Pruning

Graph convolution networks (GCNs) are currently mainstream in learning w...
