Pure Transformers are Powerful Graph Learners

07/06/2022
by Jinwoo Kim, et al.

We show that standard Transformers without graph-specific modifications can lead to promising results in graph learning, both in theory and in practice. Given a graph, we simply treat all nodes and edges as independent tokens, augment them with token embeddings, and feed them to a Transformer. With an appropriate choice of token embeddings, we prove that this approach is theoretically at least as expressive as an invariant graph network (2-IGN) composed of equivariant linear layers, which is already more expressive than all message-passing Graph Neural Networks (GNNs). When trained on a large-scale graph dataset (PCQM4Mv2), our method, coined Tokenized Graph Transformer (TokenGT), achieves significantly better results than GNN baselines and competitive results against Transformer variants with sophisticated graph-specific inductive biases. Our implementation is available at https://github.com/jw9730/tokengt.
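The tokenization the abstract describes is simple enough to sketch directly. Below is a minimal, illustrative PyTorch sketch of the idea: every node and every edge becomes one token, each token is augmented with orthonormal node identifiers (the paper proposes orthogonal random features or Laplacian eigenvectors) plus a type embedding, and the sequence is fed to an off-the-shelf Transformer encoder. The class name, dimensions, shared node/edge feature width, and the QR-based choice of identifiers here are assumptions for illustration, not the authors' reference implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class TokenizedGraphTransformer(nn.Module):
    """Sketch: nodes and edges as independent tokens into a plain Transformer."""
    def __init__(self, feat_dim, id_dim, d_model=128, nhead=8, num_layers=4):
        super().__init__()
        # Project [features | identifier of endpoint u | identifier of endpoint v].
        self.proj = nn.Linear(feat_dim + 2 * id_dim, d_model)
        self.type_emb = nn.Embedding(2, d_model)  # 0 = node token, 1 = edge token
        self.graph_tok = nn.Parameter(torch.zeros(1, 1, d_model))  # readout token
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.id_dim = id_dim

    def forward(self, x, e, edge_index):
        # x: (n, feat_dim) node features, e: (m, feat_dim) edge features,
        # edge_index: (2, m) endpoint indices. Assumes n >= id_dim.
        n, m = x.size(0), e.size(0)
        # Orthonormal node identifiers; here, rows of a random orthogonal basis
        # (the paper also suggests Laplacian eigenvectors).
        p, _ = torch.linalg.qr(torch.randn(n, self.id_dim))
        node_tok = torch.cat([x, p, p], dim=-1)  # node v carries [P_v, P_v]
        edge_tok = torch.cat([e, p[edge_index[0]], p[edge_index[1]]], dim=-1)
        tokens = self.proj(torch.cat([node_tok, edge_tok], dim=0))
        types = torch.cat([torch.zeros(n, dtype=torch.long),
                           torch.ones(m, dtype=torch.long)])
        tokens = (tokens + self.type_emb(types)).unsqueeze(0)  # one-graph batch
        tokens = torch.cat([self.graph_tok, tokens], dim=1)
        return self.encoder(tokens)[:, 0]  # graph-level representation

# Toy usage: a random graph with 10 nodes and 20 edges.
model = TokenizedGraphTransformer(feat_dim=16, id_dim=8)
x, e = torch.randn(10, 16), torch.randn(20, 16)
edge_index = torch.randint(0, 10, (2, 20))
print(model(x, e, edge_index).shape)  # torch.Size([1, 128])
```

Note the design point the abstract emphasizes: nothing in the encoder itself is graph-specific; all graph structure enters only through the token embeddings.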

Related research

06/11/2023
Local-to-global Perspectives on Graph Neural Networks
This thesis presents a local-to-global perspective on graph neural netwo...

03/10/2023
Exphormer: Sparse Transformers for Graphs
Graph transformers have emerged as a promising architecture for a variet...

11/05/2022
Inductive Graph Transformer for Delivery Time Estimation
Providing accurate estimated time of package delivery on users' purchasi...

10/27/2021
Transformers Generalize DeepSets and Can be Extended to Graphs and Hypergraphs
We present a generalization of Transformers to any-order permutation inv...

06/09/2021
Do Transformers Really Perform Bad for Graph Representation?
The Transformer architecture has become a dominant choice in many domain...

11/26/2022
PatchGT: Transformer over Non-trainable Clusters for Learning Graph Representations
Recently the Transformer structure has shown good performances in graph ...

06/05/2023
Towards Arbitrarily Expressive GNNs in O(n^2) Space by Rethinking Folklore Weisfeiler-Lehman
Message passing neural networks (MPNNs) have emerged as the most popular...
