A Generalization of Transformer Networks to Graphs

by   Vijay Prakash Dwivedi, et al.

We propose a generalization of the transformer neural network architecture to arbitrary graphs. The original transformer was designed for Natural Language Processing (NLP) and operates on fully connected graphs representing all connections between the words in a sequence. Such an architecture does not leverage the graph-connectivity inductive bias and can perform poorly when the graph topology is important and has not been encoded into the node features. We introduce a graph transformer with four new properties compared to the standard model. First, the attention mechanism is a function of the neighborhood connectivity for each node in the graph. Second, the positional encoding is represented by the Laplacian eigenvectors, which naturally generalize the sinusoidal positional encodings often used in NLP. Third, layer normalization is replaced by batch normalization, which provides faster training and better generalization performance. Finally, the architecture is extended to edge feature representations, which can be critical for tasks such as chemistry (bond type) or link prediction (entity relationships in knowledge graphs). Numerical experiments on a graph benchmark demonstrate the performance of the proposed graph transformer architecture. This work closes the gap between the original transformer, which was designed for the limited case of line graphs, and graph neural networks, which can work with arbitrary graphs. As our architecture is simple and generic, we believe it can be used as a black box for future applications that wish to combine transformers and graphs.
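To make the second property concrete, here is a minimal sketch of how Laplacian positional encodings can be computed for a small graph: the k smallest non-trivial eigenvectors of the symmetric normalized graph Laplacian serve as node positions. This is an illustrative reimplementation of the idea described in the abstract, not the authors' released code; the function name and the toy adjacency matrix are assumptions for the example.

```python
import numpy as np

def laplacian_positional_encoding(adj, k):
    """Return the k smallest non-trivial eigenvectors of the symmetric
    normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2} as node
    positional encodings (one k-dimensional vector per node)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    lap = np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt
    # eigh returns eigenvalues in ascending order for symmetric matrices.
    eigvals, eigvecs = np.linalg.eigh(lap)
    # Skip the first eigenvector (eigenvalue 0, constant on connected graphs).
    return eigvecs[:, 1:k + 1]

# Toy example: a 4-node cycle graph.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
pe = laplacian_positional_encoding(adj, k=2)
print(pe.shape)  # one 2-dimensional positional vector per node: (4, 2)
```

Note that eigenvectors are defined only up to sign, so in practice the sign of each encoding is often randomly flipped during training, as the abstract's analogy to sinusoidal encodings suggests a learned invariance is needed.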


Graph Neural Networks with Learnable Structural and Positional Representations

Graph neural networks (GNNs) have become the standard learning architect...

Relphormer: Relational Graph Transformer for Knowledge Graph Representation

Transformers have achieved remarkable performance in widespread fields, ...

An Augmented Transformer Architecture for Natural Language Generation Tasks

The Transformer based neural networks have been showing significant adva...

Unleashing the Power of Transformer for Graphs

Despite recent successes in natural language processing and computer vis...

A Graph VAE and Graph Transformer Approach to Generating Molecular Graphs

We propose a combination of a variational autoencoder and a transformer ...

NAGphormer: Neighborhood Aggregation Graph Transformer for Node Classification in Large Graphs

Graph Transformers have demonstrated superiority on various graph learni...

Updater-Extractor Architecture for Inductive World State Representations

Developing NLP models traditionally involves two stages - training and a...

Code Repositories


Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.



copy of graphtranformer
