Self-Attention in Colors: Another Take on Encoding Graph Structure in Transformers

04/21/2023
by Romain Menegaux, et al.

We introduce Chromatic Self-Attention (CSA), a novel self-attention mechanism that extends scalar attention scores to attention _filters_, modulating each feature channel independently. We showcase CSA in a fully-attentional graph Transformer, the Chromatic Graph Transformer (CGT), which integrates both graph structural information and edge features, bypassing the need for local message-passing components entirely. Our method flexibly encodes graph structure through node-node interactions by enriching the original edge features with a relative positional encoding scheme. We propose a new scheme based on random walks that encodes both structural and positional information, and we show how to incorporate higher-order topological information, such as rings in molecular graphs. Our approach achieves state-of-the-art results on the ZINC benchmark while providing a flexible framework for encoding graph structure and incorporating higher-order topology.
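The two ideas in the abstract — per-channel attention filters driven by edge features, and a random-walk encoding of graph structure — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the function names, tensor shapes, and the specific filter form (an exponential of a linear map of the edge features) are all assumptions made for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def random_walk_encoding(A, k=4):
    """Structural encoding from random walks (simplified, hypothetical variant).

    A: (n, n) adjacency matrix. Returns (n, k): for each node, the
    probability that a k-step random walk returns to its start node,
    for walk lengths 1..k.
    """
    deg = A.sum(axis=1, keepdims=True)
    P = A / np.maximum(deg, 1)          # row-stochastic transition matrix
    enc, Pk = [], np.eye(A.shape[0])
    for _ in range(k):
        Pk = Pk @ P                     # k-step transition probabilities
        enc.append(np.diag(Pk))         # return probabilities
    return np.stack(enc, axis=-1)

def chromatic_self_attention(X, E, Wq, Wk, Wv, We):
    """One chromatic self-attention layer (sketch).

    X: (n, d) node features; E: (n, n, de) edge / relative positional
    features. A standard scalar attention score is combined with a
    per-channel filter F computed from E, so each of the d channels of
    the values is modulated independently before aggregation.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # (n, n) scalar attention scores
    A = softmax(scores, axis=-1)
    F = np.exp(E @ We)                  # (n, n, d) channel-wise filter
    # out_i = sum_j A_ij * (F_ij ⊙ V_j): channel-wise modulated aggregation
    return np.einsum('ij,ijd,jd->id', A, F, V)
```

With a scalar filter (`F` constant across channels) this reduces to standard edge-biased attention; letting `F` vary per channel is what lets the edge and positional features act as "colors" on the attention pattern.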

