Graph-Aware Transformer: Is Attention All Graphs Need?

06/09/2020
by Sanghyun Yoo, et al.

Graphs are the natural data structure for representing relational and structural information in many domains. To cover the broad range of graph applications, from graph classification to graph generation, it is desirable to have a general and flexible encoder-decoder model that can handle graph data. Although the representative encoder-decoder model, the Transformer, shows superior performance on various tasks, especially in natural language processing, it is not immediately applicable to graphs because of their non-sequential structure. To address this incompatibility, we propose the GRaph-Aware Transformer (GRAT), the first Transformer-based model that can encode and decode whole graphs in an end-to-end fashion. GRAT features a self-attention mechanism that adapts to edge information and an auto-regressive decoding mechanism based on a two-path approach, consisting of a sub-graph encoding path and a node-and-edge generation path at each decoding step. We empirically evaluated GRAT on multiple setups, including encoder-based tasks such as molecular property prediction on the QM9 dataset and encoder-decoder tasks such as molecular graph generation in the organic molecule synthesis domain. GRAT shows very promising results, including state-of-the-art performance on four regression tasks in the QM9 benchmark.
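To make the edge-adaptive self-attention idea concrete, here is a minimal NumPy sketch of one way such a mechanism can work: attention logits between node pairs are biased by a learned projection of the corresponding edge features. This is an illustrative assumption, not the paper's exact formulation; the additive bias matrix `We` and all shapes are hypothetical.

```python
import numpy as np

def edge_aware_attention(X, E, Wq, Wk, Wv, We):
    """Single-head self-attention whose logits are biased by edge features.

    X:  (n, d)     node features
    E:  (n, n, de) edge features (zeros where no edge exists)
    Wq, Wk, Wv:    (d, d) query/key/value projections
    We: (de, 1)    maps each edge feature vector to a scalar logit bias
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    # Standard scaled dot-product scores between all node pairs.
    logits = Q @ K.T / np.sqrt(d)                 # (n, n)
    # Edge-dependent additive bias: attention becomes graph-aware.
    logits = logits + (E @ We).squeeze(-1)        # (n, n)
    # Numerically stable row-wise softmax.
    logits = logits - logits.max(axis=-1, keepdims=True)
    A = np.exp(logits)
    A = A / A.sum(axis=-1, keepdims=True)
    return A @ V                                  # (n, d) updated node features
```

Because the bias enters the logits additively, pairs connected by "strong" edges can attend to each other more, while the mechanism degrades gracefully to plain self-attention when all edge features are zero.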

