R2D2: Relational Text Decoding with Transformers

05/10/2021
by Aryan Arbabi, et al.

We propose a novel framework for modeling the interaction between graphical structures and the natural language text associated with their nodes and edges. Existing approaches typically fall into two categories. One group ignores the relational structure by converting graphs into linear sequences and then applying highly successful Seq2Seq models. The other ignores the sequential nature of the text by representing text segments as fixed-dimensional vectors and applying graph neural networks. Both simplifications lead to information loss. Our proposed method exploits both the graphical structure and the sequential nature of the texts. The input to our model is a set of text segments associated with the nodes and edges of the graph; these segments are processed by a transformer encoder-decoder equipped with a self-attention mechanism that is aware of the graphical relations between the nodes containing the segments. This design also lets us use BERT-like models that are already pretrained on large amounts of text. While the proposed model has wide applications, we demonstrate its capabilities on data-to-text generation tasks. Our approach compares favorably against state-of-the-art methods in four tasks without tailoring the model architecture. We also provide an early demonstration of a novel practical application: generating clinical notes from the medical entities mentioned during clinical visits.
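To make the key idea concrete, here is a minimal sketch of relation-aware self-attention over node-associated text segments. It is not the paper's implementation: the function names, the token-to-node assignment scheme, and the choice to restrict attention to tokens in the same node or in edge-connected nodes are all our own illustrative assumptions.

```python
import numpy as np

def relation_aware_mask(token_nodes, edges):
    """Boolean attention mask: token i may attend to token j iff the two
    tokens' segments belong to the same graph node, or their nodes are
    linked by an (undirected) edge. Illustrative sketch only.

    token_nodes[i] -- the node whose text segment contains token i
    edges          -- iterable of (u, v) node pairs
    """
    allowed = {frozenset(e) for e in edges}
    n = len(token_nodes)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            u, v = token_nodes[i], token_nodes[j]
            mask[i, j] = (u == v) or (frozenset((u, v)) in allowed)
    return mask

def masked_self_attention(x, mask):
    """Single-head scaled dot-product self-attention, with disallowed
    token pairs blocked by a large negative score before the softmax."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)  # block cross-graph leakage
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x
```

With this mask, tokens in one node's segment can exchange information with another node's tokens only when the graph connects the two nodes, which is one simple way a transformer can respect relational structure without flattening the graph into a single sequence.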


