Latent Tree Decomposition Parsers for AMR-to-Text Generation

08/27/2021
by Lisa Jin, et al.

Graph encoders in AMR-to-text generation models often rely on neighborhood convolutions or global vertex attention. While these approaches apply to general graphs, AMRs may be better served by encoders that target their tree-like structure. A tree decomposition (TD) summarizes graph structure by clustering edges into a hierarchy. Our model encodes a derivation forest of tree decompositions and extracts an expected tree; from the tree node embeddings, it builds graph edge features used in the vertex attention of the graph encoder. Encoding TD forests instead of shortest pairwise paths in a self-attentive baseline raises BLEU by 0.7 and chrF++ by 0.3. The forest encoder also surpasses a convolutional baseline on molecular property prediction by 1.92 ROC-AUC points.
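As a concrete illustration of the preprocessing step the abstract describes, the sketch below computes a tree decomposition of a toy AMR-like graph using NetworkX's min-degree heuristic and derives a simple per-edge feature from the rooted decomposition (the depth of the shallowest bag covering each edge). This is a minimal sketch under stated assumptions: the toy graph, the heuristic decomposition, and the `edge_feature` helper are illustrative, whereas the paper's model encodes a whole derivation forest of decompositions and derives edge features from learned tree node embeddings.

```python
import networkx as nx
from networkx.algorithms.approximation import treewidth_min_degree

# Small AMR-like graph (illustrative): vertices are concept nodes,
# edges are semantic relations, treated as undirected for decomposition.
G = nx.Graph()
G.add_edges_from([
    ("want-01", "boy"), ("want-01", "go-02"),
    ("go-02", "boy"), ("go-02", "city"),
])

# Heuristic tree decomposition: each tree node ("bag") is a frozenset of
# graph vertices, and every graph edge is contained in at least one bag.
width, td = treewidth_min_degree(G)
print("treewidth (heuristic upper bound):", width)

# Root the decomposition at an arbitrary bag and record each bag's depth.
root = next(iter(td.nodes))
depth = nx.shortest_path_length(td, root)

# One simple structural edge feature: depth of the shallowest bag that
# covers the edge. (The paper instead builds edge features from learned
# tree node embeddings; this scalar stands in for that idea.)
def edge_feature(u, v):
    bags = [b for b in td.nodes if u in b and v in b]
    return min(depth[b] for b in bags)

for u, v in G.edges:
    print(f"({u}, {v}) -> bag depth {edge_feature(u, v)}")
```

In a full model, such per-edge features would bias the vertex attention of the graph encoder, playing the role that shortest pairwise path features play in the self-attentive baseline.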


Related research

- Tree Decomposition Attention for AMR-to-Text Generation (08/27/2021). Text generation from AMR requires mapping a semantic graph to a string t...
- Modeling Graph Structure via Relative Position for Better Text Generation from Knowledge Graphs (06/16/2020). We present a novel encoder-decoder architecture for graph-to-text genera...
- TD-GEN: Graph Generation With Tree Decomposition (06/20/2021). We propose TD-GEN, a graph generation framework based on tree decomposit...
- Structural Neural Encoders for AMR-to-text Generation (03/27/2019). AMR-to-text generation is a problem recently introduced to the NLP commu...
- Transformer-Based Neural Text Generation with Syntactic Guidance (10/05/2020). We study the problem of using (partial) constituency parse trees as synt...
- Graph-based Deep-Tree Recursive Neural Network (DTRNN) for Text Classification (09/04/2018). A novel graph-to-tree conversion mechanism called the deep-tree generati...
