Tree Decomposition Attention for AMR-to-Text Generation

08/27/2021
by Lisa Jin, et al.

Text generation from AMR requires mapping a semantic graph to a string that it annotates. Transformer-based graph encoders, however, poorly capture vertex dependencies that may benefit sequence prediction. To impose order on an encoder, we locally constrain vertex self-attention using a graph's tree decomposition. Instead of forming a full query-key bipartite graph, we restrict attention to vertices in parent, subtree, and same-depth bags of a vertex. This hierarchical context lends both sparsity and structure to vertex state updates. We apply dynamic programming to derive a forest of tree decompositions, choosing the most structurally similar tree to the AMR. Our system outperforms a self-attentive baseline by 1.6 BLEU and 1.8 chrF++.
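The bag-restricted attention described above can be pictured as a boolean mask over vertex pairs: a vertex may attend only to vertices that share a bag with it in its parent bag, its subtree's bags, or a same-depth bag of the tree decomposition. The sketch below is a minimal illustration under assumed inputs (bags as vertex sets, a parent map and a depth value per bag); the names and data layout are hypothetical and not the paper's implementation.

```python
from collections import defaultdict

def subtree_bags(parent):
    """Map each bag index to the set of bag indices in its subtree (inclusive)."""
    children = defaultdict(list)
    for b, p in parent.items():
        if p is not None:
            children[p].append(b)
    memo = {}
    def collect(b):
        if b not in memo:
            memo[b] = {b} | {d for c in children[b] for d in collect(c)}
        return memo[b]
    for b in parent:
        collect(b)
    return memo

def attention_mask(n_vertices, bags, parent, depth):
    """mask[i][j] is True iff vertex i may attend to vertex j.

    For each bag containing vertex i, i may attend to vertices in that
    bag's parent bag, in any bag of its subtree, and in any bag at the
    same depth (the hierarchical context from the abstract).
    """
    sub = subtree_bags(parent)
    vertex_bags = defaultdict(set)          # vertex -> bags containing it
    for b, vs in enumerate(bags):
        for v in vs:
            vertex_bags[v].add(b)
    mask = [[False] * n_vertices for _ in range(n_vertices)]
    for i in range(n_vertices):
        allowed = set()
        for b in vertex_bags[i]:
            if parent[b] is not None:
                allowed |= set(bags[parent[b]])     # parent bag
            for d in sub[b]:
                allowed |= set(bags[d])             # subtree bags
            for b2 in range(len(bags)):
                if depth[b2] == depth[b]:
                    allowed |= set(bags[b2])        # same-depth bags
        for j in allowed:
            mask[i][j] = True
    return mask
```

In a real encoder this mask would be added (as a large negative bias on disallowed pairs) to the query-key scores before the softmax, so updates to each vertex state draw only on its hierarchical neighborhood rather than the full bipartite attention graph.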


Related research:

- Latent Tree Decomposition Parsers for AMR-to-Text Generation (08/27/2021)
- SALSA-TEXT: Self Attentive Latent Space Based Adversarial Text Generation (09/28/2018)
- Structural Neural Encoders for AMR-to-text Generation (03/27/2019)
- Transformer-Based Neural Text Generation with Syntactic Guidance (10/05/2020)
- Dynamic Programming on Bipartite Tree Decompositions (09/14/2023)
- Obstructions for Bounded Shrub-Depth and Rank-Depth (11/01/2019)
- Hierarchical Decompositions of Dihypergraphs (06/21/2020)
