Transformer-Based Neural Text Generation with Syntactic Guidance

10/05/2020
by Yinghao Li et al.

We study the problem of using (partial) constituency parse trees as syntactic guidance for controlled text generation. Existing approaches to this problem use recurrent structures, which not only suffer from the long-term dependency problem but also fall short in modeling the tree structure of the syntactic guidance. We propose to leverage the parallelism of the Transformer to better incorporate parse trees. Our method first expands a partial template constituency parse tree into a full-fledged parse tree tailored to the input source text, and then uses the expanded tree to guide text generation. The effectiveness of our model in this process hinges on two new attention mechanisms: 1) a path attention mechanism that forces a node to attend only to the other nodes located on its path in the syntax tree, to better incorporate the syntactic guidance; and 2) a multi-encoder attention mechanism that allows the decoder to dynamically attend to information from multiple encoders. Our experiments on the controlled paraphrasing task show that our method outperforms SOTA models both semantically and syntactically, improving the best baseline's BLEU score from 11.83 to 26.27.
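To make the path attention mechanism concrete, below is a minimal sketch (not the authors' implementation) of how a path-based attention mask over a constituency parse tree could be built and applied. It assumes that a node's "path" means the node itself plus its ancestors and descendants, and the toy tree, parent-array encoding, and function names are all hypothetical illustrations.

import torch

# Toy constituency tree: parent index for each node, -1 for the root.
# Roughly (S (NP (PRP)) (VP (VBP) (NP)))
parents = [-1, 0, 1, 0, 3, 3]            # S, NP, PRP, VP, VBP, NP
labels = ["S", "NP", "PRP", "VP", "VBP", "NP"]

def ancestors(i, parents):
    # Collect all ancestors of node i (excluding i itself).
    out = []
    while parents[i] != -1:
        i = parents[i]
        out.append(i)
    return out

def path_attention_mask(parents):
    # mask[i, j] is True iff j lies on i's root-to-leaf path,
    # i.e. j is i itself, an ancestor of i, or a descendant of i.
    n = len(parents)
    anc = [set(ancestors(i, parents)) | {i} for i in range(n)]
    mask = torch.zeros(n, n, dtype=torch.bool)
    for i in range(n):
        for j in range(n):
            # Allowed if one node is an ancestor (or equal) of the other.
            mask[i, j] = (j in anc[i]) or (i in anc[j])
    return mask

mask = path_attention_mask(parents)

# The mask restricts standard scaled dot-product attention: disallowed
# positions are set to -inf before the softmax, so each tree node only
# attends to nodes on its own path.
scores = torch.randn(len(parents), len(parents))
scores = scores.masked_fill(~mask, float("-inf"))
attn = torch.softmax(scores, dim=-1)

Every row of the mask contains at least the node itself, so the softmax is always well defined; the same masking pattern could be applied per attention head inside a Transformer encoder over the parse-tree nodes.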


