Keeping Notes: Conditional Natural Language Generation with a Scratchpad Mechanism

06/12/2019
by Ryan Y. Benmalek et al.

We introduce the Scratchpad Mechanism, a novel addition to the sequence-to-sequence (seq2seq) neural network architecture, and demonstrate its effectiveness in improving the overall fluency of seq2seq models for natural language generation tasks. By enabling the decoder at each time step to write to all of the encoder output layers, Scratchpad can employ the encoder as a "scratchpad" memory to keep track of what has been generated so far and thereby guide future generation. We evaluate Scratchpad in the context of three well-studied natural language generation tasks --- Machine Translation, Question Generation, and Text Summarization --- and obtain state-of-the-art or comparable performance on standard datasets for each task. Qualitative assessments in the form of human judgements (question generation), attention visualization (MT), and sample output (summarization) provide further evidence of the ability of Scratchpad to generate fluent and expressive output.
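The core idea of the abstract, letting the decoder overwrite the encoder outputs at each step, can be sketched as a gated write into the encoder memory. The parameterization below (a per-cell sigmoid gate `W_u` and a write-content projection `W_s`, combined as a convex mixture) is an illustrative assumption for exposition, not the paper's exact equations:

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def scratchpad_update(enc_states, dec_state, W_u, W_s):
    """One hypothetical Scratchpad write step.

    enc_states: (T, d) encoder outputs, treated as scratchpad memory cells
    dec_state:  (d,)   current decoder hidden state
    W_u:        (2d, 1) gate parameters (assumed form)
    W_s:        (d, d)  write-content parameters (assumed form)
    """
    T, d = enc_states.shape
    # Per-cell write gate: how much each memory cell i should be overwritten,
    # conditioned on that cell's current value and the decoder state.
    cat = np.concatenate([enc_states, np.tile(dec_state, (T, 1))], axis=1)  # (T, 2d)
    u = sigmoid(cat @ W_u)                                                  # (T, 1)
    # Global write content derived from the decoder state.
    s = np.tanh(dec_state @ W_s)                                            # (d,)
    # Convex combination: gated overwrite of each encoder cell.
    return u * s + (1.0 - u) * enc_states                                   # (T, d)
```

Because the update is a convex mixture, cells the decoder "cares about" are pushed toward the new write content while others are left mostly intact, which is one plausible way the memory can record what has already been generated.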


