Copy mechanism and tailored training for character-based data-to-text generation

04/26/2019
by   Marco Roberti, et al.

In the last few years, many methods have focused on deep recurrent neural networks for natural language generation. The most widely used sequence-to-sequence neural methods are word-based: as such, they need a pre-processing step called delexicalization (and a corresponding post-processing step, relexicalization) to deal with uncommon or unknown words. These processing steps, however, give rise to models that depend on the vocabulary used and are not completely neural. In this work, we present an end-to-end sequence-to-sequence model with an attention mechanism that reads and generates at the character level, requiring neither delexicalization, tokenization, nor even lowercasing. Moreover, since characters constitute the common "building blocks" of every text, it also allows a more general approach to text generation, opening the possibility of exploiting transfer learning during training. These capabilities stem from two major features: (i) the ability to alternate between the standard generation mechanism and a copy mechanism, which allows input facts to be copied directly into the output, and (ii) an original training pipeline that further improves the quality of the generated texts. We also introduce a new dataset called E2E+, a modified version of the well-known E2E dataset used in the E2E Challenge, designed to highlight the copying capabilities of character-based models. We evaluated our model according to five broadly accepted metrics (including the widely used BLEU), showing that it yields competitive performance with respect to both character-based and word-based approaches.
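The alternation between generating and copying described above is commonly realized in pointer-generator style: a gating probability interpolates the decoder's softmax over the character vocabulary with a copy distribution obtained by scattering the attention weights onto the characters present in the input. The sketch below illustrates that idea under these assumptions; the function name and shapes are illustrative, not the authors' exact formulation.

```python
import numpy as np

def copy_generate_distribution(vocab_logits, attention, input_char_ids, p_gen):
    """Mix a generation distribution with an attention-induced copy
    distribution (pointer-generator style sketch, not the paper's model).

    vocab_logits:   decoder scores over the character vocabulary, shape (V,)
    attention:      attention weights over input positions, shape (T,), sums to 1
    input_char_ids: vocabulary id of the character at each input position, len T
    p_gen:          probability of generating (vs. copying), scalar in [0, 1]
    """
    # Generation distribution: numerically stable softmax over the vocabulary.
    exp = np.exp(vocab_logits - vocab_logits.max())
    p_vocab = exp / exp.sum()

    # Copy distribution: accumulate attention mass onto the characters
    # that actually occur in the input sequence.
    p_copy = np.zeros_like(p_vocab)
    for weight, char_id in zip(attention, input_char_ids):
        p_copy[char_id] += weight

    # Final output distribution interpolates generation and copying.
    return p_gen * p_vocab + (1.0 - p_gen) * p_copy
```

With a 5-character vocabulary, uniform logits, attention `[0.7, 0.3]` over an input whose characters have ids `[3, 1]`, and `p_gen = 0.5`, the character attended to most strongly (id 3) receives probability 0.5 · 0.2 + 0.5 · 0.7 = 0.45, while the result remains a valid distribution. This is how rare input tokens (e.g. restaurant names in E2E) can be reproduced verbatim even when the generator assigns them little probability.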

