Neural data-to-text generation: A comparison between pipeline and end-to-end architectures

08/23/2019
by Thiago Castro Ferreira, et al.

Traditionally, most data-to-text applications have been designed using a modular pipeline architecture, in which non-linguistic input data is converted into natural language through several intermediate transformations. In contrast, recent neural models for data-to-text generation have been proposed as end-to-end approaches, where the non-linguistic input is rendered in natural language with far fewer explicit intermediate representations in between. This study introduces a systematic comparison between neural pipeline and end-to-end data-to-text approaches for the generation of text from RDF triples. Both architectures were implemented using state-of-the-art deep learning methods, such as encoder-decoder models with Gated Recurrent Units (GRU) and the Transformer. Automatic and human evaluations, together with a qualitative analysis, suggest that having explicit intermediate steps in the generation process results in better texts than those generated by end-to-end approaches. Moreover, the pipeline models generalize better to unseen inputs. Data and code are publicly available.
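To make the setup concrete: end-to-end neural approaches of this kind typically flatten the input RDF triples into a single token sequence before feeding them to the encoder. The sketch below illustrates that linearization step; the tag names and helper function are illustrative assumptions, not the paper's actual code.

```python
# Hypothetical sketch of RDF triple linearization for an end-to-end
# encoder-decoder model. Tag names (<subject>, etc.) are assumptions.

def linearize(triples):
    """Flatten (subject, predicate, object) triples into one token list,
    marking each slot with a special tag so the encoder can tell roles apart."""
    tokens = []
    for subj, pred, obj in triples:
        tokens += ["<subject>", subj, "<predicate>", pred, "<object>", obj]
    return tokens

triples = [("Alan_Bean", "occupation", "Test_pilot"),
           ("Alan_Bean", "birthPlace", "Wheeler,_Texas")]
print(" ".join(linearize(triples)))
```

A pipeline architecture, by contrast, would pass such input through explicit intermediate stages (e.g. discourse ordering, lexicalization, referring expression generation) rather than mapping the flat sequence directly to text.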


Related research

10/08/2022 - Comparing Computational Architectures for Automated Journalism
The majority of NLG systems have been designed following either a templa...

04/08/2020 - Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity
End-to-end neural data-to-text (D2T) generation has recently emerged as ...

03/03/2022 - Deep Latent-Variable Models for Text Generation
Text generation aims to produce human-like natural language output for d...

09/07/2020 - Adversarial Watermarking Transformer: Towards Tracing Text Provenance with Data Hiding
Recent advances in natural language generation have introduced powerful ...

07/31/2020 - An Empirical Study on Explainable Prediction of Text Complexity: Preliminaries for Text Simplification
Text simplification is concerned with reducing the language complexity a...

05/21/2018 - NeuralREG: An end-to-end approach to referring expression generation
Traditionally, Referring Expression Generation (REG) models first decide...

07/16/1999 - Mixing representation levels: The hybrid approach to automatic text generation
Natural language generation systems (NLG) map non-linguistic representat...
