A Hierarchical Model for Data-to-Text Generation

12/20/2019
by Clément Rebuffel, et al.

Transcribing structured data into natural-language descriptions has emerged as a challenging task, referred to as "data-to-text". These structures generally group together multiple elements along with their attributes. Most approaches rely on encoder-decoder methods borrowed from machine translation, which linearize the elements into a sequence; this, however, discards most of the structure contained in the data. In this work, we propose to overcome this limitation with a hierarchical model that encodes the data structure at both the element level and the structure level. Evaluations on RotoWire show the effectiveness of our model with respect to both qualitative and quantitative metrics.
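To make the two-level idea concrete, here is a minimal sketch (not the paper's implementation) of hierarchical encoding: each entity is a set of attribute-value records, a low-level encoder pools record embeddings into one vector per entity, and a high-level encoder pools entity vectors into a single representation of the whole table. The toy hash-based embeddings and mean pooling are illustrative assumptions only; the paper's model would use learned embeddings and richer encoders.

```python
# Sketch of two-level hierarchical encoding for structured data.
# Assumption: toy deterministic embeddings stand in for learned ones.
import hashlib

DIM = 8  # embedding dimensionality for this toy example

def embed(token: str) -> list[float]:
    """Deterministic toy embedding derived from a hash of the token."""
    digest = hashlib.sha256(token.encode()).digest()
    return [b / 255.0 for b in digest[:DIM]]

def mean(vectors: list[list[float]]) -> list[float]:
    """Average a non-empty list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(DIM)]

def encode_entity(records: dict[str, str]) -> list[float]:
    """Low level: pool embeddings of one entity's attribute-value pairs."""
    return mean([mean([embed(k), embed(v)]) for k, v in records.items()])

def encode_table(entities: list[dict[str, str]]) -> list[float]:
    """High level: pool entity vectors into one structure-level vector."""
    return mean([encode_entity(e) for e in entities])

# Example: two (hypothetical) player entities from a box-score-style table.
table = [
    {"NAME": "Player A", "PTS": "25", "AST": "7"},
    {"NAME": "Player B", "PTS": "18", "REB": "12"},
]
context = encode_table(table)
print(len(context))  # dimensionality of the data representation
```

In a full data-to-text system, a decoder would attend over the entity vectors (and, within an entity, its record embeddings) rather than over one flat linearized sequence, which is what preserves the structure the abstract refers to.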


