Neural Data-to-Text Generation via Jointly Learning the Segmentation and Correspondence

05/03/2020
by   Xiaoyu Shen, et al.

The neural attention model has achieved great success in data-to-text generation tasks. Although it usually excels at producing fluent text, it suffers from missing information, repetition, and "hallucination". Due to the black-box nature of the neural attention architecture, avoiding these problems in a systematic way is non-trivial. To address this concern, we propose to explicitly segment the target text into fragment units and align them with their corresponding data records. The segmentation and correspondence are jointly learned as latent variables without any human annotations. We further impose a soft statistical constraint to regularize the segmental granularity. The resulting architecture maintains the same expressive power as neural attention models while generating fully interpretable outputs at several times lower computational cost. On both the E2E and WebNLG benchmarks, we show that the proposed model consistently outperforms its neural attention counterparts.
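To make the core idea concrete, the following toy sketch (not the paper's implementation, where both segmentation and alignment are latent variables learned jointly without supervision) illustrates what a segment-to-record correspondence looks like on an E2E-style input: the output text is split into contiguous fragments, and each fragment is assigned to one source record. The word-overlap heuristic and the `align_segments` helper are purely illustrative assumptions.

```python
# Illustrative sketch: align each text fragment to the data record it
# realizes. The paper learns segmentation and alignment jointly as latent
# variables; here we take a fixed segmentation and use a simple word-overlap
# heuristic as a stand-in aligner.

def align_segments(records, segments):
    """Assign each segment to the record sharing the most words.

    records:  list of (field, value) pairs from the structured input
    segments: list of text fragments covering the output sentence
    Returns a list of (segment, field) pairs.
    """
    alignment = []
    for seg in segments:
        seg_words = set(seg.lower().split())
        # Score each record by overlap with its field name and value words.
        best_field, _ = max(
            records,
            key=lambda r: len(seg_words & set((r[0] + " " + r[1]).lower().split())),
        )
        alignment.append((seg, best_field))
    return alignment

# E2E-style example: a restaurant description aligned to its records.
records = [("name", "The Eagle"), ("food", "French"), ("area", "riverside")]
segments = ["The Eagle", "serves French food", "in the riverside area"]
print(align_segments(records, segments))
# → [('The Eagle', 'name'), ('serves French food', 'food'),
#    ('in the riverside area', 'area')]
```

Because every fragment carries an explicit record label, the generation trace is fully interpretable: one can read off which record produced which span, which is what lets the model avoid unsupported ("hallucinated") or repeated content in a systematic way.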


