Unsupervised Natural Language Generation with Denoising Autoencoders

04/21/2018
by Markus Freitag, et al.

Generating text from structured data is important for various tasks such as question answering and dialog systems. We show that, in at least one domain, without any supervision and based only on unlabeled text, we are able to build a Natural Language Generation (NLG) system with higher performance than supervised approaches. In our approach, we interpret the structured data as a corrupt representation of the desired output and use a denoising auto-encoder to reconstruct the sentence. We show how to introduce noise into training examples that do not contain structured data, and that the resulting denoising auto-encoder generalizes to generate correct sentences when given structured data.
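The core idea — treating structured data as a "noisy" version of the target sentence — implies a corruption function applied to plain sentences at training time. The abstract does not specify the noise model, so the sketch below is a hypothetical illustration: it randomly drops tokens (so the survivors resemble the bare content fields of a structured record) and shuffles what remains. The function name `corrupt` and all parameters are assumptions, not the authors' actual procedure.

```python
import random

def corrupt(tokens, drop_prob=0.5, shuffle=True, rng=None):
    """Corrupt a token sequence so it resembles structured data:
    randomly drop tokens and optionally permute the survivors.
    The denoising auto-encoder is then trained to reconstruct
    the original sentence from this corrupted input."""
    rng = rng or random.Random()
    kept = [t for t in tokens if rng.random() > drop_prob]
    if not kept:
        # Never emit an empty input; keep at least one token.
        kept = [rng.choice(tokens)]
    if shuffle:
        rng.shuffle(kept)
    return kept

sentence = "the flight departs from Boston at 9 am".split()
noisy = corrupt(sentence, rng=random.Random(0))
# `noisy` is a shuffled subset of the sentence's tokens,
# e.g. content words like "Boston" and "9" with function
# words dropped — superficially similar to a data record.
```

At inference time, the trained model is fed the actual structured data in place of a corrupted sentence; the claim of the paper is that the denoiser generalizes from synthetic corruptions to these real inputs.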

