Discourse Embellishment Using a Deep Encoder-Decoder Network

10/18/2018
by Leonid Berov, et al.

We suggest a new NLG task in the context of the discourse generation pipeline of computational storytelling systems. This task, textual embellishment, takes a text as input and generates a semantically equivalent output with increased lexical and syntactic complexity. Ideally, this would allow the authors of computational storytellers to implement only lightweight NLG systems and rely on a domain-independent embellishment module to translate their output into more literary text. We present promising first results on this task using LSTM encoder-decoder networks trained on the WikiLarge dataset. Furthermore, we introduce "Compiled Computer Tales", a corpus of computationally generated stories that can be used to test the capabilities of embellishment algorithms.
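The abstract names an LSTM encoder-decoder trained on sentence pairs as the core model. As a rough illustration of that architecture (not the authors' implementation — the class name, layer sizes, and teacher-forcing setup below are assumptions), a minimal PyTorch sketch looks like this:

```python
# Illustrative sketch of an LSTM encoder-decoder for textual embellishment.
# All names and hyperparameters here are hypothetical; the paper's actual
# model configuration and training details are not given in the abstract.
import torch
import torch.nn as nn

class Seq2SeqEmbellisher(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the plain input sentence into a final (h, c) state ...
        _, state = self.encoder(self.embed(src_ids))
        # ... then decode the embellished target conditioned on that state
        # (teacher forcing: the gold target is fed as decoder input).
        dec_out, _ = self.decoder(self.embed(tgt_ids), state)
        return self.out(dec_out)  # logits over the target vocabulary

model = Seq2SeqEmbellisher(vocab_size=1000)
src = torch.randint(0, 1000, (2, 7))   # batch of 2 "plain" sentences
tgt = torch.randint(0, 1000, (2, 12))  # longer "embellished" targets
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 12, 1000])
```

Training such a model on a simplification corpus like WikiLarge, but with the direction reversed (simple in, complex out), is one plausible reading of how the embellishment task reuses existing parallel data.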


