
Text Editing by Command

10/24/2020
by Felix Faltings, et al.

A prevailing paradigm in neural text generation is one-shot generation, where text is produced in a single step. The one-shot setting is inadequate, however, when the constraints the user wishes to impose on the generated text are dynamic, especially when authoring longer documents. We address this limitation with an interactive text generation setting in which the user interacts with the system by issuing commands to edit existing text. To this end, we propose a novel text editing task, and introduce WikiDocEdits, a dataset of single-sentence edits crawled from Wikipedia. We show that our Interactive Editor, a transformer-based model trained on this dataset, outperforms baselines and obtains positive results in both automatic and human evaluations. We present empirical and qualitative analyses of this model's performance.
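The core loop described above, where a user repeatedly issues commands and the system revises the current draft, can be sketched in a few lines. The paper's Interactive Editor is a transformer conditioned on the command and the existing text; since that model is not reproduced here, the sketch below uses a hypothetical rule-based `toy_editor` as a stand-in so the interaction pattern itself is concrete. The function names and command vocabulary (`append`, `delete`) are illustrative assumptions, not the paper's interface.

```python
# Minimal sketch of an interactive text-editing session: the user issues
# commands one at a time, and each command maps (command, draft) -> new draft.
# In the paper this mapping is a learned transformer; here `toy_editor` is a
# hypothetical rule-based stand-in used only to illustrate the loop.
from typing import Callable

EditModel = Callable[[str, str], str]  # (command, current draft) -> edited draft

def toy_editor(command: str, draft: str) -> str:
    """Toy stand-in for the learned editor; supports two example commands."""
    verb, _, arg = command.partition(" ")
    if verb == "append":
        return (draft + " " + arg).strip()
    if verb == "delete":
        return " ".join(draft.replace(arg, "").split())
    return draft  # unknown command: leave the draft unchanged

def edit_session(draft: str, commands: list[str],
                 model: EditModel = toy_editor) -> str:
    """Apply user commands sequentially, as in the interactive setting."""
    for cmd in commands:
        draft = model(cmd, draft)
    return draft

print(edit_session("The cat sat", ["append on the mat"]))
# -> The cat sat on the mat
```

The key design point the setting relies on is that the model is a pure function of the command and the current draft, so constraints can change between steps; swapping `toy_editor` for a seq2seq model that encodes both inputs recovers the paper's setup.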

Related research

11/03/2020 · Data-to-Text Generation with Iterative Text Editing
08/09/2022 · High Recall Data-to-text Generation with Progressive Edit
08/11/2022 · Draft, Command, and Edit: Controllable Text Editing in E-Commerce
09/30/2018 · Text Morphing
02/20/2017 · Post-edit Analysis of Collective Biography Generation
04/16/2021 · An Empirical Study of Extrapolation in Text Generation with Scalar Control
09/27/2022 · EditEval: An Instruction-Based Benchmark for Text Improvements