
Text Editing by Command

by Felix Faltings, et al.

A prevailing paradigm in neural text generation is one-shot generation, where text is produced in a single step. The one-shot setting is inadequate, however, when the constraints the user wishes to impose on the generated text are dynamic, especially when authoring longer documents. We address this limitation with an interactive text generation setting in which the user interacts with the system by issuing commands to edit existing text. To this end, we propose a novel text editing task, and introduce WikiDocEdits, a dataset of single-sentence edits crawled from Wikipedia. We show that our Interactive Editor, a transformer-based model trained on this dataset, outperforms baselines and obtains positive results in both automatic and human evaluations. We present empirical and qualitative analyses of this model's performance.
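The abstract describes a setting where the model consumes an edit command together with existing text and produces the edited sentence. As a rough illustration, the following sketch shows one plausible way such a single-sentence edit example could be represented and serialized for a sequence-to-sequence transformer. The field names, the `[SEP]` delimiter, and the input layout are illustrative assumptions, not the paper's exact specification.

```python
# Hedged sketch of a WikiDocEdits-style training example and how it
# might be serialized for a seq2seq editor. Field names and the
# serialization format are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class EditExample:
    context: str   # surrounding document text
    source: str    # the sentence to be edited
    command: str   # natural-language edit command issued by the user
    target: str    # the edited sentence (supervision signal)


def build_model_input(ex: EditExample, sep: str = " [SEP] ") -> str:
    """Concatenate command, context, and source sentence into one
    input string, as a transformer encoder could consume it."""
    return sep.join([ex.command, ex.context, ex.source])


example = EditExample(
    context="Paris is the capital of France.",
    source="It has a population of 2 million.",
    command="Clarify that the figure refers to the city proper.",
    target="The city proper has a population of about 2 million.",
)
print(build_model_input(example))
```

The target sentence would serve as the decoder's training objective; at inference time, only the command, context, and source are provided.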



