XL-Editor: Post-editing Sentences with XLNet

10/19/2019
by   Yong-Siang Shih, et al.

While neural sequence generation models have achieved initial success in many NLP applications, the canonical one-pass decoding procedure with left-to-right (i.e., autoregressive) generation order does not reflect how humans actually revise a sentence to obtain a refined result. In this work, we propose XL-Editor, a novel training framework that enables state-of-the-art generalized autoregressive pretraining methods, XLNet specifically, to revise a given sentence using a variable-length insertion probability. Concretely, XL-Editor can (1) estimate the probability of inserting a variable-length sequence at a specific position in a given sentence; (2) execute post-editing operations such as insertion, deletion, and replacement based on the estimated variable-length insertion probability; and (3) complement existing sequence-to-sequence models by refining their generated sequences. Empirically, we first demonstrate the superior post-editing capabilities of XL-Editor over XLNet on text insertion and deletion tasks, which validates the effectiveness of our proposed framework. Furthermore, we extend XL-Editor to the unpaired text style transfer task, where transferring the target style onto a given sentence can be naturally viewed as post-editing the sentence into the target style. XL-Editor achieves significant improvement in style transfer accuracy while maintaining the semantics of the original sentence, showing the broad applicability of our method.
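The abstract describes post-editing operations driven by a variable-length insertion probability: deletion removes a span, and replacement is deletion followed by inserting the best-scoring span. The sketch below illustrates that decomposition. The function names (`insertion_logprob`, `best_insertion`, `replace_span`) and the toy unigram scorer are illustrative assumptions, not the paper's API; the real system would score spans with a pretrained XLNet-style model.

```python
from typing import List

def insertion_logprob(sentence: List[str], pos: int, span: List[str]) -> float:
    """Hypothetical stand-in for the variable-length insertion probability
    p(span | sentence, position). A real implementation would query a
    pretrained XLNet-style model; a toy unigram table keeps this runnable."""
    toy_unigram = {"is": -1.0, "very": -2.0, "good": -1.5}
    return sum(toy_unigram.get(tok, -5.0) for tok in span)

def best_insertion(sentence: List[str], pos: int,
                   candidates: List[List[str]]) -> List[str]:
    """Pick the candidate span with the highest estimated insertion probability."""
    return max(candidates, key=lambda span: insertion_logprob(sentence, pos, span))

def replace_span(sentence: List[str], start: int, end: int,
                 candidates: List[List[str]]) -> List[str]:
    """Replacement as deletion + insertion: drop sentence[start:end],
    then insert the best-scoring candidate span at `start`."""
    remainder = sentence[:start] + sentence[end:]
    span = best_insertion(remainder, start, candidates)
    return remainder[:start] + span + remainder[start:]

sentence = "this movie bad".split()
edited = replace_span(sentence, 2, 3, [["is", "good"], ["is", "very", "good"]])
print(" ".join(edited))  # "this movie is good"
```

Under the toy scorer, the shorter span "is good" wins because each extra token only lowers the total log-probability; a trained model would instead trade off length against fluency in context.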


Related research

Replacing Language Model for Style Transfer (11/14/2022)
We introduce replacing language model (RLM), a sequence-to-sequence lang...

Prompt-Based Editing for Text Style Transfer (01/27/2023)
Prompting approaches have been recently explored in text style transfer,...

QuickEdit: Editing Text & Translations via Simple Delete Actions (11/13/2017)
We propose a framework for computer-assisted text editing. It applies to...

Actions Speak Louder than Listening: Evaluating Music Style Transfer based on Editing Experience (10/25/2021)
The subjective evaluation of music generation techniques has been mostly...

Deleter: Leveraging BERT to Perform Unsupervised Successive Text Compression (09/07/2019)
Text compression has diverse applications such as Summarization, Reading...

Transcribing Natural Languages for The Deaf via Neural Editing Programs (12/17/2021)
This work studies the task of glossification, of which the aim is to em ...

Learning to Model Editing Processes (05/24/2022)
Most existing sequence generation models produce outputs in one pass, us...
