Text Infilling

01/01/2019
by Wanrong Zhu, et al.

Recent years have seen remarkable progress in text generation across different contexts, from the common setting of generating text from scratch to the emerging paradigm of retrieval-and-rewriting. Text infilling, which fills in missing portions of a sentence or paragraph, also has numerous real-life applications, yet remains under-explored. Previous work has focused on restricted settings, either assuming a single word per missing portion or limiting the text to a single missing portion at its end. This paper studies the general task of text infilling, where the input text can have an arbitrary number of portions to be filled, each of which may require an arbitrary, unknown number of tokens. We study various approaches to the task, including a self-attention model with segment-aware position encoding and bidirectional context modeling. We create extensive supervised data by masking out text with varying strategies. Experiments show that the self-attention model greatly outperforms the others, establishing a strong baseline for future research.
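The data-creation step described in the abstract (masking out text with varying strategies) can be illustrated with a short sketch. The function below is a hypothetical illustration, not the authors' released code: it masks a random number of variable-length segments in a token sequence, producing a template with placeholder markers plus the target spans to infill. The `BLANK` marker name and all parameters are assumptions for the sketch.

```python
import random

BLANK = "<blank>"  # hypothetical placeholder token; the paper's vocabulary may differ

def mask_segments(tokens, max_segments=3, max_span_len=4, seed=None):
    """Turn a token list into an infilling training pair.

    Returns (template, answers): the template replaces each masked span
    with a single BLANK marker; answers holds the removed spans in
    left-to-right order, each of a priori unknown length.
    """
    rng = random.Random(seed)
    n = len(tokens)
    k = rng.randint(1, max_segments)

    # Sample up to k non-overlapping spans of varying length.
    spans, taken = [], set()
    for _ in range(20 * k):          # bounded retries for short inputs
        if len(spans) == k:
            break
        length = rng.randint(1, min(max_span_len, n))
        start = rng.randint(0, n - length)
        if any(i in taken for i in range(start, start + length)):
            continue                  # overlaps an earlier span; resample
        spans.append((start, start + length))
        taken.update(range(start, start + length))
    spans.sort()

    template, answers, prev = [], [], 0
    for start, end in spans:
        template.extend(tokens[prev:start])   # keep the observed context
        template.append(BLANK)                # one marker per missing portion
        answers.append(tokens[start:end])
        prev = end
    template.extend(tokens[prev:])
    return template, answers

tokens = "the quick brown fox jumps over the lazy dog".split()
template, answers = mask_segments(tokens, seed=0)
print(" ".join(template))  # e.g. "the quick <blank> over the <blank>"
print(answers)             # the spans the model must infill
```

At training time the model would see the template and learn to generate each answer span; the segment-aware position encoding mentioned in the abstract would then index each generated token by which blank it belongs to and its offset within that segment, so the decoder can handle portions of unknown length.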


Related Research

09/28/2018
SALSA-TEXT: self attentive latent space based adversarial text generation
Inspired by the success of self attention mechanism and Transformer arch...

02/02/2022
A Survey on Retrieval-Augmented Text Generation
Recently, retrieval-augmented text generation attracted increasing atten...

07/09/2018
Position-aware Self-attention with Relative Positional Encodings for Slot Filling
This paper describes how to apply self-attention with relative positiona...

05/08/2022
Robust (Controlled) Table-to-Text Generation with Structure-Aware Equivariance Learning
Controlled table-to-text generation seeks to generate natural language d...

09/22/2022
XF2T: Cross-lingual Fact-to-Text Generation for Low-Resource Languages
Multiple business scenarios require an automated generation of descripti...

12/05/2017
Deep Semantic Role Labeling with Self-Attention
Semantic Role Labeling (SRL) is believed to be a crucial step towards na...

01/04/2023
Text sampling strategies for predicting missing bibliographic links
The paper proposes various strategies for sampling text data when perfor...
