Deleter: Leveraging BERT to Perform Unsupervised Successive Text Compression

09/07/2019
by Tong Niu, et al.

Text compression has diverse applications such as summarization, reading comprehension, and text editing. However, almost all existing approaches require hand-crafted features, syntactic labels, or parallel data; even the one approach that works in an unsupervised setting depends on a task-specific autoencoder. Moreover, these models generate only one compressed sentence per source input, so adapting to a different style requirement (e.g., output length) usually means retraining the model from scratch. In this work, we propose a fully unsupervised model, Deleter, that discovers an "optimal deletion path" for an arbitrary sentence, where each intermediate sequence along the path is a coherent subsequence of the previous one. The approach relies exclusively on a pretrained bidirectional language model (BERT): it scores each candidate deletion by the average perplexity of the resulting sentence and performs a progressive greedy lookahead search to select the best deletion at each step. We apply Deleter to extractive sentence compression and find that it is competitive with state-of-the-art supervised models trained on 1.02 million in-domain examples at a similar compression ratio. Qualitative analysis as well as automatic and human evaluations confirm that our model produces high-quality compressions.
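
To make the procedure concrete, here is a minimal Python sketch of the scoring-and-deletion loop, assuming the Hugging Face transformers library. It simplifies the paper's progressive greedy lookahead search to plain greedy single-word deletion, and the names pseudo_perplexity and deletion_path are illustrative, not the authors' code.

    import math
    import torch
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
    model.eval()

    def pseudo_perplexity(words):
        """Average BERT pseudo-perplexity: mask each wordpiece in turn
        and score it with the masked language model."""
        ids = tokenizer(" ".join(words), return_tensors="pt")["input_ids"][0]
        total_nll, n = 0.0, 0
        for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
            masked = ids.clone()
            masked[i] = tokenizer.mask_token_id
            with torch.no_grad():
                logits = model(masked.unsqueeze(0)).logits[0, i]
            total_nll -= torch.log_softmax(logits, dim=-1)[ids[i]].item()
            n += 1
        return math.exp(total_nll / max(n, 1))

    def deletion_path(sentence, min_len=3):
        """Greedily delete the word whose removal yields the lowest
        average pseudo-perplexity, recording every intermediate sentence."""
        words = sentence.split()
        path = [" ".join(words)]
        while len(words) > min_len:
            candidates = [words[:i] + words[i + 1:] for i in range(len(words))]
            scores = [pseudo_perplexity(c) for c in candidates]
            words = candidates[min(range(len(scores)), key=scores.__getitem__)]
            path.append(" ".join(words))
        return path

    for step in deletion_path("the quick brown fox jumps over the lazy dog"):
        print(step)

Because every intermediate sentence along the path is recorded, a single run yields compressions at every length, which is what lets one model serve different length requirements without retraining.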

Related research:

04/07/2019 · SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression
Neural sequence-to-sequence models are currently the dominant approach i...

04/15/2019 · Improving Human Text Comprehension through Semi-Markov CRF-based Neural Section Title Generation
Titles of short sections within long documents support readers by guidin...

03/29/2016 · Learning-Based Single-Document Summarization with Compression and Anaphoricity Constraints
We present a discriminative model for single-document summarization that...

10/19/2019 · XL-Editor: Post-editing Sentences with XLNet
While neural sequence generation models achieve initial success for many...

06/05/2020 · Sentence Compression as Deletion with Contextual Embeddings
Sentence compression is the task of creating a shorter version of an inp...

02/02/2020 · A Difference-of-Convex Programming Approach With Parallel Branch-and-Bound For Sentence Compression Via A Hybrid Extractive Model
Sentence compression is an important problem in natural language process...

10/28/2015 · Fast k-best Sentence Compression
A popular approach to sentence compression is to formulate the task as a...
