Simple Unsupervised Summarization by Contextual Matching

07/31/2019
by Jiawei Zhou, et al.

We propose an unsupervised method for sentence summarization using only language modeling. The approach employs two language models, one generic (i.e., pretrained) and one specific to the target domain. We show that combining them under a product-of-experts criterion is enough to maintain continuous contextual matching with the source while preserving output fluency. Experiments on both abstractive and extractive sentence summarization data sets show promising results for our method without exposure to any paired data.
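As a rough illustration of the product-of-experts idea described above, the sketch below combines two next-token distributions, one from a contextual-matching expert and one from a domain fluency language model, by weighting their log-probabilities and renormalizing. The names `log_p_match`, `log_p_fluency`, and the weight `lam` are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

def product_of_experts(log_p_match, log_p_fluency, lam=0.5):
    """Combine two per-token log-probability vectors as a weighted
    product of experts: p(w) proportional to
    p_match(w)**lam * p_fluency(w)**(1 - lam).
    Both inputs are log-probabilities over the same vocabulary.
    (Illustrative sketch, not the authors' code.)
    """
    combined = lam * log_p_match + (1.0 - lam) * log_p_fluency
    combined -= np.max(combined)      # subtract max for numerical stability
    probs = np.exp(combined)
    return probs / probs.sum()        # renormalize over the vocabulary

# Toy example over a 5-word vocabulary:
log_p_match = np.log(np.array([0.4, 0.3, 0.1, 0.1, 0.1]))    # contextual-matching expert
log_p_fluency = np.log(np.array([0.1, 0.5, 0.2, 0.1, 0.1]))  # domain fluency LM
print(product_of_experts(log_p_match, log_p_fluency, lam=0.5))
```

Intuitively, a token scores well only if both experts assign it reasonable probability, which is how the combination can keep the summary matched to the source context while staying fluent.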
