AREDSUM: Adaptive Redundancy-Aware Iterative Sentence Ranking for Extractive Document Summarization

04/13/2020
by Keping Bi, et al.

Redundancy-aware extractive summarization systems score the redundancy of the sentences to be included in a summary either jointly with their salience information or separately as an additional sentence scoring step. Previous work shows the efficacy of jointly scoring and selecting sentences with neural sequence generation models. It is, however, not well understood whether the gain is due to better encoding techniques or better redundancy reduction approaches. Similarly, the respective contributions of the salience and diversity components to the created summary are not well studied. Building on state-of-the-art encoding methods for summarization, we present two adaptive learning models: AREDSUM-SEQ, which jointly considers salience and novelty during sentence selection; and a two-step AREDSUM-CTX, which scores salience first and then learns to balance salience and redundancy, enabling measurement of each aspect's impact. Empirical results on the CNN/DailyMail and NYT50 datasets show that by modeling diversity explicitly in a separate step, AREDSUM-CTX achieves significantly better performance than AREDSUM-SEQ as well as state-of-the-art extractive summarization baselines.

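To make the two-step selection idea concrete, below is a minimal Python sketch of redundancy-aware iterative sentence selection in the spirit of AREDSUM-CTX as described above: salience is scored first, and a second step balances salience against redundancy with the partially built summary. The given salience scores, the token-overlap redundancy proxy, and the fixed trade-off weight lam are illustrative assumptions for this sketch, not the paper's learned model, which scores redundancy with a learned context-aware component.

# Hedged sketch of two-step redundancy-aware sentence selection.
# Salience scores are assumed given; redundancy and the weight `lam`
# are illustrative stand-ins for the learned balancing step.

from typing import List, Set


def _tokens(sentence: str) -> Set[str]:
    """Crude whitespace tokenization; the paper uses learned encoders instead."""
    return set(sentence.lower().split())


def redundancy(candidate: str, selected: List[str]) -> float:
    """Max token-overlap ratio with any already-selected sentence (illustrative proxy)."""
    cand = _tokens(candidate)
    if not cand or not selected:
        return 0.0
    return max(len(cand & _tokens(s)) / len(cand) for s in selected)


def select_summary(sentences: List[str],
                   salience: List[float],
                   k: int = 3,
                   lam: float = 0.7) -> List[str]:
    """Iteratively pick k sentences, trading off salience against redundancy."""
    selected: List[str] = []
    remaining = list(range(len(sentences)))
    while remaining and len(selected) < k:
        # High salience and low redundancy with the current summary is best.
        best = max(remaining,
                   key=lambda i: lam * salience[i]
                   - (1.0 - lam) * redundancy(sentences[i], selected))
        selected.append(sentences[best])
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    doc = [
        "The storm caused major flooding in the city.",
        "Major flooding hit the city after the storm.",
        "Officials opened shelters for displaced residents.",
    ]
    scores = [0.9, 0.85, 0.6]  # stand-in salience scores
    print(select_summary(doc, scores, k=2))

In this toy example the second sentence is skipped despite its high salience because it is nearly redundant with the first, which is the behavior the separate redundancy-aware step is meant to produce.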

Related research

07/06/2018 · Neural Document Summarization by Jointly Learning to Score and Select Sentences
Sentence scoring and sentence selection are two main steps in extractive...

01/20/2016 · Improved Spoken Document Summarization with Coverage Modeling Techniques
Extractive summarization aims at selecting a set of indicative sentences...

04/24/2020 · Exploring Explainable Selection to Control Abstractive Generation
It is a big challenge to model long-range input for document summarizati...

10/13/2020 · KLearn: Background Knowledge Inference from Summarization Data
The goal of text summarization is to compress documents to the relevant ...

11/14/2016 · Classify or Select: Neural Architectures for Extractive Document Summarization
We present two novel and contrasting Recurrent Neural Network (RNN) base...

07/16/2019 · STRASS: A Light and Effective Method for Extractive Summarization Based on Sentence Embeddings
This paper introduces STRASS: Summarization by TRAnsformation Selection ...

11/30/2020 · Systematically Exploring Redundancy Reduction in Summarizing Long Documents
Our analysis of large summarization datasets indicates that redundancy i...
