Summary Refinement through Denoising

07/25/2019
by Nikola I. Nikolov, et al.

We propose a simple method for post-processing the outputs of a text summarization system in order to refine its overall quality. Our approach is to train text-to-text rewriting models to correct information redundancy errors that may arise during summarization. We train on synthetically generated noisy summaries, testing three different types of noise that introduce out-of-context information within each summary. When applied on top of extractive and abstractive summarization baselines, our summary denoising models yield metric improvements while reducing redundancy.
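One way to generate the synthetic training data described above is to corrupt clean reference summaries by inserting out-of-context sentences drawn from unrelated documents, yielding (noisy, clean) pairs for a text-to-text denoising model. The sketch below illustrates one such noise type; the function name, parameters, and insertion strategy are illustrative assumptions, not the paper's exact procedure.

```python
import random

def add_insertion_noise(summary_sents, other_doc_sents, n_noise=1, seed=0):
    """Corrupt a clean summary by inserting out-of-context sentences.

    summary_sents:   list of sentences from a clean reference summary
    other_doc_sents: sentences from an unrelated document (noise source)
    n_noise:         how many out-of-context sentences to insert

    Hypothetical illustration of one noise type (random insertion);
    the paper tests three noise types, not detailed in the abstract.
    """
    rng = random.Random(seed)
    noisy = list(summary_sents)
    for _ in range(n_noise):
        noise_sent = rng.choice(other_doc_sents)
        # Insert at a random position, including the ends.
        pos = rng.randrange(len(noisy) + 1)
        noisy.insert(pos, noise_sent)
    return noisy
```

A denoising model would then be trained to map each corrupted summary back to its clean original, so that at test time it can strip redundant or out-of-context content from real system outputs.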


