Focus Attention: Promoting Faithfulness and Diversity in Summarization

05/25/2021
by   Rahul Aralikatte, et al.

Professional summaries are written with document-level information, such as the theme of the document, in mind. This is in contrast with most seq2seq decoders, which simultaneously learn to focus on salient content while deciding what to generate at each decoding step. With the motivation to narrow this gap, we introduce the Focus Attention Mechanism, a simple yet effective method to encourage decoders to proactively generate tokens that are similar or topical to the input document. Further, we propose a Focus Sampling method to enable generation of diverse summaries, an area currently understudied in summarization. When evaluated on the BBC extreme summarization task, two state-of-the-art models augmented with Focus Attention generate summaries that are closer to the target and more faithful to their input documents, outperforming their vanilla counterparts on multiple faithfulness measures. We also empirically demonstrate that Focus Sampling is more effective in generating diverse and faithful summaries than top-k or nucleus sampling-based decoding methods.
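The abstract compares Focus Sampling against top-k and nucleus (top-p) sampling. For background, here is a minimal sketch of those two baseline decoding strategies applied to a single next-token distribution; this is illustrative only and not the paper's implementation (the toy vocabulary probabilities are invented for the example):

```python
import numpy as np

def top_k_sample(probs, k, rng):
    # Keep only the k highest-probability tokens, renormalize, and sample.
    idx = np.argsort(probs)[::-1][:k]
    p = probs[idx] / probs[idx].sum()
    return int(rng.choice(idx, p=p))

def nucleus_sample(probs, top_p, rng):
    # Keep the smallest set of top tokens whose cumulative probability
    # reaches top_p, renormalize, and sample from that "nucleus".
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, top_p) + 1
    idx = order[:cutoff]
    p = probs[idx] / probs[idx].sum()
    return int(rng.choice(idx, p=p))

# Toy next-token distribution over a 5-token vocabulary (hypothetical).
vocab_probs = np.array([0.5, 0.3, 0.1, 0.05, 0.05])
rng = np.random.default_rng(0)
tok_a = top_k_sample(vocab_probs, k=2, rng=rng)        # index 0 or 1
tok_b = nucleus_sample(vocab_probs, top_p=0.8, rng=rng)  # index 0 or 1
```

Both methods truncate the tail of the distribution before sampling; top-k uses a fixed count, while nucleus sampling adapts the candidate set size to the distribution's shape. Focus Sampling, per the abstract, instead constrains generation toward tokens topical to the source document.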


Related research

- Question-Based Salient Span Selection for More Controllable Text Summarization (11/15/2021)
- Falsesum: Generating Document-level NLI Examples for Recognizing Factual Inconsistency in Summarization (05/12/2022)
- In Conclusion Not Repetition: Comprehensive Abstractive Summarization With Diversified Attention Based On Determinantal Point Processes (09/24/2019)
- Attention Optimization for Abstractive Document Summarization (10/25/2019)
- Learning with Rejection for Abstractive Text Summarization (02/16/2023)
- Abstractive and Extractive Text Summarization using Document Context Vector and Recurrent Neural Networks (07/20/2018)
- Diversity driven Attention Model for Query-based Abstractive Summarization (04/26/2017)
