Attention Optimization for Abstractive Document Summarization

10/25/2019
by Min Gui, et al.
Attention plays a key role in improving sequence-to-sequence document summarization models. To obtain an attention model that reproduces the most salient information of the source and avoids repetition, we augment the vanilla attention mechanism from both local and global perspectives. We propose an attention refinement unit, paired with a local variance loss that supervises the attention distribution at each decoding step, and a global variance loss that optimizes the attention distributions across all decoding steps from a global perspective. Results on the CNN/Daily Mail dataset verify the effectiveness of our methods.
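The abstract names two auxiliary losses but gives no formulas, so the following is a minimal PyTorch sketch of how such losses could be realized, not the paper's definitions. The function names local_variance_loss and global_variance_loss, the tensor shapes, and both formulations are assumptions: the local term rewards sharp per-step attention distributions, and the global term is a coverage-style penalty on re-attending the same source token across steps.

```python
import torch
import torch.nn.functional as F

def local_variance_loss(attn: torch.Tensor) -> torch.Tensor:
    """Per-step sharpness penalty (hypothetical form). attn has shape
    (batch, tgt_steps, src_len); each step's row is a softmax distribution
    over source positions. A peaked distribution has high variance, so we
    minimize the negative variance."""
    return -attn.var(dim=-1, unbiased=False).mean()

def global_variance_loss(attn: torch.Tensor) -> torch.Tensor:
    """Cross-step repetition penalty (hypothetical form). For each source
    token, the total attention it receives over all decoding steps should
    not greatly exceed its single largest per-step share; a large gap means
    the decoder keeps returning to the same token."""
    total = attn.sum(dim=1)          # (batch, src_len)
    peak = attn.max(dim=1).values    # (batch, src_len)
    return (total - peak).mean()

# Toy usage: random attention over a 7-token source for 5 decoding steps.
attn = F.softmax(torch.randn(2, 5, 7), dim=-1)
aux_loss = local_variance_loss(attn) + global_variance_loss(attn)
print(aux_loss.item())
```

In training, such auxiliary terms would typically be added, with small weights, to the standard negative log-likelihood objective of the summarizer.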


Related research

04/14/2017
Neural Extractive Summarization with Side Information
Most extractive summarization methods focus on the main body of the docu...

12/03/2020
Bengali Abstractive News Summarization (BANS): A Neural Attention Approach
Abstractive summarization is the process of generating novel sentences b...

05/25/2021
Focus Attention: Promoting Faithfulness and Diversity in Summarization
Professional summaries are written with document-level information, such...

05/16/2018
A Unified Model for Extractive and Abstractive Summarization using Inconsistency Loss
We propose a unified model combining the strength of extractive and abst...

05/09/2022
ACM – Attribute Conditioning for Abstractive Multi Document Summarization
Abstractive multi document summarization has evolved as a task through t...

04/26/2017
Diversity driven Attention Model for Query-based Abstractive Summarization
Abstractive summarization aims to generate a shorter version of the docu...

08/19/2019
Topic Augmented Generator for Abstractive Summarization
Steady progress has been made in abstractive summarization with attentio...
