In Conclusion Not Repetition: Comprehensive Abstractive Summarization With Diversified Attention Based On Determinantal Point Processes

09/24/2019
by   Lei Li, et al.

Various Seq2Seq learning models designed for machine translation have recently been applied to the abstractive summarization task. Although these models achieve high ROUGE scores, they struggle to generate comprehensive summaries with a high level of abstraction because of their degenerate attention distributions. We introduce the Diverse Convolutional Seq2Seq Model (DivCNN Seq2Seq), which uses Determinantal Point Process methods (Micro DPPs and Macro DPPs) to produce attention distributions that account for both quality and diversity. Without breaking the end-to-end architecture, DivCNN Seq2Seq achieves a higher level of comprehensiveness than vanilla models and strong baselines. All reproducible code and datasets are available online.
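The key idea behind DPP-based selection is that items are scored jointly: the determinant of a kernel submatrix rewards high-quality items but penalizes redundant ones. The sketch below is an illustrative greedy MAP approximation for a quality-diversity DPP kernel, not the paper's actual Micro/Macro DPP algorithms; the function name and inputs (per-item quality scores, feature vectors) are hypothetical choices for the example.

```python
import numpy as np

def greedy_dpp_map(quality, features, k):
    """Greedy MAP inference for a DPP with kernel
    L = diag(q) @ S @ diag(q), where S is the cosine
    similarity of item features. Returns indices of up
    to k items trading off quality against diversity.
    (Illustrative sketch, not the paper's method.)"""
    n = len(quality)
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    L = np.outer(quality, quality) * (normed @ normed.T)
    selected = []
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            # log-determinant of the candidate submatrix:
            # large when items are high-quality AND dissimilar.
            gain = np.linalg.slogdet(L[np.ix_(idx, idx)])[1]
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break
        selected.append(best)
    return selected
```

With two near-duplicate high-quality items and one distinct lower-quality item, the greedy pass picks the best item first and then prefers the distinct item over the near-duplicate, since duplicates drive the determinant toward zero. Applied to attention, the same objective discourages probability mass from collapsing onto redundant source positions.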


