Diversity driven Attention Model for Query-based Abstractive Summarization

04/26/2017
by   Preksha Nema, et al.

Abstractive summarization aims to generate a shorter version of a document that covers all the salient points in a compact and coherent fashion. Query-based summarization, in contrast, highlights the points that are relevant in the context of a given query. The encode-attend-decode paradigm has achieved notable success in machine translation, extractive summarization, dialog systems, and related tasks, but it suffers from the drawback of generating repeated phrases. In this work we propose a model for the query-based summarization task built on the encode-attend-decode paradigm, with two key additions: (i) a query attention model (in addition to the document attention model) that learns to focus on different portions of the query at different time steps, instead of using a static representation of the query, and (ii) a new diversity-based attention model that aims to alleviate the problem of repeated phrases in the summary. To enable evaluation of this model, we introduce a new query-based summarization dataset built from Debatepedia. Our experiments show that with these two additions the proposed model clearly outperforms vanilla encode-attend-decode models, with an absolute gain of 28% in ROUGE-L scores.
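The diversity idea in the abstract can be illustrated with a minimal NumPy sketch: at each decoding step, the context vector is made orthogonal to the previous step's context vector, so the decoder is discouraged from attending to the same content twice. This is a hedged illustration of one variant of the approach, not the authors' implementation; the dot-product scoring, function names, and shapes are assumptions for the example.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def diversity_attention(enc_states, dec_state, prev_ctx):
    """One step of diversity-based attention (illustrative sketch).

    enc_states: (T, d) encoder hidden states
    dec_state:  (d,)   current decoder state
    prev_ctx:   (d,)   context vector from the previous time step
    Returns the diversified context vector and the attention weights.
    """
    scores = enc_states @ dec_state        # dot-product attention scores (assumption)
    alpha = softmax(scores)                # attention distribution over the document
    ctx = alpha @ enc_states               # standard context vector
    # Diversity: remove the component of ctx that lies along prev_ctx,
    # so successive context vectors are orthogonal to each other.
    denom = prev_ctx @ prev_ctx
    if denom > 1e-8:
        ctx = ctx - ((ctx @ prev_ctx) / denom) * prev_ctx
    return ctx, alpha
```

In use, the decoder would feed each step's diversified context vector forward as `prev_ctx` for the next step, so repeated attention to the same encoder content yields a near-zero context and contributes little to the output.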


