Diversity driven Attention Model for Query-based Abstractive Summarization

by Preksha Nema, et al.

Abstractive summarization aims to generate a shorter version of a document that covers all of its salient points in a compact and coherent fashion. Query-based summarization, by contrast, highlights only those points that are relevant in the context of a given query. The encode-attend-decode paradigm has achieved notable success in machine translation, extractive summarization, dialog systems, etc., but it suffers from the drawback of generating repeated phrases. In this work we propose a model for the query-based summarization task based on the encode-attend-decode paradigm with two key additions: (i) a query attention model (in addition to the document attention model) which learns to focus on different portions of the query at different time steps (instead of using a static representation for the query), and (ii) a new diversity-based attention model which aims to alleviate the problem of repeating phrases in the summary. To enable evaluation of this model we introduce a new query-based summarization dataset built on Debatepedia. Our experiments show that with these two additions the proposed model clearly outperforms vanilla encode-attend-decode models, with a gain of 28% (absolute) in ROUGE-L scores.
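The diversity-based attention idea described above can be sketched as follows: at each decoding step, the attention context vector is made orthogonal to the previous step's context, so successive steps are pushed toward attending to new content rather than repeating the same phrases. This is a minimal NumPy sketch of that intuition, not the authors' exact model; the dot-product scoring function and function names here are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def diversity_context(enc_states, dec_state, prev_ctx):
    """Compute an attention context vector, then subtract its
    projection onto the previous context so the result is
    orthogonal to what was attended to at the last step.

    enc_states: (T, H) encoder hidden states
    dec_state:  (H,)   current decoder state
    prev_ctx:   (H,) or None, previous diversity context
    """
    scores = enc_states @ dec_state      # (T,) dot-product scores (assumed scoring)
    alphas = softmax(scores)             # attention weights over source positions
    ctx = alphas @ enc_states            # (H,) standard context vector
    if prev_ctx is not None:
        denom = prev_ctx @ prev_ctx
        if denom > 0:
            # Gram-Schmidt-style step: remove the component along prev_ctx
            ctx = ctx - ((ctx @ prev_ctx) / denom) * prev_ctx
    return ctx
```

After the orthogonalization step, the new context carries no component along the previous context vector, which is one way to discourage the decoder from re-emitting the phrase it just generated.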






