Abstractive Text Summarization by Incorporating Reader Comments

12/13/2018
by Shen Gao, et al.

In neural abstractive summarization, conventional sequence-to-sequence models often summarize the wrong aspect of a document rather than its main aspect. To tackle this problem, we propose the task of reader-aware abstractive summary generation, which uses reader comments to help the model produce a summary focused on the main aspect. Unlike the traditional abstractive summarization task, reader-aware summarization confronts two main challenges: (1) comments are informal and noisy; (2) jointly modeling the news document and the reader comments is difficult. To address these challenges, we design an adversarial learning model named reader-aware summary generator (RASG), which consists of four components: (1) a sequence-to-sequence based summary generator; (2) a reader attention module that captures the reader-focused aspects; (3) a supervisor that models the semantic gap between the generated summary and the reader-focused aspects; (4) a goal tracker that produces the goal for each generation step. The supervisor and the goal tracker guide the training of our framework in an adversarial manner. Extensive experiments on our large-scale real-world text summarization dataset show that RASG achieves state-of-the-art performance in terms of both automatic metrics and human evaluation. The results also demonstrate the effectiveness of each module in our framework. We release our large-scale dataset for further research.
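The abstract describes a reader attention module that extracts reader-focused aspects from comments, and a supervisor that measures the semantic gap between the generated summary and those aspects. The sketch below illustrates these two ideas in simplified form with NumPy; the function names, the use of mean-pooled attention, and the cosine-distance gap are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def reader_attention(doc_states, comment_states):
    """Simplified stand-in for RASG's reader attention module (assumed form):
    attend from each comment token to document tokens, then average the
    attention weights to estimate which document aspects readers focus on."""
    # similarity between every comment state and every document state
    scores = comment_states @ doc_states.T              # (n_comment, n_doc)
    # softmax over document positions, per comment token
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # average over comment tokens -> one focus distribution over doc tokens
    return weights.mean(axis=0)                         # (n_doc,)

def semantic_gap(summary_vec, focus_vec):
    """Supervisor-style signal (assumed form): cosine distance between a
    summary representation and the reader-focused aspect representation."""
    denom = np.linalg.norm(summary_vec) * np.linalg.norm(focus_vec) + 1e-8
    return 1.0 - (summary_vec @ focus_vec) / denom
```

In a training loop, a signal like `semantic_gap` could be minimized adversarially alongside the generation loss, steering each decoding step toward the reader-focused aspects.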


Related research:

- 09/19/2019: How to Write Summaries with Patterns? Learning towards Abstractive Summarization through Prototype Editing. "Under special circumstances, summaries should conform to a particular st..."
- 08/03/2017: Reader-Aware Multi-Document Summarization: An Enhanced Model and The First Dataset. "We investigate the problem of reader-aware multi-document summarization ..."
- 09/05/2023: Improving Query-Focused Meeting Summarization with Query-Relevant Knowledge. "Query-Focused Meeting Summarization (QFMS) aims to generate a summary of..."
- 09/25/2020: Persian Keyphrase Generation Using Sequence-to-Sequence Models. "Keyphrases are a very short summary of an input text and provide the mai..."
- 06/08/2019: This Email Could Save Your Life: Introducing the Task of Email Subject Line Generation. "Given the overwhelming number of emails, an effective subject line becom..."
- 04/28/2015: Reader-Aware Multi-Document Summarization via Sparse Coding. "We propose a new MDS paradigm called reader-aware multi-document summari..."
- 09/13/2021: Augmented Abstractive Summarization With Document-Level Semantic Graph. "Previous abstractive methods apply sequence-to-sequence structures to ge..."
