Faithful to the Original: Fact Aware Neural Abstractive Summarization

11/13/2017
by   Ziqiang Cao, et al.

Unlike extractive summarization, abstractive summarization has to fuse different parts of the source text, which makes it prone to creating fake facts. Our preliminary study reveals that nearly 30% of the outputs from a state-of-the-art neural summarization system suffer from this problem. While previous abstractive summarization approaches usually focus on improving informativeness, we argue that faithfulness is also a vital prerequisite for a practical abstractive summarization system. To avoid generating fake facts in a summary, we leverage open information extraction and dependency parsing technologies to extract actual fact descriptions from the source text. We then propose a dual-attention sequence-to-sequence framework that forces generation to be conditioned on both the source text and the extracted fact descriptions. Experiments on the Gigaword benchmark dataset demonstrate that our model can greatly reduce fake summaries by 80%. Notably, the fact descriptions also bring significant improvement in informativeness since they often condense the meaning of the source text.
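The core idea of dual attention is that the decoder attends to two encodings at once: the source sentence and the extracted fact descriptions, and merges the two context vectors before predicting the next word. The following is a minimal NumPy sketch of that merging step, not the paper's actual implementation: the dot-product scoring and the fixed `gate` scalar are simplifying assumptions (in a real model the gate would be learned).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    # Dot-product attention: score each key against the decoder query,
    # normalize with softmax, and return the weighted sum of keys.
    scores = keys @ query          # shape (n,)
    weights = softmax(scores)      # shape (n,)
    return weights @ keys          # shape (d,)

def dual_attention_context(dec_state, src_enc, fact_enc, gate=0.5):
    # Attend separately over source encodings and fact encodings,
    # then blend the two context vectors with a gating scalar.
    c_src = attend(dec_state, src_enc)
    c_fact = attend(dec_state, fact_enc)
    return gate * c_src + (1.0 - gate) * c_fact

# Toy example: 5 source states, 3 fact-description states, hidden size 8.
rng = np.random.default_rng(0)
d = 8
src_enc = rng.standard_normal((5, d))   # encoder states of the source text
fact_enc = rng.standard_normal((3, d))  # encoder states of extracted facts
dec_state = rng.standard_normal(d)      # current decoder hidden state
ctx = dual_attention_context(dec_state, src_enc, fact_enc)
print(ctx.shape)  # (8,)
```

The blended context `ctx` would then be fed, together with the decoder state, into the output projection, so that every generated word is conditioned on both the source text and the fact descriptions.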


