Entity Commonsense Representation for Neural Abstractive Summarization

06/14/2018
by   Reinald Kim Amplayo, et al.

A major proportion of a text summary consists of important entities found in the original text. These entities build up the topic of the summary. Moreover, they carry commonsense information once they are linked to a knowledge base. Based on these observations, this paper investigates the use of linked entities to guide the decoder of a neural text summarizer toward concise and better summaries. To this end, we leverage an off-the-shelf entity linking system (ELS) to extract linked entities and propose Entity2Topic (E2T), a module easily attachable to a sequence-to-sequence model that transforms a list of entities into a vector representation of the topic of the summary. Currently available ELSs are still not sufficiently effective and may introduce unresolved ambiguities and irrelevant entities. We address these imperfections by (a) encoding entities with selective disambiguation, and (b) pooling entity vectors using firm attention. By applying E2T to a simple sequence-to-sequence model with an attention mechanism as the base model, we see significant performance improvements of at least 2 ROUGE points on the Gigaword (sentence to title) and CNN (long document to multi-sentence highlights) summarization datasets.
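The abstract's description of E2T, encoding the linked entities with a disambiguation gate and then pooling them with firm attention into a single topic vector for the decoder, can be illustrated with a short sketch. The module below is a minimal PyTorch reconstruction based only on that description; the layer shapes, the sigmoid gate used for selective disambiguation, the top-k formulation of firm attention, and all names (Entity2Topic, entity_dim, top_k, etc.) are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of an Entity2Topic-style module (assumed details, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class Entity2Topic(nn.Module):
    def __init__(self, entity_dim: int, hidden_dim: int, topic_dim: int, top_k: int = 5):
        super().__init__()
        # Encode each linked-entity embedding into a hidden representation.
        self.encoder = nn.Linear(entity_dim, hidden_dim)
        # "Selective disambiguation": a learned gate deciding how much of each
        # (possibly mis-linked or ambiguous) entity to keep.
        self.gate = nn.Linear(entity_dim, hidden_dim)
        # Score entities for "firm attention": only the top-k entities receive
        # non-zero attention weight; the rest are discarded as likely irrelevant.
        self.scorer = nn.Linear(hidden_dim, 1)
        self.top_k = top_k
        # Project the pooled entity vector into the topic vector fed to the decoder.
        self.to_topic = nn.Linear(hidden_dim, topic_dim)

    def forward(self, entity_embs: torch.Tensor) -> torch.Tensor:
        # entity_embs: (batch, num_entities, entity_dim)
        hidden = torch.tanh(self.encoder(entity_embs))            # (B, E, H)
        gate = torch.sigmoid(self.gate(entity_embs))              # (B, E, H)
        hidden = gate * hidden                                    # selectively disambiguated encodings

        scores = self.scorer(hidden).squeeze(-1)                  # (B, E)
        k = min(self.top_k, scores.size(1))
        topk_scores, topk_idx = scores.topk(k, dim=1)             # keep only the k best entities
        weights = F.softmax(topk_scores, dim=1)                   # (B, k)

        idx = topk_idx.unsqueeze(-1).expand(-1, -1, hidden.size(-1))
        selected = hidden.gather(1, idx)                          # (B, k, H)
        pooled = (weights.unsqueeze(-1) * selected).sum(dim=1)    # (B, H)
        return self.to_topic(pooled)                              # topic vector guiding the decoder
```

In a sequence-to-sequence summarizer, the returned topic vector would typically be concatenated with (or used to initialize) the decoder state at each step, so that generation is conditioned on the entities as well as the source text; how exactly it is injected is a design choice not specified in the abstract.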


Related research

12/17/2021 - Topic-Aware Encoding for Extractive Summarization
Document summarization provides an instrument for faster understanding t...

09/13/2021 - Augmented Abstractive Summarization With Document-Level Semantic Graph
Previous abstractive methods apply sequence-to-sequence structures to ge...

04/28/2022 - Faithful to the Document or to the World? Mitigating Hallucinations via Entity-linked Knowledge in Abstractive Summarization
Despite recent advances in abstractive summarization, current summarizat...

04/15/2021 - Planning with Entity Chains for Abstractive Summarization
Pre-trained transformer-based sequence-to-sequence models have become th...

10/29/2019 - Contrastive Attention Mechanism for Abstractive Sentence Summarization
We propose a contrastive attention mechanism to extend the sequence-to-s...

09/13/2022 - Entity Tagging: Extracting Entities in Text Without Mention Supervision
Detection and disambiguation of all entities in text is a crucial task f...

04/12/2012 - Leveraging Usage Data for Linked Data Movie Entity Summarization
Novel research in the field of Linked Data focuses on the problem of ent...
