Named Entity Inclusion in Abstractive Text Summarization

07/05/2023
by Sergey Berezin, et al.

We address named entity omission, a drawback of many current abstractive text summarizers. We propose a custom pretraining objective that sharpens the model's attention on the named entities in a text. First, a RoBERTa-based named entity recognition model is trained to identify the named entities in the text. This model is then used to mask those entities, and a BART model is trained to reconstruct them. Finally, BART is fine-tuned on the summarization task. Our experiments show that this pretraining approach improves both the precision and the recall of named entity inclusion.
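The entity-masking pretraining step can be sketched with the Hugging Face transformers library. The sketch below is illustrative only: the NER checkpoint and BART size are stand-in assumptions, not the authors' exact models, and mask_entities is a hypothetical helper written for this example.

from transformers import pipeline, BartTokenizer, BartForConditionalGeneration

# 1. A RoBERTa-based NER model detects entity spans.
#    (Checkpoint name is an assumption, not the paper's model.)
ner = pipeline(
    "ner",
    model="Jean-Baptiste/roberta-large-ner-english",
    aggregation_strategy="simple",
)

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

def mask_entities(text: str) -> str:
    """Replace every detected named entity with BART's <mask> token."""
    # Process spans right-to-left so earlier offsets stay valid.
    for span in sorted(ner(text), key=lambda s: s["start"], reverse=True):
        text = text[: span["start"]] + tokenizer.mask_token + text[span["end"] :]
    return text

text = "Apple unveiled the iPhone in San Francisco in 2007."
masked = mask_entities(text)  # "<mask> unveiled the <mask> in <mask> in <mask>."

# 2. BART is trained to reconstruct the original text from the masked input,
#    forcing it to attend to (and regenerate) the named entities.
batch = tokenizer(masked, return_tensors="pt")
labels = tokenizer(text, return_tensors="pt").input_ids
loss = model(**batch, labels=labels).loss  # standard seq2seq denoising loss
loss.backward()

In practice this reconstruction objective would be run over a large unlabeled corpus, after which the same BART model is fine-tuned on summarization with the ordinary sequence-to-sequence objective.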
