Entity-driven Fact-aware Abstractive Summarization of Biomedical Literature

03/30/2022
by Amanuel Alambo, et al.

The publication rate of biomedical literature has been rising steadily as part of the large number of scientific articles published every year. Consequently, there has been considerable effort to harness and summarize this massive body of biomedical research. While transformer-based encoder-decoder models in the vanilla source-document-to-summary setting have been extensively studied for abstractive summarization in different domains, their major limitations remain entity hallucination (a phenomenon where generated summaries contain entities that are not present in, or related to, the source article(s)) and factual inconsistency. This problem is exacerbated in the biomedical setting, where named entities and their semantics (which can be captured through a knowledge base) constitute the essence of an article. The use of named entities, together with facts about them mined from background knowledge bases, to guide abstractive summarization has not been studied in the biomedical summarization literature. In this paper, we propose an entity-driven fact-aware framework for training end-to-end transformer-based encoder-decoder models for abstractive summarization of biomedical articles. We call the proposed approach, whose building block is a transformer-based model, EFAS: Entity-driven Fact-aware Abstractive Summarization. We conduct experiments with five state-of-the-art transformer-based models (two of which are specifically designed for long-document summarization) and demonstrate that injecting knowledge into their training and inference phases enables the models to achieve significantly better performance than the standard source-document-to-summary setting in terms of entity-level factual accuracy, n-gram novelty, and semantic equivalence, while performing comparably on ROUGE metrics. The proposed approach is evaluated on two datasets, ICD-11-Summ-1000 and PubMed-50k.
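To make the knowledge-injection idea concrete, the sketch below prepends extracted named entities and knowledge-base facts to the source article before passing it to an off-the-shelf transformer encoder-decoder. This is a minimal illustration of the general setup under stated assumptions, not the EFAS pipeline itself: the facebook/bart-large-cnn checkpoint, the separator format, and the hard-coded entities and facts are placeholders for demonstration, not the entity linker, knowledge base, or models used in the paper.

```python
# Minimal sketch of entity/fact injection for abstractive summarization.
# NOT the authors' exact method: the checkpoint, input format, and the
# hard-coded entities/facts below are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "facebook/bart-large-cnn"  # any seq2seq summarizer works for this sketch
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def build_knowledge_augmented_input(article, entities, facts):
    """Prepend named entities and KB facts to the source article (assumed format)."""
    entity_block = " ; ".join(entities)
    fact_block = " ; ".join(facts)
    return f"entities: {entity_block} facts: {fact_block} article: {article}"

article = (
    "Type 2 diabetes mellitus is characterised by insulin resistance and "
    "relative insulin deficiency..."
)
# In practice these would come from a biomedical NER tool and an entity-linked
# knowledge base (e.g., UMLS-style triples); hard-coded here for illustration.
entities = ["type 2 diabetes mellitus", "insulin resistance"]
facts = ["type 2 diabetes mellitus | is_a | endocrine disorder"]

inputs = tokenizer(
    build_knowledge_augmented_input(article, entities, facts),
    return_tensors="pt",
    truncation=True,
)
summary_ids = model.generate(**inputs, max_new_tokens=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Prepending the entity and fact strings gives the encoder explicit access to the knowledge the generated summary should stay consistent with, which is the intuition behind evaluating entity-level factual accuracy alongside ROUGE.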

Related research

Mind The Facts: Knowledge-Boosted Coherent Abstractive Text Summarization (06/27/2020)
Neural models have become successful at producing abstractive summaries ...

Text Summarization of Czech News Articles Using Named Entities (04/21/2021)
The foundation for the research of summarization in the Czech language w...

Improving the Factual Accuracy of Abstractive Clinical Text Summarization using Multi-Objective Optimization (04/02/2022)
While there has been recent progress in abstractive summarization as app...

Faithful to the Document or to the World? Mitigating Hallucinations via Entity-linked Knowledge in Abstractive Summarization (04/28/2022)
Despite recent advances in abstractive summarization, current summarizat...

FactKB: Generalizable Factuality Evaluation using Language Models Enhanced with Factual Knowledge (05/14/2023)
Evaluating the factual consistency of automatically generated summaries ...

KATSum: Knowledge-aware Abstractive Text Summarization (12/06/2022)
Text Summarization is recognised as one of the NLP downstream tasks and ...

Knowledge Graph-Augmented Abstractive Summarization with Semantic-Driven Cloze Reward (05/03/2020)
Sequence-to-sequence models for abstractive summarization have been stud...
