Enriching and Controlling Global Semantics for Text Summarization

09/22/2021
by   Thong Nguyen, et al.

Recently, Transformer-based models have proven effective at abstractive summarization, producing fluent and informative summaries. Nevertheless, these models still suffer from a short-range dependency problem, which can cause them to produce summaries that miss the key points of the document. In this paper, we attempt to address this issue by introducing a neural topic model, empowered with normalizing flow, to capture the global semantics of the document, which are then integrated into the summarization model. In addition, to avoid the global semantics overwhelming the contextualized representation, we introduce a mechanism that controls the amount of global semantics supplied to the text generation module. Our method outperforms state-of-the-art summarization models on five common text summarization datasets, namely CNN/DailyMail, XSum, Reddit TIFU, arXiv, and PubMed.
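The controlling mechanism described above can be pictured as a learned gate that decides, per token, how much of the global topic vector to mix into the contextualized representation. The following is a minimal NumPy sketch of one such gated fusion; the function name, the concatenation-based gate, and the residual mixing are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_semantic_fusion(h, t, W, b):
    """Blend a contextualized representation h with a global topic
    vector t. A sigmoid gate g in (0, 1) controls how much global
    semantics reaches the generation module (hypothetical form)."""
    g = sigmoid(W @ np.concatenate([h, t]) + b)  # per-dimension gate
    return h + g * t                             # gated residual mix

# Toy usage with random parameters (illustrative only)
rng = np.random.default_rng(0)
d = 4
h = rng.normal(size=d)           # contextualized token representation
t = rng.normal(size=d)           # global topic vector from the topic model
W = rng.normal(size=(d, 2 * d))  # gate projection over [h; t]
b = np.zeros(d)
fused = gated_semantic_fusion(h, t, W, b)
```

Because the gate saturates toward 0 or 1 per dimension, the model can suppress the topic signal where local context suffices and amplify it where global structure matters, which is the intuition behind avoiding the "overwhelming effect" of global semantics.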


Related research

10/20/2020
Topic-Aware Abstractive Text Summarization
Automatic text summarization aims at condensing a document to a shorter ...

10/08/2021
VieSum: How Robust Are Transformer-based Models on Vietnamese Summarization?
Text summarization is a challenging task within natural language process...

09/27/2018
Iterative Document Representation Learning Towards Summarization with Polishing
In this paper, we introduce Iterative Text Summarization (ITS), an itera...

05/29/2023
Abstractive Summarization as Augmentation for Document-Level Event Detection
Transformer-based models have consistently produced substantial performa...

06/09/2016
Neural Network-Based Abstract Generation for Opinions and Arguments
We study the problem of generating abstractive summaries for opinionated...

05/03/2020
Knowledge Graph-Augmented Abstractive Summarization with Semantic-Driven Cloze Reward
Sequence-to-sequence models for abstractive summarization have been stud...

11/23/2019
Controlling the Amount of Verbatim Copying in Abstractive Summarization
An abstract must not change the meaning of the original text. A single m...
