Topic-Aware Abstractive Text Summarization

10/20/2020
by Chujie Zheng, et al.

Automatic text summarization aims to condense a document into a shorter version while preserving its key information. Unlike extractive summarization, which simply selects text fragments from the document, abstractive summarization generates the summary word by word. Most current state-of-the-art (SOTA) abstractive summarization methods build on the Transformer encoder-decoder architecture and focus on novel self-supervised objectives for pre-training. While these models capture contextual relationships among words well, little attention has been paid to incorporating global semantics when fine-tuning for the downstream abstractive summarization task. In this study, we propose a topic-aware abstractive summarization (TAAS) framework that leverages the underlying semantic structure of documents as represented by their latent topics. Specifically, TAAS seamlessly incorporates a neural topic model into an encoder-decoder sequence generation procedure via attention. This design learns and preserves the global semantics of documents and thus makes summarization more effective, as demonstrated by our experiments on real-world datasets. Compared against several cutting-edge baselines, TAAS outperforms BART, a well-recognized SOTA model, on ROUGE-1, ROUGE-2, and ROUGE-L. TAAS also achieves performance comparable to PEGASUS and ProphetNet, which is difficult to accomplish given that training PEGASUS and ProphetNet requires computing capacity far beyond what we used in this study.
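
To make the described architecture concrete, below is a minimal PyTorch sketch of the general pattern the abstract outlines: a VAE-style neural topic model infers a document-topic mixture from a bag-of-words vector, and that mixture is projected into the encoder-decoder's attention memory so the decoder can attend to global topic information alongside the token-level encoder states. All names here (NeuralTopicModel, TopicFusion, n_topics) are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralTopicModel(nn.Module):
    """VAE-style topic model over bag-of-words vectors (a common NTM design)."""
    def __init__(self, vocab_size, n_topics=50, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, n_topics)
        self.logvar = nn.Linear(hidden, n_topics)
        self.decoder = nn.Linear(n_topics, vocab_size)  # topic-word logits

    def forward(self, bow):
        h = self.encoder(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        theta = F.softmax(z, dim=-1)                          # doc-topic mixture
        recon = self.decoder(theta)                           # BoW reconstruction
        return theta, recon, mu, logvar

class TopicFusion(nn.Module):
    """Injects the doc-topic vector into sequence generation via attention:
    the topic mixture becomes an extra memory vector that the decoder
    cross-attends to alongside the encoder hidden states."""
    def __init__(self, n_topics, d_model):
        super().__init__()
        self.topic_proj = nn.Linear(n_topics, d_model)

    def forward(self, encoder_states, theta):
        # encoder_states: (batch, src_len, d_model); theta: (batch, n_topics)
        topic_mem = self.topic_proj(theta).unsqueeze(1)       # (batch, 1, d_model)
        return torch.cat([topic_mem, encoder_states], dim=1)  # extended memory

# Toy usage: extend the encoder memory before decoder cross-attention.
batch, src_len, d_model, vocab = 2, 16, 768, 30522
bow = torch.rand(batch, vocab)
enc = torch.randn(batch, src_len, d_model)
ntm, fuse = NeuralTopicModel(vocab), TopicFusion(n_topics=50, d_model=d_model)
theta, recon, mu, logvar = ntm(bow)
memory = fuse(enc, theta)   # (batch, src_len + 1, d_model)
```

The ROUGE-1, ROUGE-2, and ROUGE-L figures cited above are standard unigram, bigram, and longest-common-subsequence overlap metrics. As a hedged illustration, they can be computed with Google's open-source rouge-score package (pip install rouge-score); the reference and candidate strings below are toy inputs, not data or results from the paper.

```python
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
reference = "the cat sat on the mat"
candidate = "the cat lay on the mat"
scores = scorer.score(reference, candidate)
for name, s in scores.items():
    print(f"{name}: P={s.precision:.3f} R={s.recall:.3f} F1={s.fmeasure:.3f}")
```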

research · 12/17/2021
Topic-Aware Encoding for Extractive Summarization
Document summarization provides an instrument for faster understanding t...

research · 09/22/2021
Enriching and Controlling Global Semantics for Text Summarization
Recently, Transformer-based models have been proven effective in the abs...

research · 12/18/2019
PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization
Recent work pre-training Transformers with self-supervised objectives on...

research · 08/19/2019
Topic Augmented Generator for Abstractive Summarization
Steady progress has been made in abstractive summarization with attentio...

research · 08/19/2018
Adapting the Neural Encoder-Decoder Framework from Single to Multi-Document Summarization
Generating an abstract from a set of relevant documents remains challeng...

research · 09/17/2019
Extractive Summarization of Long Documents by Combining Global and Local Context
In this paper, we propose a novel neural single document extractive summ...

research · 09/19/2017
MetaLDA: a Topic Model that Efficiently Incorporates Meta information
Besides the text content, documents and their associated words usually c...
