A more abstractive summarization model

02/25/2020
by Satyaki Chakraborty et al.

The pointer-generator network is an extremely popular approach to text summarization. More recent work in this area still builds on top of the baseline pointer-generator, for example by augmenting it with a content-selection phase or by decomposing the decoder into a contextual network and a language model. However, all such models based on the pointer-generator architecture struggle to generate novel words in the summary and mostly copy words from the source text. In this work, we first investigate thoroughly why the pointer-generator network fails to generate novel words, and then address the problem by adding an out-of-vocabulary (OOV) penalty. This significantly improves the level of novelty/abstraction in the generated summaries. We use normalized n-gram novelty scores as the metric for the level of abstraction. We also report ROUGE scores (R-1, R-2, R-L) of our model, since most summarization models are evaluated with these metrics.
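To make the abstraction metric concrete, the sketch below shows one plausible way to compute an n-gram novelty score: the fraction of n-grams in a generated summary that do not occur in the source article, optionally scaled by the summary-to-source length ratio. The function names and the exact normalization are illustrative assumptions, not the paper's definition.

```python
# Minimal sketch (illustrative, not the paper's exact formulation) of a
# normalized n-gram novelty score for abstractive summaries.

from typing import List, Set, Tuple


def ngrams(tokens: List[str], n: int) -> Set[Tuple[str, ...]]:
    """Return the set of n-grams (as tuples) contained in a token list."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}


def ngram_novelty(source: List[str], summary: List[str], n: int = 2,
                  normalize: bool = True) -> float:
    """Fraction of summary n-grams that are absent from the source.

    When `normalize` is True, the raw ratio is scaled by the
    summary-to-source length ratio (one plausible normalization;
    the paper's scheme may differ).
    """
    summary_ngrams = ngrams(summary, n)
    if not summary_ngrams:
        return 0.0
    source_ngrams = ngrams(source, n)
    novel = sum(1 for g in summary_ngrams if g not in source_ngrams)
    score = novel / len(summary_ngrams)
    if normalize:
        score *= len(summary) / max(len(source), 1)
    return score


if __name__ == "__main__":
    src = "the cat sat on the mat near the door".split()
    summ = "a cat rested on the mat".split()
    # 3 of the 5 summary bigrams are novel -> 0.6 (unnormalized)
    print(round(ngram_novelty(src, summ, n=2, normalize=False), 3))
```

Under this reading, a purely extractive summary scores near 0, while a summary that paraphrases heavily scores higher; the OOV penalty described above is what pushes the model toward the latter.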
