Point-less: More Abstractive Summarization with Pointer-Generator Networks

04/18/2019
by   Freek Boutkan, et al.

The Pointer-Generator architecture has proven to be a significant improvement for abstractive summarization seq2seq models. However, the summaries produced by this model are largely extractive, with over 30% copied from the source text. This work proposes a multihead attention mechanism, pointer dropout, and two new loss functions to promote more abstractive summaries while maintaining similar ROUGE scores. Neither the multihead attention nor the dropout improves N-gram novelty; however, the dropout acts as a regularizer, which improves the ROUGE score. The new loss functions achieve significantly more novel N-grams and sentences, at the cost of a slightly lower ROUGE score.
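The two mechanisms the abstract refers to can be sketched as follows: a pointer-generator mixes a generation distribution over the vocabulary with a copy distribution induced by attention over the source, and pointer dropout occasionally disables the copy branch so the model must generate rather than copy. The function name, signature, and dropout placement below are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

def pointer_generator_dist(p_vocab, attention, src_ids, p_gen, vocab_size,
                           pointer_dropout=0.0, rng=None):
    """Mix the generator's vocabulary distribution with the copy (pointer)
    distribution: p(w) = p_gen * P_vocab(w) + (1 - p_gen) * P_copy(w).

    With probability `pointer_dropout`, the copy branch is disabled for this
    decoding step, forcing reliance on generation alone (a sketch of the
    pointer-dropout idea described in the abstract).
    """
    rng = rng or np.random.default_rng()
    if pointer_dropout > 0 and rng.random() < pointer_dropout:
        return p_vocab  # copy path dropped for this step: pure generation
    # Scatter attention weights onto the vocabulary ids of the source tokens;
    # repeated source tokens accumulate their attention mass.
    copy_dist = np.zeros(vocab_size)
    np.add.at(copy_dist, src_ids, attention)
    return p_gen * p_vocab + (1.0 - p_gen) * copy_dist

# Toy example: vocabulary of 5 tokens, source sequence [2, 3, 2].
p_vocab = np.array([0.1, 0.2, 0.3, 0.3, 0.1])
attn = np.array([0.5, 0.2, 0.3])  # attention over the 3 source positions
dist = pointer_generator_dist(p_vocab, attn, np.array([2, 3, 2]),
                              p_gen=0.6, vocab_size=5)
print(dist)  # mixed distribution; still sums to 1
```

Raising `pointer_dropout` during training is what pushes the model away from copying; the loss functions mentioned in the abstract instead penalize copying directly in the objective.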

