Abstractive Summarization Using Attentive Neural Techniques

10/20/2018
by Jacob Krantz, et al.

In a world of proliferating data, the ability to rapidly summarize text is growing in importance. Automatic text summarization can be thought of as a sequence-to-sequence problem. Another area of natural language processing that solves a sequence-to-sequence problem is machine translation, which is rapidly evolving due to the development of attention-based encoder-decoder networks. This work applies these modern techniques to abstractive summarization. We analyze various attention mechanisms for summarization with the goal of developing an approach and architecture that improve the state of the art. In particular, we modify and optimize a translation model with self-attention for generating abstractive sentence summaries. The effectiveness of this base model and its attention variants is compared and analyzed on standardized evaluation sets and test metrics. However, we show that these metrics are limited in their ability to effectively score abstractive summaries, and propose a new approach based on the intuition that an abstractive model requires an abstractive evaluation.
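For reference, the self-attention operation that the abstract's translation model builds on can be sketched in a few lines. The following is a minimal, illustrative NumPy implementation of single-head scaled dot-product self-attention; the function names, matrix shapes, and toy inputs are assumptions for demonstration and are not the paper's actual architecture or code.

```python
# Illustrative sketch (not the authors' implementation): scaled dot-product
# self-attention, the core operation of Transformer-style translation models.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a token sequence.

    x:             (seq_len, d_model) input token representations
    w_q, w_k, w_v: (d_model, d_k) projection matrices
    returns:       (seq_len, d_k) attended representations
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v       # project to queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])   # scaled pairwise similarity of tokens
    weights = softmax(scores, axis=-1)        # attention distribution per query token
    return weights @ v                        # weighted sum of value vectors

# Toy usage: 5 tokens with 16-dimensional embeddings projected to an 8-dim head.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))
w_q, w_k, w_v = (rng.normal(size=(16, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 8)
```

In a full encoder-decoder summarizer, several such heads run in parallel and are stacked with feed-forward layers; the sketch above only isolates the attention computation itself.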


