Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization

12/31/2016
by Jun Suzuki, et al.

This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target-vocabulary word in the encoder and to control the output words in the decoder based on this estimation. Our method shows significant improvement over a strong RNN-based encoder-decoder baseline and achieves the best results on an abstractive summarization benchmark.
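A minimal sketch of this idea, written in PyTorch and not taken from the authors' implementation: a hypothetical FrequencyEstimator predicts an upper-bound count for every target-vocabulary word from the encoder state, and a decode_with_frequency_cutoff step masks the decoder's logits for any word that has already been emitted that many times. The sigmoid-bounded estimator, the greedy decoding loop, and all names are illustrative assumptions.

```python
# Illustrative sketch of encoder-side frequency estimation plus decoder-side
# cutoff; names, dimensions, and the greedy loop are assumptions, not the
# authors' released code.
import torch
import torch.nn as nn


class FrequencyEstimator(nn.Module):
    """Predicts, from the encoder's final state, an upper bound on how many
    times each target-vocabulary word may appear in the output summary."""

    def __init__(self, hidden_size: int, vocab_size: int, max_freq: int = 3):
        super().__init__()
        self.proj = nn.Linear(hidden_size, vocab_size)
        self.max_freq = max_freq

    def forward(self, encoder_state: torch.Tensor) -> torch.Tensor:
        # Squash each word's estimate into [0, max_freq]; a real model would
        # train this jointly with the summarizer against reference word counts.
        return self.max_freq * torch.sigmoid(self.proj(encoder_state))


def decode_with_frequency_cutoff(step_logits, freq_bound, counts):
    """Mask words whose emitted count has reached the estimated bound, then
    pick the next word greedily (beam search would apply the same mask)."""
    exhausted = counts >= freq_bound                  # words over their quota
    masked = step_logits.masked_fill(exhausted, float("-inf"))
    next_word = masked.argmax(dim=-1)
    counts[next_word] += 1.0                          # track emitted counts
    return next_word


if __name__ == "__main__":
    vocab, hidden = 50, 32
    estimator = FrequencyEstimator(hidden, vocab)
    enc_state = torch.randn(hidden)
    bound = estimator(enc_state)                      # per-word upper bound
    counts = torch.zeros(vocab)
    for _ in range(10):                               # toy decoding loop
        logits = torch.randn(vocab)                   # stand-in decoder logits
        w = decode_with_frequency_cutoff(logits, bound, counts)
        print(int(w))
```

In the paper the frequency estimate is learned jointly with the summarizer; the random logits above only stand in for a real decoder.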


Related research

04/05/2022 · Abstractive summarization of hospitalisation histories with transformer networks
In this paper we present a novel approach to abstractive summarization o...

09/18/2018 · Bidirectional Attentional Encoder-Decoder Model and Bidirectional Beam Search for Abstractive Summarization
Sequence generative models with RNN variants, such as LSTM, GRU, show pr...

01/01/2023 · Inflected Forms Are Redundant in Question Generation Models
Neural models with an encoder-decoder framework provide a feasible solut...

07/04/2016 · Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization
In this work, we introduce temporal hierarchies to the sequence to seque...

09/18/2020 · Forecasting time series with encoder-decoder neural networks
In this paper, we consider high-dimensional stationary processes where a...

11/09/2019 · Enforcing Encoder-Decoder Modularity in Sequence-to-Sequence Models
Inspired by modular software design principles of independence, intercha...

07/26/2018 · Variational Memory Encoder-Decoder
Introducing variability while maintaining coherence is a core task in le...
