Closed-Book Training to Improve Summarization Encoder Memory

09/12/2018
by Yichen Jiang, et al.

A good neural sequence-to-sequence summarization model should have a strong encoder that can distill and memorize the important information from long input texts so that the decoder can generate salient summaries based on the encoder's memory. In this paper, we aim to improve the memorization capabilities of the encoder of a pointer-generator model by adding an additional 'closed-book' decoder without attention and pointer mechanisms. Such a decoder forces the encoder to be more selective about the information encoded in its memory state, because the decoder cannot rely on the extra information provided by the attention and copy mechanisms, and this in turn improves the entire model. On the CNN/Daily Mail dataset, our 2-decoder model outperforms the baseline significantly in terms of ROUGE and METEOR metrics, for both cross-entropy and reinforced setups (and on human evaluation). Moreover, our model also achieves higher scores in a test-only DUC-2002 generalizability setup. We further present a memory ability test, two saliency metrics, as well as several sanity-check ablations (based on fixed-encoder, gradient-flow cut, and model capacity) to show that the encoder of our 2-decoder model does in fact learn stronger memory representations than the baseline encoder.
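To make the architecture concrete, below is a minimal, hypothetical PyTorch-style sketch of the 2-decoder idea, not the authors' code: a shared encoder feeds both an attentional decoder (a simplified stand-in for the full pointer-generator decoder) and a closed-book decoder that conditions only on the encoder's final memory state. All module names, dimensions, and the loss-mixing weight are illustrative assumptions.

```python
# Sketch of a 2-decoder summarizer: shared encoder, attentional decoder,
# and a "closed-book" decoder with no attention or copying. Hypothetical
# names and hyperparameters; the attention here is plain dot-product
# attention, not the full pointer-generator mechanism.
import torch
import torch.nn as nn

class TwoDecoderSummarizer(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                               bidirectional=True)
        # Attentional decoder (stand-in for the pointer-generator decoder).
        self.attn_decoder = nn.LSTM(emb_dim, 2 * hid_dim, batch_first=True)
        self.attn_out = nn.Linear(4 * hid_dim, vocab_size)
        # Closed-book decoder: no attention, no copying; it must rely
        # entirely on the encoder's final memory state.
        self.cb_decoder = nn.LSTM(emb_dim, 2 * hid_dim, batch_first=True)
        self.cb_out = nn.Linear(2 * hid_dim, vocab_size)

    def forward(self, src, tgt_in):
        src_emb, tgt_emb = self.embed(src), self.embed(tgt_in)
        enc_states, (h, c) = self.encoder(src_emb)               # (B, T_src, 2H)
        # Merge the bidirectional final states into one memory vector.
        h0 = torch.cat([h[0], h[1]], dim=-1).unsqueeze(0)        # (1, B, 2H)
        c0 = torch.cat([c[0], c[1]], dim=-1).unsqueeze(0)

        # Attentional decoder with dot-product attention over encoder states.
        dec_states, _ = self.attn_decoder(tgt_emb, (h0, c0))     # (B, T_tgt, 2H)
        scores = torch.bmm(dec_states, enc_states.transpose(1, 2))
        context = torch.bmm(scores.softmax(dim=-1), enc_states)  # (B, T_tgt, 2H)
        attn_logits = self.attn_out(torch.cat([dec_states, context], dim=-1))

        # Closed-book decoder: conditioned only on the encoder memory state.
        cb_states, _ = self.cb_decoder(tgt_emb, (h0, c0))
        cb_logits = self.cb_out(cb_states)
        return attn_logits, cb_logits

if __name__ == "__main__":
    torch.manual_seed(0)
    model = TwoDecoderSummarizer(vocab_size=1000)
    src = torch.randint(1, 1000, (4, 50))      # toy source batch
    tgt_in = torch.randint(1, 1000, (4, 12))   # decoder inputs (shifted right)
    tgt_out = torch.randint(1, 1000, (4, 12))  # gold next tokens
    attn_logits, cb_logits = model(src, tgt_in)
    ce = nn.CrossEntropyLoss()
    # Joint objective: the closed-book branch regularizes the shared encoder;
    # the 0.5 mixing weight is a hypothetical hyperparameter.
    loss = ce(attn_logits.flatten(0, 1), tgt_out.flatten()) + \
           0.5 * ce(cb_logits.flatten(0, 1), tgt_out.flatten())
    loss.backward()
```

The key design point is that both decoders backpropagate through the same encoder, so the closed-book branch pressures the encoder's memory state to carry salient content on its own; at test time only the attentional decoder would be used to generate summaries.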


Related research

12/24/2019 · Improving Abstractive Text Summarization with History Aggregation
Recent neural sequence to sequence models have provided feasible solutio...

10/21/2020 · Learning to Summarize Long Texts with Memory Compression and Transfer
We introduce Mem2Mem, a memory-to-memory mechanism for hierarchical recu...

04/24/2017 · Selective Encoding for Abstractive Sentence Summarization
We propose a selective encoding model to extend the sequence-to-sequence...

11/08/2019 · Resurrecting Submodularity in Neural Abstractive Summarization
Submodularity is a desirable property for a variety of objectives in sum...

05/10/2018 · Global Encoding for Abstractive Summarization
In neural abstractive summarization, the conventional sequence-to-sequen...

04/07/2020 · Windowing Models for Abstractive Summarization of Long Texts
Neural summarization models suffer from the fixed-size input limitation:...

09/17/2018 · Quantum Statistics-Inspired Neural Attention
Sequence-to-sequence (encoder-decoder) models with attention constitute ...
