Learning to Summarize Long Texts with Memory Compression and Transfer

10/21/2020
by Jaehong Park, et al.

We introduce Mem2Mem, a memory-to-memory mechanism for hierarchical recurrent neural network based encoder-decoder architectures, and we explore its use for abstractive document summarization. Mem2Mem transfers "memories" via readable/writable external memory modules that augment both the encoder and decoder. Our memory regularization compresses an encoded input article into a more compact set of sentence representations. Most importantly, the memory compression step performs implicit extraction without labels, sidestepping issues with suboptimal ground-truth data and the exposure bias of hybrid extractive-abstractive summarization techniques. By allowing the decoder to read from and write to the encoded input memory, the model learns to attend to salient information about the input article while keeping track of what has been generated. Our Mem2Mem approach yields results that are competitive with state-of-the-art transformer-based summarization methods, but with 16 times fewer parameters.
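The readable/writable external memory described above can be illustrated with a minimal sketch. This is not the paper's implementation: the dot-product addressing, the scalar write gate, and all names (`ExternalMemory`, `read`, `write`) are our own simplifying assumptions, chosen only to show the general read/write-attention pattern such modules follow.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

class ExternalMemory:
    """Toy memory of `num_slots` d-dimensional slots with content-based
    (attention) read and an erase/add-style gated write.

    Hypothetical sketch; not the Mem2Mem module from the paper."""

    def __init__(self, num_slots, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.mem = 0.1 * rng.standard_normal((num_slots, dim))

    def read(self, query):
        # Attend over slots by dot-product similarity with the query.
        weights = softmax(self.mem @ query)        # (num_slots,)
        return weights @ self.mem, weights         # read vector, weights

    def write(self, query, value, gate=0.5):
        # Blend `value` into each slot in proportion to its attention
        # weight: a simplified erase-then-add update.
        _, weights = self.read(query)
        w = gate * weights[:, None]                # (num_slots, 1)
        self.mem = (1.0 - w) * self.mem + w * value
```

In a decoder loop, `read` would retrieve salient encoded-article content for the current decoding step, while `write` would update the memory to track what has already been generated.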

Related research

- 09/12/2018: Closed-Book Training to Improve Summarization Encoder Memory
- 04/05/2022: Abstractive summarization of hospitalisation histories with transformer networks
- 01/30/2018: Generating Wikipedia by Summarizing Long Sequences
- 12/24/2019: Improving Abstractive Text Summarization with History Aggregation
- 07/26/2018: Variational Memory Encoder-Decoder
- 09/27/2018: Iterative Document Representation Learning Towards Summarization with Polishing
- 02/25/2023: Abstractive Text Summarization using Attentive GRU based Encoder-Decoder
