Improving Abstractive Text Summarization with History Aggregation

12/24/2019
by   Pengcheng Liao, et al.

Recent neural sequence-to-sequence models have provided feasible solutions for abstractive summarization. However, such models still struggle to capture long-range dependencies in the summarization task. A high-quality summarization system usually depends on a strong encoder that can distill the important information from long input texts, so that the decoder can generate salient summaries from the encoder's memory. In this paper, we propose an aggregation mechanism based on the Transformer model to address the challenge of long-text representation. Our model can review history information, allowing the encoder to retain greater memory capacity. Empirically, we apply our aggregation mechanism to the Transformer model and run experiments on the CNN/DailyMail dataset, achieving higher-quality summaries than several strong baseline models on the ROUGE metrics.
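Below is a minimal, hypothetical sketch of what a history-aggregation Transformer encoder could look like, assuming the mechanism lets each encoder layer attend over the outputs of all earlier layers (the "history") in addition to standard self-attention. The class and parameter names (HistoryAggregationLayer, hist_attn, etc.) are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of a Transformer encoder with a history-aggregation sub-layer.
# Assumption: each layer cross-attends to the concatenated outputs of all
# earlier layers so the encoder can "review" its history.
import torch
import torch.nn as nn


class HistoryAggregationLayer(nn.Module):
    def __init__(self, d_model=512, nhead=8, dim_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout, batch_first=True)
        # Hypothetical aggregation sub-layer: queries come from the current
        # representation, keys/values from the stacked history.
        self.hist_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, dim_ff), nn.ReLU(), nn.Linear(dim_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, history):
        # Standard self-attention over the current layer input.
        attn_out, _ = self.self_attn(x, x, x)
        x = self.norm1(x + self.dropout(attn_out))
        # Aggregate history: attend over all earlier layers' outputs.
        if history:
            mem = torch.cat(history, dim=1)  # (batch, n_prev_layers * seq_len, d_model)
            hist_out, _ = self.hist_attn(x, mem, mem)
            x = self.norm2(x + self.dropout(hist_out))
        # Position-wise feed-forward with residual connection.
        x = self.norm3(x + self.dropout(self.ff(x)))
        return x


class HistoryAggregationEncoder(nn.Module):
    def __init__(self, num_layers=6, d_model=512, nhead=8):
        super().__init__()
        self.layers = nn.ModuleList(
            [HistoryAggregationLayer(d_model, nhead) for _ in range(num_layers)]
        )

    def forward(self, x):
        history = []  # outputs of earlier layers, kept as extra encoder memory
        for layer in self.layers:
            x = layer(x, history)
            history.append(x)
        return x


if __name__ == "__main__":
    enc = HistoryAggregationEncoder(num_layers=2, d_model=64, nhead=4)
    tokens = torch.randn(2, 40, 64)      # (batch, seq_len, d_model)
    print(enc(tokens).shape)             # torch.Size([2, 40, 64])
```

Keeping the history as a list of per-layer outputs (rather than overwriting the representation) is one simple way to give the encoder more memory capacity for long inputs; the paper's actual aggregation may differ.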


Related research

09/12/2018 - Closed-Book Training to Improve Summarization Encoder Memory
A good neural sequence-to-sequence summarization model should have a str...

06/18/2020 - SEAL: Segment-wise Extractive-Abstractive Long-form Text Summarization
Most prior work in the sequence-to-sequence paradigm focused on datasets...

04/26/2020 - Experiments with LVT and FRE for Transformer model
In this paper, we experiment with Large Vocabulary Trick and Feature-ric...

04/07/2020 - Windowing Models for Abstractive Summarization of Long Texts
Neural summarization models suffer from the fixed-size input limitation:...

03/17/2022 - HiStruct+: Improving Extractive Text Summarization with Hierarchical Structure Information
Transformer-based language models usually treat texts as linear sequence...

02/21/2020 - On the impressive performance of randomly weighted encoders in summarization tasks
In this work, we investigate the performance of untrained randomly initi...

10/21/2020 - Learning to Summarize Long Texts with Memory Compression and Transfer
We introduce Mem2Mem, a memory-to-memory mechanism for hierarchical recu...
