Leveraging Locality in Abstractive Text Summarization

05/25/2022
by Yixin Liu, et al.

Despite the successes of neural attention models on natural language generation tasks, the quadratic memory complexity of the self-attention module with respect to input length hinders their application to long text summarization. Instead of designing more efficient attention modules, we approach this problem by investigating whether models with a restricted context can achieve competitive performance compared with memory-efficient attention models that maintain a global context by treating the input as a single sequence. Our model is applied to individual pages, which contain parts of the input grouped by the principle of locality, during both the encoding and decoding stages. We empirically investigate three kinds of locality in text summarization at different levels of granularity, ranging from sentences to documents. Our experimental results show that our model outperforms strong baseline models with efficient attention modules, and our analysis provides further insights into our locality-aware modeling strategy.
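The core idea of the abstract, grouping the input into locality-based "pages" so that full self-attention only ever runs within a page, can be illustrated with a short sketch. The following is a minimal, hypothetical illustration and not the authors' released code: it groups consecutive sentences (sentence-level locality) into pages under a token budget. The names `split_sentences` and `group_into_pages`, the naive whitespace token count, and the 512-token default are assumptions made for illustration.

```python
# Minimal sketch of locality-based "paging" for long-document summarization.
# Assumption: sentence-level locality and a fixed token budget per page.
# The function names and the budget are illustrative, not the paper's API.

import re
from typing import List

def split_sentences(text: str) -> List[str]:
    # Naive sentence splitter; a real system would use a proper tokenizer.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def group_into_pages(text: str, max_tokens_per_page: int = 512) -> List[str]:
    """Group consecutive sentences into pages so that self-attention runs
    only within a page, keeping attention memory bounded per page instead
    of quadratic in the full input length."""
    pages: List[str] = []
    current: List[str] = []
    length = 0
    for sent in split_sentences(text):
        n = len(sent.split())  # crude whitespace token count (assumption)
        if current and length + n > max_tokens_per_page:
            pages.append(" ".join(current))
            current, length = [], 0
        current.append(sent)
        length += n
    if current:
        pages.append(" ".join(current))
    return pages

if __name__ == "__main__":
    # Toy usage: a long synthetic document split into pages.
    doc = " ".join(f"Sentence number {i} of a long document." for i in range(200))
    pages = group_into_pages(doc, max_tokens_per_page=128)
    print(len(pages), [len(p.split()) for p in pages[:3]])
```

In the paper's framing, each page would then be encoded independently and the decoder would combine page-local predictions into a single summary; the sketch above covers only the paging step.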


Related research

03/15/2022 · Long Document Summarization with Top-down and Bottom-up Inference
Text summarization aims to condense long documents and retain key inform...

05/28/2021 · Linear-Time Self Attention with Codeword Histogram for Efficient Recommendation
Self-attention has become increasingly popular in a variety of sequence ...

10/12/2021 · Speech Summarization using Restricted Self-Attention
Speech summarization is typically performed by using a cascade of speech...

05/03/2018 · A Hierarchical End-to-End Model for Jointly Improving Text Summarization and Sentiment Classification
Text summarization and sentiment classification both aim to capture the ...

09/01/2019 · Repurposing Decoder-Transformer Language Models for Abstractive Summarization
Neural network models have shown excellent fluency and performance when ...

05/05/2023 · A Suite of Generative Tasks for Multi-Level Multimodal Webpage Understanding
Webpages have been a rich, scalable resource for vision-language and lan...

12/30/2019 · Deep Reinforced Self-Attention Masks for Abstractive Summarization (DR.SAS)
We present a novel architectural scheme to tackle the abstractive summar...