HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization

10/12/2021
by   Ye Liu, et al.

To capture the semantic graph structure of raw text, most existing summarization approaches combine GNNs with a pre-trained language model. However, these methods involve cumbersome pipelines and inefficient computation on long documents. To mitigate these issues, this paper proposes HETFORMER, a Transformer-based pre-trained model with multi-granularity sparse attention for long-text extractive summarization. Specifically, we treat the different types of semantic nodes in raw text as a latent heterogeneous graph and learn the heterogeneous relationships (edges) among nodes directly with the Transformer. Extensive experiments on both single- and multi-document summarization tasks show that HETFORMER achieves state-of-the-art ROUGE F1 while using less memory and fewer parameters.
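The core idea of multi-granularity sparse attention can be illustrated with a simple attention mask: fine-grained token nodes attend only within a local sliding window, while a few coarse-grained positions (e.g. sentence or entity nodes) attend globally, forming the cross-granularity edges of the latent graph. The sketch below is a minimal illustration of that general pattern, not the paper's exact attention configuration; the window size and the choice of global positions are assumptions for the example.

```python
def sparse_attention_mask(seq_len, window, global_idx):
    """Build a boolean attention mask (True = attention allowed).

    Combines two sparsity patterns:
      * local sliding-window attention among token positions
        (each token sees its `window` neighbors on either side);
      * global attention for the positions in `global_idx`
        (illustrative stand-ins for sentence/entity nodes), which
        attend to, and are attended by, every position.
    """
    mask = [[False] * seq_len for _ in range(seq_len)]
    # Local window edges: token i attends to tokens i-window .. i+window.
    for i in range(seq_len):
        for j in range(max(0, i - window), min(seq_len, i + window + 1)):
            mask[i][j] = True
    # Global edges: full row and column for each global node.
    for g in global_idx:
        for j in range(seq_len):
            mask[g][j] = True
            mask[j][g] = True
    return mask

# Position 0 acts as a global (sentence-level) node in this toy example.
mask = sparse_attention_mask(seq_len=8, window=1, global_idx=[0])
```

Because each row of the mask has O(window) entries plus a constant number of global nodes, the attention cost grows linearly with sequence length instead of quadratically, which is what makes long-document inputs tractable.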

Related research

- 05/08/2021: Long-Span Dependencies in Transformer-based Summarization Systems
- 04/26/2020: Heterogeneous Graph Neural Networks for Extractive Document Summarization
- 09/09/2021: ARMAN: Pre-training with Semantically Selecting and Reordering of Sentences for Persian Abstractive Summarization
- 10/10/2021: On Automatic Text Extractive Summarization Based on Graph and pre-trained Language Model Attention
- 10/16/2021: PRIMER: Pyramid-based Masked Sentence Pre-training for Multi-document Summarization
- 10/09/2022: HEGEL: Hypergraph Transformer for Long Document Summarization
- 12/29/2020: SIT3: Code Summarization with Structure-Induced Transformer
