A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents

04/16/2018
by Arman Cohan, et al.

Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.
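The core idea of the abstract can be illustrated as follows: attention is computed hierarchically, first over document sections and then over the words within each section, so a word's contribution to the decoder context is scaled by how relevant its section is. The sketch below is an illustrative simplification under assumed shapes and dot-product scoring, not the authors' implementation (which, among other details, normalizes the combined weights jointly over all words).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def discourse_aware_attention(decoder_state, section_states, word_states):
    """Hedged sketch of discourse-aware attention.

    decoder_state:  (d,) current decoder hidden state
    section_states: (S, d) one encoded vector per document section
    word_states:    list of S arrays, each (n_j, d), word encodings per section
    Returns a (d,) context vector.
    """
    # Section-level attention: how relevant is each section right now?
    beta = softmax(section_states @ decoder_state)          # (S,)
    context = np.zeros_like(decoder_state)
    for j, words in enumerate(word_states):
        # Word-level attention within section j.
        alpha = softmax(words @ decoder_state)              # (n_j,)
        # A word's effective weight is its within-section weight
        # scaled by its section's weight.
        context += beta[j] * (alpha @ words)                # (d,)
    return context
```

Because each `alpha` sums to one and `beta` sums to one over sections, the combined weights form a valid distribution over all words in the document, concentrating mass on words in discourse-relevant sections.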


Related research

- GoSum: Extractive Summarization of Long Documents by Reinforcement Learning and Graph Organized Discourse State (11/18/2022)
  Handling long texts with structural information and excluding redundancy...

- Dialogue Discourse-Aware Graph Convolutional Networks for Abstractive Meeting Summarization (12/07/2020)
  Sequence-to-sequence methods have achieved promising results for textual...

- Extractive Summarization of Long Documents by Combining Global and Local Context (09/17/2019)
  In this paper, we propose a novel neural single document extractive summ...

- Improved Document Modelling with a Neural Discourse Parser (11/16/2019)
  Despite the success of attention-based neural models for natural languag...

- Contrastive Hierarchical Discourse Graph for Scientific Document Summarization (05/31/2023)
  The extended structural context has made scientific paper summarization ...

- Abstractive and mixed summarization for long-single documents (07/03/2020)
  The lack of diversity in the datasets available for automatic summarizat...

- Predicting Discourse Trees from Transformer-based Neural Summarizers (04/14/2021)
  Previous work indicates that discourse information benefits summarizatio...
