Novel Chapter Abstractive Summarization using Spinal Tree Aware Sub-Sentential Content Selection

11/09/2022
by Hardy Hardy, et al.

Summarizing novel chapters is a difficult task due to the input length and the fact that sentences in the desired summaries draw content from multiple places throughout the chapter. We present a pipelined extractive-abstractive approach in which the extractive step filters the content that is passed to the abstractive component. The extremely long input also yields a dataset highly skewed towards negative instances for extractive summarization; we therefore adopt a margin ranking loss for extraction to encourage separation between positive and negative examples. Our extraction component operates at the constituent level; to support this, we enrich the text with spinal tree information, which provides syntactic context (in the form of constituents) to the extraction model. We show an improvement of 3.71 ROUGE-1 points over the best results reported in prior work on an existing novel chapter dataset.
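The margin ranking loss mentioned above can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the scores and the pairing of positive and negative constituents are hypothetical, and the loss shown is the standard pairwise form max(0, margin - (pos - neg)) averaged over pairs.

```python
def margin_ranking_loss(pos_scores, neg_scores, margin=1.0):
    """Mean of max(0, margin - (pos - neg)) over paired scores.

    Penalizes any pair where a positive (summary-worthy) constituent's
    score fails to exceed a negative constituent's score by `margin`,
    encouraging separation between the two classes.
    """
    losses = [max(0.0, margin - (p - n))
              for p, n in zip(pos_scores, neg_scores)]
    return sum(losses) / len(losses)

# Illustrative extraction-model scores (hypothetical values).
pos = [0.8, 0.6, 0.9]   # constituents aligned with the gold summary
neg = [0.3, 0.5, 0.2]   # constituents not aligned with it
print(round(margin_ranking_loss(pos, neg), 4))  # 0.5667
```

With a skewed dataset, training on ranked positive/negative pairs like this sidesteps the class imbalance that a plain classification loss over all constituents would face.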
