Transfer Learning for Abstractive Summarization at Controllable Budgets

02/18/2020
by Ritesh Sarkhel, et al.

Summarizing a document within an allocated budget while maintaining its major concepts is a challenging task. It becomes even more difficult if the budget can take an arbitrary value and is not known beforehand. Most existing methods for abstractive summarization, including state-of-the-art neural models, are data-intensive; if the number of available training samples is limited, they fail to construct high-quality summaries. We propose MLS, an end-to-end framework that generates abstractive summaries with limited training data at arbitrary compression budgets. MLS employs a pair of supervised sequence-to-sequence networks. The first network, MFS-Net, constructs a minimal feasible summary by identifying the key concepts of the input document. The second network, the Pointer-Magnifier, then generates the final summary from the minimal feasible summary by leveraging an interpretable multi-headed attention model. Experiments on two cross-domain datasets show that MLS outperforms baseline methods on a range of success metrics, including ROUGE and METEOR. We observed an improvement of approximately 4% at varying compression budgets. Results from a human evaluation study further establish the effectiveness of MLS in generating complete, coherent summaries at arbitrary compression budgets.
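
The two-stage design described above lends itself to a rough sketch. The code below is a minimal illustration, not the authors' implementation: it wires two generic GRU-based encoder-decoder networks together, using the first as a stand-in for MFS-Net and the second for the Pointer-Magnifier, and it enforces the compression budget only as a hard cap on decoding length. All class, function, and parameter names here (Seq2Seq, summarize, the 64-token cap on the minimal feasible summary) are assumptions made for illustration; the paper's actual networks rely on an interpretable multi-headed attention model and a pointer mechanism that this sketch does not reproduce.

    import torch
    import torch.nn as nn


    class Seq2Seq(nn.Module):
        """Generic encoder-decoder stand-in for one stage of the pipeline (illustrative only)."""

        def __init__(self, vocab_size: int, hidden_size: int = 256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden_size)
            self.encoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
            self.decoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
            self.out = nn.Linear(hidden_size, vocab_size)

        @torch.no_grad()
        def decode(self, src_ids: torch.Tensor, max_len: int,
                   bos_id: int = 1, eos_id: int = 2) -> torch.Tensor:
            """Greedily decode at most `max_len` tokens from the encoded input."""
            _, state = self.encoder(self.embed(src_ids))           # encode the source sequence
            token = torch.full((src_ids.size(0), 1), bos_id, dtype=torch.long)
            generated = []
            for _ in range(max_len):                                # budget enforced as a length cap
                step, state = self.decoder(self.embed(token), state)
                token = self.out(step).argmax(dim=-1)               # greedy next-token choice
                generated.append(token)
                if (token == eos_id).all():
                    break
            return torch.cat(generated, dim=1)


    def summarize(doc_ids: torch.Tensor, budget: int,
                  mfs_net: Seq2Seq, pointer_magnifier: Seq2Seq) -> torch.Tensor:
        """Two-stage summarization: key concepts first, then a budget-length summary."""
        # Stage 1: build a minimal feasible summary capturing key concepts
        # (the fixed 64-token cap is an arbitrary placeholder, not from the paper).
        minimal_summary = mfs_net.decode(doc_ids, max_len=64)
        # Stage 2: rewrite the minimal feasible summary into the final summary,
        # capped at the user-specified compression budget.
        return pointer_magnifier.decode(minimal_summary, max_len=budget)

Note that treating the budget purely as a decoding cap is a simplification; the abstract indicates that MLS targets arbitrary budgets directly, which a hard truncation of the output alone would not achieve gracefully.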
