Exploring Multitask Learning for Low-Resource Abstractive Summarization

09/17/2021
by Ahmed Magooda, et al.

This paper explores the effect of using multitask learning for abstractive summarization in the context of small training corpora. In particular, we incorporate four different tasks (extractive summarization, language modeling, concept detection, and paraphrase detection) both individually and in combination, with the goal of enhancing the target task of abstractive summarization via multitask learning. We show that for many task combinations, a model trained in a multitask setting outperforms a model trained only for abstractive summarization, with no additional summarization data introduced. Additionally, we do a comprehensive search and find that certain tasks (e.g. paraphrase detection) consistently benefit abstractive summarization, not only when combined with other tasks but also when using different architectures and training corpora.
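The multitask setup the abstract describes, training a target task alongside auxiliary tasks such as paraphrase detection, is typically implemented by mixing batches from the different tasks and combining their losses. The sketch below illustrates that idea in minimal form; the task names, mixing weights, and helper functions are illustrative assumptions, not the paper's actual configuration.

```python
import random

# Hypothetical task mix: the target task (abstractive summarization)
# plus auxiliary tasks, with assumed batch-mixing proportions.
TASKS = {
    "abstractive_summarization": 0.5,  # target task
    "paraphrase_detection": 0.25,      # auxiliary task
    "language_modeling": 0.25,         # auxiliary task
}

def sample_task_schedule(task_weights, n_steps, seed=0):
    """Draw one task per training step, in proportion to the mixing
    weights, so auxiliary batches are interleaved with target batches."""
    rng = random.Random(seed)
    names = list(task_weights)
    weights = [task_weights[t] for t in names]
    return [rng.choices(names, weights=weights)[0] for _ in range(n_steps)]

def combined_loss(task_losses, loss_weights):
    """Weighted sum of per-task losses, as used when task heads share
    an encoder and are optimized jointly."""
    return sum(loss_weights[t] * loss for t, loss in task_losses.items())

# Example: 1000 training steps interleaving the three tasks.
schedule = sample_task_schedule(TASKS, 1000)

# Example: down-weighting an auxiliary loss relative to the target loss.
loss = combined_loss(
    {"abstractive_summarization": 2.0, "paraphrase_detection": 1.0},
    {"abstractive_summarization": 1.0, "paraphrase_detection": 0.1},
)
```

Under this kind of scheme, "no additional summarization data introduced" means the target task's corpus is unchanged; only the auxiliary batches and their loss terms are added.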


Related research

- 11/24/2022: Multitask Learning for Low Resource Spoken Language Understanding
- 03/25/2023: Identification of Negative Transfers in Multitask Learning Using Surrogate Models
- 05/06/2020: Multitask Models for Supervised Protests Detection in Texts
- 04/29/2022: Polyglot Prompt: Multilingual Multitask PrompTraining
- 02/09/2020: Abstractive Summarization for Low Resource Data using Domain Transfer and Data Synthesis
- 04/01/2021: StyleML: Stylometry with Structure and Multitask Learning for Darkweb Markets
- 01/14/2022: ExtraPhrase: Efficient Data Augmentation for Abstractive Summarization
