Meta-Transfer Learning for Low-Resource Abstractive Summarization

02/18/2021
by Yi-Syuan Chen, et al.

Neural abstractive summarization has been widely studied and achieves strong results with the aid of large corpora. However, when encountering novel tasks, transfer learning does not always help because of domain shift, and overfitting can occur without enough labeled examples. Furthermore, annotations for abstractive summarization are costly, since domain knowledge is often required to ensure ground-truth quality. There is therefore growing interest in Low-Resource Abstractive Summarization, which aims to leverage past experience to improve performance with only a limited number of labeled examples from the target corpus. In this paper, we propose to tackle this problem with two knowledge-rich sources: large pre-trained models and diverse existing corpora. The former provides the primary ability to handle summarization tasks; the latter helps discover common syntactic or semantic information that improves generalization. We conduct extensive experiments on summarization corpora with different writing styles and forms. The results demonstrate that our approach achieves state-of-the-art results on 6 corpora in low-resource scenarios, with only 0.7% of the trainable parameters compared to previous work.
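The abstract describes combining a large pre-trained summarizer with meta-learning over diverse existing corpora, so that a few labeled target examples suffice for adaptation. The snippet below is a minimal sketch of what such a meta-transfer loop could look like, assuming a first-order MAML-style update on top of a BART checkpoint from Hugging Face Transformers; the `source_corpora` sampler, the hyperparameters, and the choice of BART are illustrative assumptions, not the authors' exact training recipe.

```python
# Hedged sketch: first-order MAML-style meta-transfer over several summarization
# corpora, starting from a pre-trained seq2seq model. All names and settings
# below are illustrative assumptions.
import copy
import random
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base").to(device)
model.train()

meta_optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
inner_lr, inner_steps = 3e-5, 2

# Hypothetical task sampler: corpus name -> callable returning (support, query)
# batches, each a dict with "documents" and "summaries" lists of strings.
# Replace the placeholder with real source corpora.
source_corpora = {
    "placeholder_corpus": lambda: (
        {"documents": ["a source document"], "summaries": ["a short summary"]},
        {"documents": ["another document"], "summaries": ["another summary"]},
    ),
}

def batch_loss(m, batch):
    """Teacher-forced cross-entropy loss on one (document, summary) batch."""
    inputs = tokenizer(batch["documents"], truncation=True, padding=True,
                       max_length=512, return_tensors="pt").to(device)
    labels = tokenizer(batch["summaries"], truncation=True, padding=True,
                       max_length=128, return_tensors="pt").input_ids.to(device)
    return m(**inputs, labels=labels).loss

for meta_step in range(1000):
    corpus = random.choice(list(source_corpora))        # sample a source task
    support, query = source_corpora[corpus]()           # small labeled splits

    # Inner loop: adapt a temporary copy of the shared initialization.
    fast_model = copy.deepcopy(model)
    inner_opt = torch.optim.SGD(fast_model.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        inner_opt.zero_grad()
        batch_loss(fast_model, support).backward()
        inner_opt.step()

    # Outer loop (first-order approximation): evaluate the adapted model on the
    # query split and copy its gradients back onto the shared initialization.
    batch_loss(fast_model, query).backward()
    meta_optimizer.zero_grad()
    for p, fp in zip(model.parameters(), fast_model.parameters()):
        p.grad = fp.grad.clone() if fp.grad is not None else None
    meta_optimizer.step()
```

At deployment time, the meta-learned initialization would be fine-tuned on the handful of labeled examples from the target corpus with the same inner-loop procedure; the paper's specific parameter-efficient setup (the 0.7% of trainable parameters) is not reproduced in this sketch.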

Related research:
- SPEC: Summary Preference Decomposition for Low-Resource Abstractive Summarization (03/24/2023)
- AdaptSum: Towards Low-Resource Domain Adaptation for Abstractive Summarization (03/21/2021)
- Transfer Deep Learning for Low-Resource Chinese Word Segmentation with a Novel Neural Network (02/15/2017)
- How Domain Terminology Affects Meeting Summarization Performance (11/02/2020)
- To Share or not to Share: Predicting Sets of Sources for Model Transfer Learning (04/16/2021)
- CDEvalSumm: An Empirical Study of Cross-Dataset Evaluation for Neural Summarization Systems (10/11/2020)
