Efficient Adaptation of Pretrained Transformers for Abstractive Summarization

06/01/2019
by Andrew Hoang et al.

Large-scale learning of transformer language models has yielded improvements on a variety of natural language understanding tasks. Whether these models can be effectively adapted for summarization, however, has been less explored, as their learned representations are less seamlessly integrated into existing neural text production architectures. In this work, we propose two solutions for efficiently adapting pretrained transformer language models as text summarizers: source embeddings and domain-adaptive training. We test these solutions on three abstractive summarization datasets, achieving new state-of-the-art performance on two of them. Finally, we show that these improvements stem from producing more focused summaries with fewer superfluous passages, and that the gains are more pronounced on more abstractive datasets.
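The first of the two proposed solutions, source embeddings, can be sketched as a learned segment-type embedding added to each token embedding, so that a decoder-only transformer processing the concatenated document and summary can tell source tokens apart from summary tokens. The sketch below is illustrative only; the array names, toy sizes, and random initialization are assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, d_model = 100, 16  # toy sizes for illustration
# Ordinary token embedding table.
tok_emb = rng.normal(size=(vocab_size, d_model))
# Source embeddings: one learned vector for source-document tokens and one
# for summary tokens, added on top of the token embeddings so the shared
# transformer can distinguish the document from the summary it generates.
seg_emb = rng.normal(size=(2, d_model))  # row 0 = source, row 1 = summary

def embed(token_ids, segment_ids):
    """Token embedding plus source/summary segment embedding."""
    return tok_emb[token_ids] + seg_emb[segment_ids]

# Concatenated input: 4 source tokens followed by 3 summary tokens.
tokens = np.array([5, 7, 9, 11, 2, 3, 4])
segments = np.array([0, 0, 0, 0, 1, 1, 1])
x = embed(tokens, segments)  # shape (7, 16), fed to the transformer
```

In a real model both tables would be trained jointly with the transformer; the point is only that the segment signal is injected additively at the input layer, in the spirit of segment embeddings in BERT-style encoders.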


Related research

03/21/2022
AraBART: a Pretrained Arabic Sequence-to-Sequence Model for Abstractive Summarization
Like most natural language understanding and generation tasks, state-of-...

12/22/2021
Domain Adaptation with Pre-trained Transformers for Query Focused Abstractive Text Summarization
The Query Focused Text Summarization (QFTS) task aims at building system...

07/09/2020
Advances of Transformer-Based Models for News Headline Generation
Pretrained language models based on Transformer architecture are the rea...

09/14/2023
Clinical Text Summarization: Adapting Large Language Models Can Outperform Human Experts
Sifting through vast textual data and summarizing key information impose...

05/18/2023
Generalized Planning in PDDL Domains with Pretrained Large Language Models
Recent work has considered whether large language models (LLMs) can func...

10/16/2021
Knowledge Enhanced Pretrained Language Models: A Comprehensive Survey
Pretrained Language Models (PLM) have established a new paradigm through...

04/30/2020
TLDR: Extreme Summarization of Scientific Documents
We introduce TLDR generation for scientific papers, a new automatic summ...
