Self-Supervised Learning for Contextualized Extractive Summarization

06/11/2019
by Hong Wang, et al.

Existing models for extractive summarization are usually trained from scratch with a cross-entropy loss, which does not explicitly capture the global context at the document level. In this paper, we aim to improve this task by introducing three auxiliary pre-training tasks that learn to capture the document-level context in a self-supervised fashion. Experiments on the widely used CNN/DM dataset validate the effectiveness of the proposed auxiliary tasks. Furthermore, we show that after pre-training, a clean model with simple building blocks is able to outperform carefully designed previous state-of-the-art models.
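To make the idea of a self-supervised auxiliary task concrete, here is a minimal, hypothetical sketch of one common formulation: corrupt a document by swapping two sentences and label each position as in-place or out-of-place. The function name and details are illustrative assumptions, not the paper's exact objectives; the point is that the labels come for free from the corruption itself, so no human annotation is needed, and solving the task forces a model to compare sentences against their document-level context.

```python
import random


def make_switch_example(sentences, rng=None):
    """Build one self-supervised example from an unlabeled document.

    Swaps two randomly chosen sentences and labels each position
    1 if it is out of place, 0 otherwise. (Hypothetical "switch"-style
    task for illustration; not the paper's exact formulation.)
    """
    rng = rng or random.Random(0)
    corrupted = list(sentences)
    # Pick two distinct positions and swap their sentences.
    i, j = rng.sample(range(len(sentences)), 2)
    corrupted[i], corrupted[j] = corrupted[j], corrupted[i]
    # Position-level labels derived from the corruption itself.
    labels = [int(k in (i, j)) for k in range(len(sentences))]
    return corrupted, labels
```

A sentence encoder pre-trained to predict these labels must reason about whether each sentence fits its surrounding context, which is exactly the document-level signal a plain cross-entropy extractive objective lacks.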


Related research

PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization (12/18/2019)
Recent work pre-training Transformers with self-supervised objectives on...

Self-Supervised Sentence Compression for Meeting Summarization (05/13/2023)
The conventional summarization model often fails to capture critical inf...

Self-Supervised Representation Learning on Document Images (04/18/2020)
This work analyses the impact of self-supervised pre-training on documen...

Leverage Unlabeled Data for Abstractive Speech Summarization with Self-Supervised Learning and Back-Summarization (07/30/2020)
Supervised approaches for Neural Abstractive Summarization require large...

Phoneme Segmentation Using Self-Supervised Speech Models (11/02/2022)
We apply transfer learning to the task of phoneme segmentation and demon...

Re-entry Prediction for Online Conversations via Self-Supervised Learning (09/05/2021)
In recent years, world business in online discussions and opinion sharin...

Contextual Classification Using Self-Supervised Auxiliary Models for Deep Neural Networks (01/07/2021)
Classification problems solved with deep neural networks (DNNs) typicall...
