Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting

01/02/2021
by Wangchunshu Zhou, et al.

In this paper, we generalize text infilling (e.g., masked language models) by proposing Sequence Span Rewriting (SSR) as a self-supervised sequence-to-sequence (seq2seq) pre-training objective. SSR provides more fine-grained learning signals for text representations by supervising the model to rewrite imperfect spans into their ground truth, and it is more consistent than text infilling with many downstream seq2seq tasks that rewrite a source sentence into a target sentence. Our experiments with T5 models on various seq2seq tasks show that SSR can substantially improve seq2seq pre-training. Moreover, we observe that SSR is especially helpful for pre-training a small seq2seq model with a powerful imperfect-span generator, which suggests a new perspective on transferring knowledge from a large model to a smaller model during seq2seq pre-training.
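To make the objective concrete, the following is a minimal sketch of how one SSR training pair could be constructed. The span selection and the `toy_generator` are hypothetical stand-ins of our own; in the paper, a pre-trained text-infilling model (e.g., T5) produces the imperfect spans, and the sentinel tokens shown here are illustrative only.

```python
import random

def make_ssr_example(tokens, span_start, span_len, imperfect_generator):
    """Build one (source, target) pair for SSR pre-training.

    The source contains a machine-generated, possibly imperfect span
    marked by sentinel tokens; the target is the ground-truth span,
    so the model learns to rewrite imperfect text into the original.
    """
    span = tokens[span_start:span_start + span_len]
    imperfect_span = imperfect_generator(span)
    source = (
        tokens[:span_start]
        + ["<s>"] + imperfect_span + ["</s>"]
        + tokens[span_start + span_len:]
    )
    target = span
    return source, target

def toy_generator(span):
    # Stand-in for an infilling model: randomly drops one token
    # to simulate an imperfect prediction.
    if len(span) > 1:
        i = random.randrange(len(span))
        return span[:i] + span[i + 1:]
    return span

tokens = "the quick brown fox jumps over the lazy dog".split()
random.seed(0)
source, target = make_ssr_example(tokens, 2, 3, toy_generator)
# target is always the ground-truth span, e.g. ["brown", "fox", "jumps"]
```

Under this framing, text infilling is the special case where the marked span in the source is simply empty, which is why SSR yields richer (rewrite-level) supervision.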


