Generic resources are what you need: Style transfer tasks without task-specific parallel training data

09/09/2021
by Huiyuan Lai, et al.

Style transfer aims to rewrite a source text in a different target style while preserving its content. We propose a novel approach to this task that leverages generic resources and, without using any task-specific parallel (source-target) data, outperforms existing unsupervised approaches on the two most popular style transfer tasks: formality transfer and polarity swap. In practice, we adopt a multi-step procedure that builds on a generic pre-trained sequence-to-sequence model (BART). First, we strengthen the model's ability to rewrite by further pre-training BART both on an existing collection of generic paraphrases and on synthetic pairs created using a general-purpose lexical resource. Second, through an iterative back-translation approach, we train two models, one per transfer direction, so that they can provide each other with synthetically generated pairs, dynamically, during training. Lastly, we let our best resulting model generate static synthetic pairs to be used in a supervised training regime. Besides methodology and state-of-the-art results, a core contribution of this work is a reflection on the nature of the two tasks we address, and on how their differences are highlighted by their response to our approach.
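To make the back-translation step concrete, here is a minimal sketch of one training round between the two direction models. It uses the Hugging Face Transformers BART implementation; the toy corpora, hyperparameters, and helper names (translate, train_step) are illustrative assumptions, not the paper's actual code.

```python
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
# One model per transfer direction, e.g. informal->formal and formal->informal.
fwd = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
bwd = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
opt_fwd = torch.optim.AdamW(fwd.parameters(), lr=3e-5)
opt_bwd = torch.optim.AdamW(bwd.parameters(), lr=3e-5)

def translate(model, sentences):
    """Rewrite a batch of sentences with one direction model (no gradients)."""
    model.eval()
    batch = tokenizer(sentences, return_tensors="pt", padding=True, truncation=True)
    with torch.no_grad():
        out = model.generate(**batch, max_length=64)
    return tokenizer.batch_decode(out, skip_special_tokens=True)

def train_step(model, optimizer, sources, targets):
    """One supervised update on synthetic (source, target) pairs."""
    model.train()
    batch = tokenizer(sources, return_tensors="pt", padding=True, truncation=True)
    labels = tokenizer(targets, return_tensors="pt", padding=True, truncation=True).input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# Two non-parallel corpora, one per style (toy examples).
informal = ["gotta go, ttyl"]
formal = ["I must leave now; we will talk later."]

for round_ in range(3):
    # bwd produces synthetic informal sources so fwd can train on (synthetic, formal) pairs...
    train_step(fwd, opt_fwd, translate(bwd, formal), formal)
    # ...and fwd returns the favor for the opposite direction.
    train_step(bwd, opt_bwd, translate(fwd, informal), informal)
```

The key property of this scheme is that each model is trained only on pairs whose source side was generated by the other model, so the quality of the synthetic supervision improves as both directions improve.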


Related research

01/31/2019  Unsupervised Text Style Transfer via Iterative Matching and Translation
Text style transfer seeks to learn how to automatically rewrite sentence...

03/16/2022  Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer
We exploit the pre-trained seq2seq model mBART for multilingual text sty...

05/14/2021  Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer
Scarcity of parallel data causes formality style transfer models to have...

10/02/2020  Unsupervised Text Style Transfer with Padded Masked Language Models
We propose Masker, an unsupervised text-editing method for style transfe...

05/05/2020  Exploring Contextual Word-level Style Relevance for Unsupervised Style Transfer
Unsupervised style transfer aims to change the style of an input sentenc...

04/29/2020  Interactive Video Stylization Using Few-Shot Patch-Based Training
In this paper, we present a learning-based method to the keyframe-based ...

02/25/2019  EAT2seq: A generic framework for controlled sentence transformation without task-specific training
We present EAT2seq: a novel method to architect automatic linguistic tra...
