Non-Parallel Text Style Transfer with Self-Parallel Supervision

04/18/2022
by Ruibo Liu, et al.

The performance of existing text style transfer models is severely limited by the non-parallel datasets on which the models are trained. In non-parallel datasets, no direct mapping exists between sentences of the source and target style; the style transfer models thus only receive weak supervision of the target sentences during training, which often leads the model to discard too much style-independent information, or to fail utterly at transferring the style. In this work, we propose LaMer, a novel text style transfer framework based on large-scale language models. LaMer first mines the roughly parallel expressions in the non-parallel datasets with scene graphs, and then employs MLE training, followed by imitation learning refinement, to leverage the intrinsic parallelism within the data. On two benchmark tasks (sentiment & formality transfer) and a newly proposed challenging task (political stance transfer), our model achieves qualitative advances in transfer accuracy, content preservation, and fluency. Further empirical and human evaluations demonstrate that our model not only makes training more efficient, but also generates more readable and diverse expressions than previous models.
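The abstract describes a three-stage pipeline: mining roughly parallel pairs from the two non-parallel corpora using scene graphs, MLE training on the mined pairs, and imitation learning refinement. The sketch below illustrates only the mining idea, under heavy simplification and purely as an assumption about how such alignment could look: scene graphs are hand-written sets of (subject, relation, object) triples rather than parsed ones, and a Jaccard entity-overlap score with an arbitrary threshold stands in for LaMer's actual alignment procedure. None of the names or numbers here come from the paper.

```python
# Minimal, illustrative sketch of mining "roughly parallel" pairs from two
# non-parallel style corpora via scene-graph overlap. All data, names, and
# the threshold are hypothetical; LaMer's real procedure may differ.

from itertools import product

# Toy "scene graphs": sets of (subject, relation, object) triples per sentence
# (hand-written here instead of being extracted by a scene-graph parser).
SRC = {  # negative-sentiment corpus
    "the food was terrible and cold": {("food", "attr", "terrible"), ("food", "attr", "cold")},
    "the waiter ignored us all night": {("waiter", "ignore", "us")},
}
TGT = {  # positive-sentiment corpus
    "the food was delicious and warm": {("food", "attr", "delicious"), ("food", "attr", "warm")},
    "our waiter checked on us constantly": {("waiter", "check_on", "us")},
}


def entity_overlap(g1, g2):
    """Jaccard overlap of the entities mentioned in two scene graphs."""
    e1 = {t[0] for t in g1} | {t[2] for t in g1}
    e2 = {t[0] for t in g2} | {t[2] for t in g2}
    return len(e1 & e2) / max(len(e1 | e2), 1)


def mine_roughly_parallel(src, tgt, threshold=0.2):
    """Pair up source/target sentences whose scene graphs share enough entities."""
    pairs = []
    for (s, gs), (t, gt) in product(src.items(), tgt.items()):
        score = entity_overlap(gs, gt)
        if score >= threshold:
            pairs.append((s, t, score))
    # Highest-overlap (most "parallel") pairs first.
    return sorted(pairs, key=lambda p: -p[2])


if __name__ == "__main__":
    for s, t, score in mine_roughly_parallel(SRC, TGT):
        print(f"{score:.2f}  {s!r}  ->  {t!r}")
```

In the paper's framing, pairs mined this way would then serve as weak parallel supervision for MLE fine-tuning of a language model, with imitation learning used afterwards to refine the transfer policy; those training stages are not shown here.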


Related research

- Structured Content Preservation for Unsupervised Text Style Transfer (10/15/2018)
- Transforming Delete, Retrieve, Generate Approach for Controlled Text Style Transfer (08/25/2019)
- A Dual Reinforcement Learning Framework for Unsupervised Text Style Transfer (05/24/2019)
- Specializing Small Language Models towards Complex Style Transfer via Latent Attribute Pre-Training (09/19/2023)
- Style Obfuscation by Invariance (05/18/2018)
- DGST: a Dual-Generator Network for Text Style Transfer (10/27/2020)
- Empirical Evaluation of Supervision Signals for Style Transfer Models (01/15/2021)
