A Probabilistic Formulation of Unsupervised Text Style Transfer

02/10/2020
by   Junxian He, et al.

We present a deep generative model for unsupervised text style transfer that unifies previously proposed non-generative techniques. Our probabilistic approach models non-parallel data from two domains as a partially observed parallel corpus. By hypothesizing a parallel latent sequence that generates each observed sequence, our model learns to transform sequences from one domain to another in a completely unsupervised fashion. In contrast with traditional generative sequence models (e.g. the HMM), our model makes few assumptions about the data it generates: it uses a recurrent language model as a prior and an encoder-decoder as a transduction distribution. While computation of the marginal data likelihood is intractable in this model class, we show that amortized variational inference admits a practical surrogate. Further, by drawing connections between our variational objective and other recent unsupervised style transfer and machine translation techniques, we show how our probabilistic view can unify some known non-generative objectives such as back-translation and adversarial loss. Finally, we demonstrate the effectiveness of our method on a wide range of unsupervised style transfer tasks, including sentiment transfer, formality transfer, word decipherment, author imitation, and related language translation. Across all style transfer tasks, our approach yields substantial gains over state-of-the-art non-generative baselines, including the unsupervised machine translation techniques that our approach generalizes. Further, we conduct experiments on a standard unsupervised machine translation task and find that our unified approach matches the current state of the art.
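The variational surrogate mentioned above can be illustrated with a toy discrete sketch: a latent source sequence x (reduced here to a single symbol) is drawn from a prior and transduced into the observed target y, and an amortized posterior q(x|y) yields a tractable lower bound (ELBO) on the intractable log marginal likelihood. All distributions below are hand-picked tables for illustration only, not the paper's neural parameterizations (recurrent LM prior, encoder-decoder transducer, inference network).

```python
import math

# Toy stand-ins for the model components described in the abstract.
prior = {"a": 0.6, "b": 0.4}           # p(x): plays the role of the LM prior
transduce = {                          # p(y|x): plays the role of the encoder-decoder
    "a": {"A": 0.8, "B": 0.2},
    "b": {"A": 0.3, "B": 0.7},
}
inference = {                          # q(x|y): amortized approximate posterior
    "A": {"a": 0.85, "b": 0.15},
    "B": {"a": 0.25, "b": 0.75},
}

def log_marginal(y):
    # Exact log p(y) = log sum_x p(x) p(y|x).
    # Tractable here only because x ranges over two symbols; for real
    # latent sequences this sum is intractable, motivating the ELBO.
    return math.log(sum(prior[x] * transduce[x][y] for x in prior))

def elbo(y):
    # E_q[ log p(x) + log p(y|x) - log q(x|y) ]: the tractable surrogate
    # objective, guaranteed to lower-bound log p(y).
    total = 0.0
    for x, qx in inference[y].items():
        total += qx * (math.log(prior[x])
                       + math.log(transduce[x][y])
                       - math.log(qx))
    return total

for y in ("A", "B"):
    print(f"y={y}: ELBO={elbo(y):.4f} <= log p(y)={log_marginal(y):.4f}")
```

The gap between the two quantities is the KL divergence from q(x|y) to the true posterior p(x|y); training the inference network shrinks it, which is why maximizing the ELBO is a practical proxy for maximizing the marginal likelihood.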


Related research

08/23/2018  Style Transfer as Unsupervised Machine Translation
Language style transfer rephrases text with specific stylistic attri...

08/31/2023  Unsupervised Text Style Transfer with Deep Generative Models
We present a general framework for unsupervised text style transfer with...

08/22/2019  Unsupervised Text Summarization via Mixed Model Back-Translation
Back-translation based approaches have recently led to significant prog...

05/26/2017  Style Transfer from Non-Parallel Text by Cross-Alignment
This paper focuses on style transfer on the basis of non-parallel text. ...

02/08/2020  Blank Language Models
We propose Blank Language Model (BLM), a model that generates sequences ...

05/28/2018  A Stochastic Decoder for Neural Machine Translation
The process of translation is ambiguous, in that there are typically man...

05/30/2018  Unsupervised Text Style Transfer using Language Models as Discriminators
Binary classifiers are often employed as discriminators in GAN-based uns...
