Cycle-Consistent Adversarial Autoencoders for Unsupervised Text Style Transfer

10/02/2020
by   Yufang Huang, et al.

Unsupervised text style transfer is full of challenges due to the lack of parallel data and the difficulty of content preservation. In this paper, we propose a novel neural approach to unsupervised text style transfer, which we refer to as Cycle-consistent Adversarial autoEncoders (CAE), trained from non-parallel data. CAE consists of three essential components: (1) LSTM autoencoders that encode a text in one style into its latent representation and decode an encoded representation into its original text, or a transferred representation into a style-transferred text; (2) adversarial style transfer networks that use an adversarially trained generator to transform a latent representation in one style into a representation in another style; and (3) a cycle-consistent constraint that enhances the capacity of the adversarial style transfer networks for content preservation. The entire CAE with these three components can be trained end-to-end. Extensive experiments and in-depth analyses on two widely used public datasets consistently validate the effectiveness of the proposed CAE in both style transfer and content preservation against several strong baselines, in terms of four automatic evaluation metrics and human evaluation.
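The cycle-consistent constraint described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the generators `g_xy` and `g_yx` are hypothetical stand-ins for CAE's adversarial style-transfer networks (here, toy affine maps on a latent vector so the example stays self-contained), and the loss simply penalizes the distance between a latent code and its round-trip transfer X → Y → X.

```python
# Hedged sketch, assuming toy invertible generators in place of the
# paper's adversarially trained style-transfer networks.

def g_xy(z):
    """Toy 'style X -> style Y' generator: shift the latent code."""
    return [v + 1.0 for v in z]

def g_yx(z):
    """Toy 'style Y -> style X' generator: the inverse shift."""
    return [v - 1.0 for v in z]

def cycle_consistency_loss(z):
    """L1 distance between z and its round-trip g_yx(g_xy(z)).

    In CAE, a term of this shape constrains the transfer networks to
    preserve content: transferring a latent representation from style X
    to style Y and back should recover the original representation.
    """
    z_cycle = g_yx(g_xy(z))
    return sum(abs(a - b) for a, b in zip(z, z_cycle))

z = [0.2, -1.3, 0.7]  # a latent code, as produced by the LSTM encoder
loss = cycle_consistency_loss(z)
```

Because the toy generators are exact inverses, the loss is (numerically) zero here; in training, minimizing this term pushes the real generators toward content-preserving transfers.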


Related research

11/18/2017
Style Transfer in Text: Exploration and Evaluation
Style transfer is an important problem in natural language processing (N...

12/19/2022
StyleFlow: Disentangle Latent Representations via Normalizing Flow for Unsupervised Text Style Transfer
Text style transfer aims to alter the style of a sentence while preservi...

06/27/2019
Latent Optimization for Non-adversarial Representation Disentanglement
Disentanglement between pose and content is a key task for artificial in...

09/13/2019
A Neural Approach to Irony Generation
Ironies can not only express stronger emotions but also show a sense of ...

04/10/2018
Sentiment Transfer using Seq2Seq Adversarial Autoencoders
Expressing in language is subjective. Everyone has a different style of ...

06/04/2021
NAST: A Non-Autoregressive Generator with Word Alignment for Unsupervised Text Style Transfer
Autoregressive models have been widely used in unsupervised text style t...

08/31/2023
Unsupervised Text Style Transfer with Deep Generative Models
We present a general framework for unsupervised text style transfer with...
