Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation

05/14/2019
by Ning Dai, et al.

Disentangling content and style in the latent space is a prevalent approach to unpaired text style transfer. However, most current neural models suffer from two major issues: 1) it is difficult to completely strip the style information from the semantics of a sentence, and 2) the recurrent neural network (RNN) based encoder and decoder, mediated by the latent representation, cannot handle long-term dependencies well, resulting in poor preservation of non-stylistic semantic content. In this paper, we propose the Style Transformer, which makes no assumption about the latent representation of the source sentence and leverages the attention mechanism in the Transformer to achieve better style transfer and better content preservation.
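
The core architectural idea can be illustrated with a minimal sketch: instead of compressing the source sentence into a fixed, style-free latent vector, a Transformer encoder-decoder conditions generation on a learned target-style embedding while the decoder attends to every source token. The sketch below is not the authors' code; the class name, hyperparameters, and style-injection scheme (prepending the style embedding to the encoder input) are illustrative assumptions, and the discriminator-based training used in the full method is omitted.

```python
# Minimal sketch (assumed, not the paper's implementation) of a
# style-conditioned Transformer for unpaired text style transfer.
import torch
import torch.nn as nn


class StyleConditionedTransformer(nn.Module):
    def __init__(self, vocab_size, num_styles, d_model=256, nhead=4,
                 num_layers=3, max_len=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        # One learned embedding per target style (e.g. positive / negative).
        self.style_emb = nn.Embedding(num_styles, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out_proj = nn.Linear(d_model, vocab_size)

    def embed(self, tokens):
        positions = torch.arange(tokens.size(1), device=tokens.device)
        return self.tok_emb(tokens) + self.pos_emb(positions)

    def forward(self, src_tokens, tgt_tokens, style_id):
        # Prepend the style embedding to the encoder input, so the decoder
        # attends to both the source tokens and the desired style; no
        # fixed-size, disentangled latent vector is ever formed.
        style = self.style_emb(style_id).unsqueeze(1)            # (B, 1, D)
        src = torch.cat([style, self.embed(src_tokens)], dim=1)  # (B, 1+S, D)
        tgt = self.embed(tgt_tokens)
        causal = self.transformer.generate_square_subsequent_mask(
            tgt_tokens.size(1)).to(src_tokens.device)
        hidden = self.transformer(src, tgt, tgt_mask=causal)
        return self.out_proj(hidden)                             # (B, T, V)


if __name__ == "__main__":
    model = StyleConditionedTransformer(vocab_size=1000, num_styles=2)
    src = torch.randint(0, 1000, (8, 20))    # batch of source sentences
    tgt = torch.randint(0, 1000, (8, 20))    # shifted target sentences
    style = torch.ones(8, dtype=torch.long)  # rewrite everything as style 1
    logits = model(src, tgt, style)
    print(logits.shape)  # torch.Size([8, 20, 1000])
```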

Related research

08/01/2021 · Enhancing Content Preservation in Text Style Transfer Using Reverse Attention and Conditional Layer Normalization
Text style transfer aims to alter the style (e.g., sentiment) of a sente...

08/13/2018 · Disentangled Representation Learning for Text Style Transfer
This paper tackles the problem of disentangling the latent variables of ...

08/19/2022 · Dance Style Transfer with Cross-modal Transformer
We present CycleDance, a dance style transfer system to transform an exi...

02/01/2021 · GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer
Non-parallel text style transfer has attracted increasing research inter...

02/24/2020 · Learning to Select Bi-Aspect Information for Document-Scale Text Content Manipulation
In this paper, we focus on a new practical task, document-scale text con...

05/09/2023 · Style-A-Video: Agile Diffusion for Arbitrary Text-based Video Style Transfer
Large-scale text-to-video diffusion models have demonstrated an exceptio...

01/21/2022 · Text Style Transfer for Bias Mitigation using Masked Language Modeling
It is well known that textual data on the internet and other digital pla...