DGST: a Dual-Generator Network for Text Style Transfer

10/27/2020
by Xiao Li, et al.

We propose DGST, a novel and simple Dual-Generator network architecture for text Style Transfer. Our model employs only two generators and does not rely on any discriminators or parallel corpora for training. Both quantitative and qualitative experiments on the Yelp and IMDb datasets show that our model gives competitive performance compared to several strong baselines with more complicated architectures.
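To make the dual-generator idea concrete, here is a toy sketch of the kind of training signal such an architecture can use without a discriminator: each generator maps between styles, and the other generator is asked to reconstruct the original sentence from the transferred one. The functions `f`, `g`, and `cycle_loss` below are hypothetical stand-ins chosen for illustration, not the paper's actual model, which learns neural generators.

```python
# Toy sketch of a dual-generator, discriminator-free training signal.
# f and g are idealized stand-in "generators" (hypothetical): real
# dual-generator models learn neural sequence-to-sequence networks.

def f(sentence):
    """Generator f: source style -> target style (toy: uppercase)."""
    return sentence.upper()

def g(sentence):
    """Generator g: target style -> source style (toy: lowercase)."""
    return sentence.lower()

def cycle_loss(x, transfer, back):
    """Cycle-reconstruction signal: back(transfer(x)) should recover x.

    Token mismatch count is a stand-in for a real reconstruction loss.
    """
    y = back(transfer(x))
    xt, yt = x.split(), y.split()
    return sum(a != b for a, b in zip(xt, yt)) + abs(len(xt) - len(yt))

src = "the food was great"        # source-style sentence
tgt = "THE SERVICE WAS SLOW"      # target-style sentence

# Each generator is trained so that the other can reconstruct its output:
loss_f = cycle_loss(src, f, g)    # g(f(x)) should match x
loss_g = cycle_loss(tgt, g, f)    # f(g(y)) should match y
print(loss_f, loss_g)
```

With these idealized generators both losses come out to zero; in a learned model, minimizing the two reconstruction terms jointly pushes the generators toward mutually invertible style mappings without any adversarial component.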


Related research

03/26/2019 · Reinforcement Learning Based Text Style Transfer without Parallel Training Corpus
Text style transfer rephrases a text from a source style (e.g., informal...

04/18/2022 · Non-Parallel Text Style Transfer with Self-Parallel Supervision
The performance of existing text style transfer models is severely limit...

03/17/2018 · Dear Sir or Madam, May I introduce the GYAFC Dataset: Corpus, Benchmarks and Metrics for Formality Style Transfer
Style transfer is the task of automatically transforming a piece of text...

04/27/2021 · SE-DAE: Style-Enhanced Denoising Auto-Encoder for Unsupervised Text Style Transfer
Text style transfer aims to change the style of sentences while preservi...

10/06/2020 · Histopathological Stain Transfer using Style Transfer Network with Adversarial Loss
Deep learning models that are trained on histopathological images obtain...

12/20/2022 · SimpleStyle: An Adaptable Style Transfer Approach
Attribute-controlled text rewriting, also known as text style-transfer, ...
