GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer

02/01/2021
by Yukai Shi, et al.

Non-parallel text style transfer has attracted increasing research interest in recent years. Despite successes in transferring style with encoder-decoder frameworks, current approaches still struggle to preserve the content, and even the logic, of the original sentences, mainly because of the large unconstrained model space or oversimplified assumptions about the latent embedding space. Since language is a product of human intelligence, governed by grammar and therefore confined to a limited, rule-based model space by nature, relieving this problem requires reconciling the capacity of deep neural networks with the intrinsic constraints imposed by human linguistic rules. To this end, we propose the Graph Transformer based Auto-Encoder (GTAE), which models a sentence as a linguistic graph and performs feature extraction and style transfer at the graph level, so as to maximally retain the content and linguistic structure of the original sentences. Quantitative experimental results on three non-parallel text style transfer tasks show that our model outperforms state-of-the-art methods in content preservation, while achieving comparable performance on transfer accuracy and sentence naturalness.
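The core idea stated in the abstract is to represent each sentence as a linguistic graph and to constrain the encoder's feature mixing to that graph rather than to an unconstrained token sequence. The snippet below is a minimal sketch of that idea under assumptions of ours, not the authors' released architecture: the dependency edges, the GraphTransformerLayer class, the hidden size, and the additive style injection are all illustrative placeholders.

```python
# A minimal sketch of graph-constrained encoding, NOT the GTAE implementation:
# a sentence is treated as a linguistic graph (hand-specified dependency edges)
# and encoded by a transformer layer whose self-attention is masked by the
# graph adjacency. All names and hyper-parameters are illustrative.

import torch
import torch.nn as nn


def build_adjacency(num_tokens, edges):
    """Symmetric adjacency matrix with self-loops from dependency edges."""
    adj = torch.eye(num_tokens)
    for head, dep in edges:
        adj[head, dep] = 1.0
        adj[dep, head] = 1.0
    return adj


class GraphTransformerLayer(nn.Module):
    """Self-attention restricted to edges of the linguistic graph."""

    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(),
                                nn.Linear(4 * dim, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x, adj):
        # Block attention between tokens that are not connected in the graph.
        mask = adj == 0  # True where attention is disallowed
        h, _ = self.attn(x, x, x, attn_mask=mask)
        x = self.norm1(x + h)
        return self.norm2(x + self.ff(x))


# Toy example: "the movie was great", with hypothetical dependency edges.
tokens = ["the", "movie", "was", "great"]
edges = [(1, 0), (3, 2), (3, 1)]          # (head, dependent) pairs, illustrative
adj = build_adjacency(len(tokens), edges)

dim = 32
embed = nn.Embedding(len(tokens), dim)    # stand-in for learned token embeddings
x = embed(torch.arange(len(tokens))).unsqueeze(0)   # shape (1, T, dim)

layer = GraphTransformerLayer(dim)
h = layer(x, adj)                         # graph-level content features

# Hypothetical style injection: add a learned style vector to every node
# before decoding; the actual transfer mechanism in GTAE differs.
style = nn.Parameter(torch.randn(dim))
h_styled = h + style

print(h_styled.shape)                     # torch.Size([1, 4, 32])
```

The sketch only shows how adjacency-masked self-attention keeps feature mixing confined to linguistically connected tokens, which is the constraint the abstract argues helps preserve content and sentence structure during transfer.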

Related research

12/19/2022  StyleFlow: Disentangle Latent Representations via Normalizing Flow for Unsupervised Text Style Transfer
05/14/2019  Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation
08/29/2022  StoryTrans: Non-Parallel Story Author-Style Transfer with Discourse Representations and Content Enhancing
11/08/2019  Low-Level Linguistic Controls for Style Transfer and Content Preservation
12/03/2022  T-STAR: Truthful Style Transfer using AMR Graph as Intermediate Representation
05/04/2022  Towards Robust and Semantically Organised Latent Representations for Unsupervised Text Style Transfer
05/29/2019  Revision in Continuous Space: Fine-Grained Control of Text Style Transfer
