Semi-Supervised Text Simplification with Back-Translation and Asymmetric Denoising Autoencoders

04/30/2020
by Yanbin Zhao, et al.

Text simplification (TS) rephrases long sentences into simplified variants while preserving their inherent semantics. Traditional sequence-to-sequence models rely heavily on the quantity and quality of parallel sentences, which limits their applicability across languages and domains. This work investigates how to leverage large amounts of unpaired corpora for the TS task. We adopt the back-translation architecture from unsupervised neural machine translation (NMT), including denoising autoencoders for language modeling and automatic generation of parallel data through iterative back-translation. However, it is non-trivial to generate appropriate complex-simple pairs if we directly treat the sets of simple and complex sentences as two different languages: the two types of sentences are quite similar, and it is hard for the model to capture the distinguishing characteristics of each type. To tackle this problem, we propose asymmetric denoising methods for sentences of different complexity. When modeling simple and complex sentences with autoencoders, we introduce different types of noise into the training process. This method significantly improves simplification performance. Our model can be trained in both an unsupervised and a semi-supervised manner. Automatic and human evaluations show that our unsupervised model outperforms previous systems, and that with limited supervision our model performs competitively with multiple state-of-the-art simplification systems.
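The abstract does not spell out which noise operations the asymmetric denoisers use, so the following is only a minimal sketch of the general idea: apply a lighter corruption when autoencoding one side and a heavier corruption on the other, so each autoencoder learns complexity-specific characteristics. The noise primitives here (word dropout plus bounded local shuffling) and the rate settings are assumptions borrowed from common unsupervised NMT practice, not the paper's actual configuration.

```python
import random

def add_noise(tokens, drop_prob, max_shuffle, rng):
    """Generic corruption: word dropout followed by bounded local shuffling.

    Each surviving token is assigned a sort key of its index plus a random
    offset in [0, max_shuffle), so no token moves more than max_shuffle
    positions from where it started.
    """
    kept = [t for t in tokens if rng.random() >= drop_prob] or tokens[:1]
    keys = [i + rng.uniform(0, max_shuffle) for i in range(len(kept))]
    order = sorted(range(len(kept)), key=lambda i: keys[i])
    return [kept[i] for i in order]

def noise_simple(tokens, rng=None):
    # Lighter corruption for the simple-sentence autoencoder (assumed rates).
    return add_noise(tokens, drop_prob=0.1, max_shuffle=2,
                     rng=rng or random.Random(0))

def noise_complex(tokens, rng=None):
    # Heavier corruption for the complex-sentence autoencoder (assumed rates).
    return add_noise(tokens, drop_prob=0.25, max_shuffle=4,
                     rng=rng or random.Random(0))

if __name__ == "__main__":
    sent = "the quick brown fox jumps over the lazy dog".split()
    print(noise_simple(sent))
    print(noise_complex(sent))
```

In training, each autoencoder would be asked to reconstruct the clean sentence from its corrupted version, while iterative back-translation supplies pseudo-parallel complex-simple pairs; the asymmetry in corruption is what pushes the two directions to model genuinely different sentence styles.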


