Unsupervised Neural Machine Translation with SMT as Posterior Regularization

01/14/2019
by Shuo Ren et al.

Without real bilingual corpora available, unsupervised Neural Machine Translation (NMT) typically relies on pseudo-parallel data generated by back-translation for model training. However, due to the weak supervision, the pseudo data inevitably contain noise and errors that are accumulated and reinforced in subsequent training, leading to poor translation performance. To address this issue, we introduce phrase-based Statistical Machine Translation (SMT) models, which are robust to noisy data, as posterior regularization to guide the training of unsupervised NMT models in the iterative back-translation process. Our method starts from SMT models built with pre-trained language models and word-level translation tables inferred from cross-lingual embeddings. SMT and NMT models are then optimized jointly and boost each other incrementally in a unified EM framework. In this way, (1) the negative effect of errors introduced during iterative back-translation is alleviated in a timely manner, because SMT filters the noise out of its phrase tables; and (2) NMT compensates for the lack of fluency inherent in SMT. Experiments on en-fr and en-de translation tasks show that our method outperforms strong baselines and achieves new state-of-the-art unsupervised machine translation performance.
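
To make the training procedure described above concrete, below is a minimal Python sketch of the alternating loop: an SMT model initialized from a cross-lingual word translation table produces pseudo-parallel data that trains the NMT models, and NMT back-translations are in turn distilled into fresh, noise-filtered phrase tables for the next SMT round. All classes and helpers here (ToySMT, ToyNMT, rebuild_smt_from_pseudo_data, train_unsupervised) are hypothetical placeholders that illustrate only the control flow, not the paper's actual implementation.

    # Sketch of the joint SMT/NMT iterative back-translation loop (placeholders only).

    class ToySMT:
        """Placeholder phrase-based SMT: word-by-word lookup stands in for
        phrase tables plus language-model scoring."""
        def __init__(self, word_table):
            self.word_table = word_table              # word-level translation table

        def translate(self, sentence):
            return " ".join(self.word_table.get(w, w) for w in sentence.split())


    class ToyNMT:
        """Placeholder NMT model 'trained' on pseudo-parallel pairs."""
        def __init__(self, pseudo_pairs):
            self.memory = dict(pseudo_pairs)          # stand-in for learned parameters

        def translate(self, sentence):
            return self.memory.get(sentence, sentence)


    def rebuild_smt_from_pseudo_data(pseudo_pairs):
        """Placeholder for phrase extraction and noise filtering on back-translated data."""
        table = {}
        for src, tgt in pseudo_pairs:
            for s_w, t_w in zip(src.split(), tgt.split()):
                table.setdefault(s_w, t_w)            # keep only consistent word pairs
        return ToySMT(table)


    def train_unsupervised(mono_src, mono_tgt, init_word_table, iterations=3):
        # 1. Initialization: SMT built from a word-level translation table inferred
        #    from cross-lingual embeddings (plus pre-trained LMs in the real method).
        smt_s2t = ToySMT(init_word_table)
        smt_t2s = ToySMT({v: k for k, v in init_word_table.items()})

        nmt_s2t = nmt_t2s = None
        for _ in range(iterations):
            # 2. SMT translates monolingual data into pseudo-parallel corpora that
            #    (re)train the NMT models, acting as a posterior regularizer.
            nmt_s2t = ToyNMT([(s, smt_s2t.translate(s)) for s in mono_src])
            nmt_t2s = ToyNMT([(t, smt_t2s.translate(t)) for t in mono_tgt])

            # 3. NMT back-translation yields new pseudo data, from which fresh
            #    (noise-filtered) phrase tables are extracted for the next SMT round.
            smt_t2s = rebuild_smt_from_pseudo_data([(nmt_s2t.translate(s), s) for s in mono_src])
            smt_s2t = rebuild_smt_from_pseudo_data([(nmt_t2s.translate(t), t) for t in mono_tgt])

        return nmt_s2t, nmt_t2s

The point this sketch tries to capture from the abstract is that noisy NMT outputs never feed the next NMT round directly: they first pass through SMT phrase-table construction, which is where low-quality fragments can be filtered out.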

Related research

03/01/2018 · Joint Training for Neural Machine Translation Models with Monolingual Data
Monolingual data have been demonstrated to be helpful in improving trans...

11/01/2017 · Improving Neural Machine Translation through Phrase-based Forced Decoding
Compared to traditional statistical machine translation (SMT), neural ma...

11/10/2019 · Language Model-Driven Unsupervised Neural Machine Translation
Unsupervised neural machine translation (NMT) is associated with noise an...

11/20/2017 · Fast BTG-Forest-Based Hierarchical Sub-sentential Alignment
In this paper, we propose a novel BTG-forest-based alignment method. Bas...

09/03/2021 · Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT
Differently from the traditional statistical MT that decomposes the tran...

02/28/2020 · Do all Roads Lead to Rome? Understanding the Role of Initialization in Iterative Back-Translation
Back-translation provides a simple yet effective approach to exploit mon...

11/30/2022 · Word Alignment in the Era of Deep Learning: A Tutorial
The word alignment task, despite its prominence in the era of statistica...