Robust Neural Machine Translation: Modeling Orthographic and Interpunctual Variation

09/11/2020
by   Toms Bergmanis, et al.

Neural machine translation systems are typically trained on curated corpora and break when faced with non-standard orthography or punctuation. Resilience to spelling mistakes and typos, however, is crucial because machine translation systems are used to translate texts of informal origin, such as chat conversations, social media posts and web pages. We propose a simple generative noise model to generate adversarial examples of ten different types. We use these to augment machine translation systems' training data and show that, when tested on noisy data, systems trained using adversarial examples perform almost as well as when translating clean data, while baseline systems' performance drops by 2-3 BLEU points. To measure the robustness and noise invariance of machine translation systems' outputs, we use the average translation edit rate between the translation of the original sentence and its noised variants. Using this measure, we show that systems trained on adversarial examples on average yield roughly 50% lower translation edit rates than baseline systems trained on clean data.
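The abstract describes two components that can be sketched in a few lines of Python: a generative noise model that perturbs spelling and punctuation to create adversarial training examples, and a noise-invariance measure based on the average translation edit rate between the translation of the original sentence and translations of its noised variants. The sketch below is an illustrative assumption rather than the authors' implementation: the specific noise operations, their probabilities, and the function names (noise_sentence, word_edit_rate, noise_invariance) are hypothetical, the paper's ten noise types are not enumerated in the abstract, and the metric here is a plain word-level edit rate rather than full TER (which additionally allows block shifts).

```python
"""Minimal sketch (not the authors' code): synthetic orthographic/punctuation
noise plus a word-level edit-rate robustness metric."""
import random


def swap_adjacent(word, rng):
    # Transpose two neighbouring characters, e.g. "noise" -> "niose".
    if len(word) < 2:
        return word
    i = rng.randrange(len(word) - 1)
    return word[:i] + word[i + 1] + word[i] + word[i + 2:]


def drop_char(word, rng):
    # Delete a single character, e.g. "noise" -> "nose".
    if len(word) < 2:
        return word
    i = rng.randrange(len(word))
    return word[:i] + word[i + 1:]


def repeat_char(word, rng):
    # Duplicate a character, e.g. "noise" -> "nooise".
    if not word:
        return word
    i = rng.randrange(len(word))
    return word[:i + 1] + word[i] + word[i + 1:]


def flip_case(word, rng):
    # Change the capitalisation of the first letter, e.g. "berlin" -> "Berlin".
    return word[0].swapcase() + word[1:] if word else word


def strip_punct(word, rng):
    # Remove trailing punctuation, e.g. "end." -> "end".
    return word.rstrip(".,!?;:")


# Illustrative subset of noise operations; the paper uses ten types.
NOISE_OPS = [swap_adjacent, drop_char, repeat_char, flip_case, strip_punct]


def noise_sentence(sentence, p=0.1, seed=None):
    """Apply one randomly chosen noise operation to each token with probability p."""
    rng = random.Random(seed)
    return " ".join(rng.choice(NOISE_OPS)(tok, rng) if rng.random() < p else tok
                    for tok in sentence.split())


def word_edit_rate(hyp, ref):
    """Word-level Levenshtein distance normalised by reference length.
    A simplified stand-in for TER."""
    h, r = hyp.split(), ref.split()
    prev = list(range(len(r) + 1))
    for i, hw in enumerate(h, 1):
        cur = [i]
        for j, rw in enumerate(r, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (hw != rw)))   # substitution
        prev = cur
    return prev[-1] / max(len(r), 1)


def noise_invariance(translate, sentence, n_variants=10, p=0.1):
    """Average edit rate between the translation of the original sentence and
    translations of its noised variants (lower = more noise-invariant)."""
    reference = translate(sentence)
    variants = [noise_sentence(sentence, p=p, seed=k) for k in range(n_variants)]
    return sum(word_edit_rate(translate(v), reference) for v in variants) / n_variants
```

Given any translation callable (a stand-in for an NMT system), noise_invariance(translate, sentence) returns the mean edit rate across noised variants of the input; in the paper's terms, adversarial training roughly halves this quantity relative to a baseline trained on clean data, while noise_sentence illustrates how such adversarial examples can be generated for training-data augmentation.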

Related research

06/06/2019 · Robust Neural Machine Translation with Doubly Adversarial Inputs
Neural machine translation (NMT) often suffers from the vulnerability to...

05/31/2018 · On the Impact of Various Types of Noise on Neural Machine Translation
We examine how various types of noise in the parallel training data impa...

02/28/2020 · Robust Unsupervised Neural Machine Translation with Adversarial Training
Unsupervised neural machine translation (UNMT) has recently attracted gr...

06/23/2018 · On Adversarial Examples for Character-Level Neural Machine Translation
Evaluating on adversarial examples has become a standard procedure to me...

11/06/2017 · Synthetic and Natural Noise Both Break Neural Machine Translation
Character-based neural machine translation (NMT) models alleviate out-of...

06/19/2019 · Robust Machine Translation with Domain Sensitive Pseudo-Sources: Baidu-OSU WMT19 MT Robustness Shared Task System Report
This paper describes the machine translation system developed jointly by...

04/17/2018 · Adversarial Example Generation with Syntactically Controlled Paraphrase Networks
We propose syntactically controlled paraphrase networks (SCPNs) and use ...
