
Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English

by   Maha Elbayad, et al.

In this work, we conduct an evaluation study comparing offline and online neural machine translation architectures. We consider two sequence-to-sequence models: the convolutional Pervasive Attention model (Elbayad et al. 2018) and the attention-based Transformer (Vaswani et al. 2017). For both architectures, we investigate the impact of online decoding constraints on translation quality through a carefully designed human evaluation on the English-German and German-English language pairs, the latter being particularly sensitive to latency constraints. The evaluation results allow us to identify the strengths and shortcomings of each model when shifting to the online setup.
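Online (simultaneous) decoding constraints of the kind studied here are commonly instantiated as a wait-k policy: the decoder waits for the first k source tokens, then alternates between reading one source token and writing one target token. The abstract does not specify the exact policy used, so the following is a minimal illustrative sketch of the wait-k read/write schedule, not the paper's implementation:

```python
def wait_k_schedule(src_len: int, tgt_len: int, k: int) -> list[int]:
    """Return, for each target position t, how many source tokens must
    have been read before the decoder emits target token t under the
    wait-k policy: read k tokens first, then read/write alternately
    until the source is exhausted (after which decoding is offline).
    """
    return [min(src_len, k + t) for t in range(tgt_len)]


# Example: a 5-token source, 4-token target, k=2.
# The decoder sees 2, 3, 4, then all 5 source tokens.
print(wait_k_schedule(src_len=5, tgt_len=4, k=2))  # [2, 3, 4, 5]
```

With k at least the source length, the schedule degenerates to offline decoding (the full source is read before any target token is emitted), which makes the offline model a limiting case of the online one.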
