
Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English

06/01/2020
by Maha Elbayad, et al.

In this work, we conduct an evaluation study comparing offline and online neural machine translation architectures. We consider two sequence-to-sequence models: the convolutional Pervasive Attention model (Elbayad et al., 2018) and the attention-based Transformer (Vaswani et al., 2017). For both architectures, we investigate the impact of online decoding constraints on translation quality through a carefully designed human evaluation on the English-German and German-English language pairs, the latter being particularly sensitive to latency constraints. The evaluation results allow us to identify the strengths and shortcomings of each model when shifting to the online setup.
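The "online decoding constraints" mentioned above mean the decoder must begin emitting target tokens before the full source sentence is available. A common way to formalize this (an illustrative assumption here, not necessarily the exact policy evaluated in the paper) is a wait-k schedule: the model first reads k source tokens, then alternates writing one target token and reading one more source token. A minimal sketch:

```python
# Sketch of a wait-k read/write schedule for online (simultaneous) decoding.
# Hypothetical helper names; not the paper's implementation.

def wait_k_prefix(t, k, src_len):
    """Number of source tokens visible when emitting target position t (0-indexed)."""
    return min(k + t, src_len)

def simulate_schedule(k, src_len, tgt_len):
    """Visible source-prefix length at each target step under wait-k."""
    return [wait_k_prefix(t, k, src_len) for t in range(tgt_len)]

# With k=3, a 6-token source and 5 target steps, the decoder sees
# source prefixes of length 3, 4, 5, 6, 6 at successive steps.
print(simulate_schedule(3, 6, 5))
```

Lower k reduces latency but restricts the source context available for each target token, which is why language pairs with divergent word order, such as German-English, are particularly sensitive to these constraints.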

