
Automatically Extracting Challenge Sets for Non-local Phenomena in Neural Machine Translation

by Leshem Choshen, et al.

We show that the state-of-the-art Transformer Machine Translation (MT) model is not biased towards monotonic reordering (unlike previous recurrent neural network models), but that long-distance dependencies nevertheless remain a challenge for the model. Since most dependencies are short-distance, common evaluation metrics are little influenced by how well systems handle long-distance ones. We therefore propose an automatic approach for extracting challenge sets replete with long-distance dependencies, and argue that evaluation using this methodology provides a complementary perspective on system performance. To support our claim, we compile challenge sets for English-German and German-English that are much larger than any previously released challenge set for MT. The extracted sets are large enough to allow reliable automatic evaluation, which makes the proposed approach a scalable and practical solution for evaluating MT performance on the long tail of syntactic phenomena.
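The abstract does not spell out the extraction procedure, so the sketch below only illustrates the general idea: given a dependency-parsed corpus, keep the sentences that contain at least one long head-dependent arc. The representation (each token's head as an index), the `min_distance` threshold, and the function names are assumptions for illustration, not the authors' actual method.

```python
def max_dependency_distance(heads):
    """Given heads[i] = index of token i's syntactic head (i itself for
    the root), return the longest head-dependent distance in tokens."""
    return max(abs(i - h) for i, h in enumerate(heads))

def extract_challenge_set(parsed_corpus, min_distance=8):
    """Keep only sentences containing at least one dependency arc that
    spans min_distance or more token positions.

    parsed_corpus: iterable of (tokens, heads) pairs.
    """
    return [(tokens, heads) for tokens, heads in parsed_corpus
            if max_dependency_distance(heads) >= min_distance]

# Toy example: a short sentence with only local arcs, and a sentence
# where token 0 attaches to token 9 (a long-distance dependency).
short_sent = (["the", "cat", "sat"], [1, 2, 2])
long_sent = (["w%d" % i for i in range(10)], [9, 2, 3, 4, 5, 6, 7, 8, 9, 9])
challenge = extract_challenge_set([short_sent, long_sent], min_distance=8)
```

In practice the heads would come from an automatic dependency parser, and the surviving sentence pairs would form the challenge set on which MT systems are evaluated separately from the full test set.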
