Fine-grained Human Evaluation of Transformer and Recurrent Approaches to Neural Machine Translation for English-to-Chinese

06/15/2020
by   Yuying Ye, et al.

This research presents a fine-grained human evaluation to compare the Transformer and recurrent approaches to neural machine translation (NMT), on the translation direction English-to-Chinese. To this end, we develop an error taxonomy compliant with the Multidimensional Quality Metrics (MQM) framework that is customised to the relevant phenomena of this translation direction. We then conduct an error annotation using this customised error taxonomy on the output of state-of-the-art recurrent- and Transformer-based NMT systems on a subset of WMT2019's news test set. The resulting annotation shows that, compared to the best recurrent system, the best Transformer system results in a 31% reduction of the total number of errors and produces significantly fewer errors in 10 out of 22 error categories. We also note that two of the systems evaluated do not produce any errors for a category that was relevant for this translation direction prior to the advent of NMT systems: Chinese classifiers.
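
The comparison behind these figures amounts to tallying annotated errors per MQM category for each system and computing the relative reduction in the totals. Below is a minimal Python sketch of that tally, assuming annotations are stored as (system, error category) pairs; the data, names, and categories shown are illustrative, not the paper's actual annotation tooling.

```python
from collections import Counter

# Hypothetical MQM-style annotations: one (system, error_category) pair per
# annotated error. Sample data for illustration only.
annotations = [
    ("recurrent", "Mistranslation"),
    ("recurrent", "Omission"),
    ("recurrent", "Word order"),
    ("transformer", "Mistranslation"),
    ("transformer", "Omission"),
]

def error_counts(system):
    """Tally the number of annotated errors per category for one system."""
    return Counter(cat for sys_name, cat in annotations if sys_name == system)

recurrent_counts = error_counts("recurrent")
transformer_counts = error_counts("transformer")

# Overall relative reduction in the total error count (the paper reports a
# 31% reduction for the best Transformer vs. the best recurrent system).
total_recurrent = sum(recurrent_counts.values())
total_transformer = sum(transformer_counts.values())
reduction = 100 * (total_recurrent - total_transformer) / total_recurrent
print(f"Total error reduction: {reduction:.1f}%")

# Per-category comparison; the paper additionally tests per-category
# differences for statistical significance, which this sketch omits.
for category in sorted(set(recurrent_counts) | set(transformer_counts)):
    print(f"{category:15s} recurrent={recurrent_counts[category]:3d} "
          f"transformer={transformer_counts[category]:3d}")
```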

Related research

Quantitative Fine-Grained Human Evaluation of Machine Translation Systems: a Case Study on English to Croatian (02/02/2018)
This paper presents a quantitative fine-grained manual evaluation approa...

Evaluating Machine Translation Performance on Chinese Idioms with a Blacklist Method (11/21/2017)
Idiom translation is a challenging problem in machine translation becaus...

Upping the Ante: Towards a Better Benchmark for Chinese-to-English Machine Translation (05/04/2018)
There are many machine translation (MT) papers that propose novel approa...

SALTED: A Framework for SAlient Long-Tail Translation Error Detection (05/20/2022)
Traditional machine translation (MT) metrics provide an average measure ...

TranSFormer: Slow-Fast Transformer for Machine Translation (05/26/2023)
Learning multiscale Transformer models has been evidenced as a viable ap...

Machine Translation of Novels in the Age of Transformer (11/30/2020)
In this chapter we build a machine translation (MT) system tailored to t...

Automatically Extracting Challenge Sets for Non-local Phenomena in Neural Machine Translation (09/15/2019)
We show that the state-of-the-art Transformer MT model is not biased tow...
