
What Level of Quality can Neural Machine Translation Attain on Literary Text?

01/15/2018
by   Antonio Toral, et al.
University of Groningen
ADAPT Centre

Given the rise of a new approach to MT, Neural MT (NMT), and its promising performance on different text types, we assess the translation quality it can attain on what is perceived to be the greatest challenge for MT: literary text. Specifically, we target novels, arguably the most popular type of literary text. We build a literary-adapted NMT system for the English-to-Catalan translation direction and evaluate it against a system pertaining to the previous dominant paradigm in MT: statistical phrase-based MT (PBSMT). To this end, for the first time we train MT systems, both NMT and PBSMT, on large amounts of literary text (over 100 million words) and evaluate them on a set of twelve widely known novels spanning from the 1920s to the present day. According to the BLEU automatic evaluation metric, NMT is significantly better than PBSMT (p < 0.01) on all the novels considered. Overall, NMT results in an 11% relative improvement over PBSMT. A complementary human evaluation on three of the books shows that between 17% and 34% of the translations, depending on the book, produced by NMT (versus 8% and 20% with PBSMT) are perceived by native speakers of the target language to be of equivalent quality to translations produced by a professional human translator.
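The automatic evaluation above relies on BLEU, which scores a machine translation by its clipped n-gram overlap with a reference translation, scaled by a brevity penalty. As a rough illustration only (the study would have used a standard corpus-level implementation, not this sketch), a minimal segment-level BLEU-4 can be written in plain Python:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Segment-level BLEU: geometric mean of clipped n-gram precisions
    (n = 1..max_n) times a brevity penalty for short candidates."""
    if not candidate:
        return 0.0
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = ngrams(candidate, n)
        ref_counts = ngrams(reference, n)
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # geometric mean collapses if any precision is zero
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: candidates shorter than the reference are penalised.
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(log_avg)
```

Real MT evaluations aggregate n-gram statistics over the whole test set (corpus-level BLEU) rather than averaging segment scores, and typically smooth the zero-precision case; this sketch omits both for brevity.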


09/10/2021

Neural Machine Translation Quality and Post-Editing Performance

We test the natural expectation that using MT in professional translatio...
11/30/2020

Machine Translation of Novels in the Age of Transformer

In this chapter we build a machine translation (MT) system tailored to t...
11/09/2022

HilMeMe: A Human-in-the-Loop Machine Translation Evaluation Metric Looking into Multi-Word Expressions

With the fast development of Machine Translation (MT) systems, especiall...
03/15/2022

Can Synthetic Translations Improve Bitext Quality?

Synthetic translations have been used for a wide range of NLP tasks prim...
02/05/2021

Understanding Pre-Editing for Black-Box Neural Machine Translation

Pre-editing is the process of modifying the source text (ST) so that it ...
05/09/2022

CoCoA-MT: A Dataset and Benchmark for Contrastive Controlled MT with Application to Formality

The machine translation (MT) task is typically formulated as that of ret...
12/16/2021

Amortized Noisy Channel Neural Machine Translation

Noisy channel models have been especially effective in neural machine tr...