A Comparison of Approaches to Document-level Machine Translation

01/26/2021
by Zhiyi Ma, et al.

Document-level machine translation conditions on surrounding sentences to produce coherent translations. There has been much recent work in this area with the introduction of custom model architectures and decoding algorithms. This paper presents a systematic comparison of selected approaches from the literature on two benchmarks for which document-level phenomena evaluation suites exist. We find that a simple method based purely on back-translating monolingual document-level data performs as well as much more elaborate alternatives, both in terms of document-level metrics and human evaluation.
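The back-translation baseline the abstract highlights can be sketched as a simple pipeline: take monolingual documents on the target side, translate each sentence back into the source language with a reverse-direction model, and pair the synthetic source with the original target while preserving document order so sentence context survives for training. The sketch below illustrates this under stated assumptions: `backward_translate` is a hypothetical stand-in for a trained target-to-source NMT model (here it just reverses word order for illustration), not the authors' actual system.

```python
def backward_translate(sentence: str) -> str:
    """Hypothetical target->source model. A real pipeline would call a
    trained reverse-direction NMT model here; word reversal is only a
    placeholder so the sketch runs end to end."""
    return " ".join(reversed(sentence.split()))


def back_translate_document(target_doc: list[str]) -> list[tuple[str, str]]:
    """Build synthetic (source, target) pairs from one monolingual
    target-side document, keeping the original sentence order so the
    document context is preserved for context-aware training."""
    synthetic_pairs = []
    for target_sentence in target_doc:
        synthetic_source = backward_translate(target_sentence)
        synthetic_pairs.append((synthetic_source, target_sentence))
    return synthetic_pairs


# One monolingual target-side document; the ambiguous "bank" is the kind
# of word whose translation benefits from surrounding-sentence context.
doc = ["he sat by the river bank", "the water was cold"]
pairs = back_translate_document(doc)
```

The key design point is that pairs are emitted per document, not shuffled sentence-by-sentence, so a context-conditioned model trained on this synthetic data still sees coherent neighboring sentences.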


Related research

08/19/2022
Discourse Cohesion Evaluation for Document-Level Neural Machine Translation
It is well known that translations generated by an excellent document-le...

04/21/2021
On User Interfaces for Large-Scale Document-Level Human Evaluation of Machine Translation Outputs
Recent studies emphasize the need of document context in human evaluatio...

06/08/2023
On Search Strategies for Document-Level Neural Machine Translation
Compared to sentence-level systems, document-level neural machine transl...

10/08/2020
Leveraging Discourse Rewards for Document-Level Neural Machine Translation
Document-level machine translation focuses on the translation of entire ...

09/03/2019
Context-Aware Monolingual Repair for Neural Machine Translation
Modern sentence-level NMT systems often produce plausible translations o...

10/16/2022
Modeling Context With Linear Attention for Scalable Document-Level Translation
Document-level machine translation leverages inter-sentence dependencies...

06/15/2020
DynE: Dynamic Ensemble Decoding for Multi-Document Summarization
Sequence-to-sequence (s2s) models are the basis for extensive work in na...
