Revisiting Context Choices for Context-aware Machine Translation

09/07/2021
by Matīss Rikters, et al.

One of the most popular approaches to context-aware machine translation (MT) is to use separate encoders for the source sentence and its context, treating them as multiple sources for a single target sentence. Recent work has cast doubt on whether these models actually learn useful signals from the context, or whether the improvements in automatic evaluation metrics are merely a side effect. We show that multi-source transformer models improve MT over standard transformer-base models even when given empty lines as context, but that translation quality improves significantly (by 1.51–2.65 BLEU) when a sufficient amount of correct context is provided. We also show that although randomly shuffled in-domain context can improve over the baselines, correct context improves translation quality further, while random out-of-domain context degrades it.
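The experimental conditions compared in the abstract (empty context, correct preceding-sentence context, and randomly shuffled in-domain context) can be illustrated with a minimal data-preparation sketch. The function and variable names below are hypothetical, not from the paper's code; this only shows how the (source, context) input pairs for a two-encoder multi-source model might be built under each condition.

```python
import random

def make_multisource_examples(sentences, mode="correct", seed=0):
    """Pair each source sentence with a context line under one of the
    conditions compared in the paper:
      - "correct":  the true preceding sentence (empty for the first one)
      - "empty":    an empty context line
      - "shuffled": a random in-domain sentence, ignoring document order
    Returns (source, context) pairs to feed the two encoders of a
    multi-source translation model.
    """
    rng = random.Random(seed)
    pairs = []
    for i, src in enumerate(sentences):
        if mode == "correct":
            ctx = sentences[i - 1] if i > 0 else ""
        elif mode == "empty":
            ctx = ""
        elif mode == "shuffled":
            ctx = rng.choice(sentences)
        else:
            raise ValueError(f"unknown mode: {mode}")
        pairs.append((src, ctx))
    return pairs

doc = ["A man walks in.", "He sits down.", "He orders tea."]
print(make_multisource_examples(doc, mode="correct"))
```

Under the "correct" condition the first sentence receives an empty context line, mirroring how a document's opening sentence has no preceding context at inference time.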


