MTCue: Learning Zero-Shot Control of Extra-Textual Attributes by Leveraging Unstructured Context in Neural Machine Translation

05/25/2023
by   Sebastian Vincent, et al.

Efficient utilisation of both intra- and extra-textual context remains one of the critical gaps between machine and human translation. Existing research has primarily focused on providing individual, well-defined types of context in translation, such as the surrounding text or discrete external variables like the speaker's gender. This work introduces MTCue, a novel neural machine translation (NMT) framework that interprets all context (including discrete variables) as text. MTCue learns an abstract representation of context, enabling transferability across different data settings and the leveraging of similar attributes in low-resource scenarios. Focusing on a dialogue domain with access to document and metadata context, we extensively evaluate MTCue on four language pairs in both translation directions. Our framework demonstrates significant improvements in translation quality over a parameter-matched non-contextual baseline, as measured by BLEU (+0.88) and COMET (+1.58). Moreover, MTCue significantly outperforms a "tagging" baseline at translating English text. Analysis reveals that the context encoder of MTCue learns a representation space that organises context based on specific attributes, such as formality, enabling effective zero-shot control. Pre-training on context embeddings also improves MTCue's few-shot performance compared to the "tagging" baseline. Finally, an ablation study conducted on model components and contextual variables further supports the robustness of MTCue for context-based NMT.
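The core idea, as the abstract describes it, is that every context signal, whether a discrete variable or free-form metadata, is interpreted as text before reaching the model. A minimal sketch of that verbalisation step is below; the function name, attribute names, and cue format are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of MTCue's "all context as text" idea: discrete
# attributes and surrounding sentences are flattened into textual cues
# that a shared context encoder could then embed.

def verbalise_context(metadata, document_context):
    """Turn discrete attributes and preceding sentences into a flat
    list of textual context cues (illustrative format)."""
    cues = []
    # Discrete variables become short "attribute: value" strings,
    # sorted for a deterministic cue order.
    for attribute, value in sorted(metadata.items()):
        cues.append(f"{attribute}: {value}")
    # Surrounding document context is passed through as plain text.
    cues.extend(document_context)
    return cues

cues = verbalise_context(
    {"speaker gender": "female", "formality": "informal"},
    ["Hi! How have you been?"],
)
print(cues)
```

Because discrete variables and running text end up in the same representation space, a single encoder can, in principle, generalise from attributes seen in training to related unseen ones, which is what enables the zero-shot control the abstract reports.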


Related research

03/10/2021 · Self-Learning for Zero Shot Neural Machine Translation
Neural Machine Translation (NMT) approaches employing monolingual data a...

06/04/2019 · Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations
Zero-shot translation, translating between language pairs on which a Neu...

12/03/2019 · Cross-lingual Pre-training Based Transfer for Zero-shot Neural Machine Translation
Transfer learning between different language pairs has shown its effecti...

05/10/2022 · Controlling Extra-Textual Attributes about Dialogue Participants: A Case Study of English-to-Polish Neural Machine Translation
Unlike English, morphologically rich languages can reveal characteristic...

02/11/2021 · Towards Personalised and Document-level Machine Translation of Dialogue
State-of-the-art (SOTA) neural machine translation (NMT) systems transla...

08/26/2018 · Contextual Parameter Generation for Universal Neural Machine Translation
We propose a simple modification to existing neural machine translation ...

04/21/2020 · Contextual Neural Machine Translation Improves Translation of Cataphoric Pronouns
The advent of context-aware NMT has resulted in promising improvements i...
