Divide and Rule: Training Context-Aware Multi-Encoder Translation Models with Little Resources

03/31/2021
by   Lorenzo Lupo, et al.

Multi-encoder models are a broad family of context-aware Neural Machine Translation (NMT) systems that aim to improve translation quality by encoding document-level contextual information alongside the current sentence. The context encoding is carried out by contextual parameters, trained on document-level data. In this work, we show that training these parameters requires a large amount of data, since the contextual training signal is sparse. We propose an efficient alternative based on splitting sentence pairs, which enriches the training signal of a set of parallel sentences by breaking intra-sentential syntactic links, thus frequently pushing the model to search the context for disambiguating clues. We evaluate our approach with BLEU and contrastive test sets, showing that it allows multi-encoder models to achieve performance comparable to a setting where they are trained with 10× the document-level data. We also show that our approach is a viable option for context-aware NMT on language pairs with zero document-level parallel data.
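The core idea of the approach can be sketched as follows. Given a parallel corpus of sentence pairs, each pair is split into aligned fragments; a fragment then serves as the "current sentence" while the preceding fragments act as its context, so that broken syntactic links force the model to rely on the context encoder. This is a minimal illustrative sketch assuming a simple proportional token split; the paper's actual splitting criterion and data format may differ.

```python
def split_pair(src, tgt, k=2):
    """Split a parallel sentence pair into k aligned fragments by
    proportional token position (an illustrative heuristic, not
    necessarily the splitting rule used in the paper)."""
    s, t = src.split(), tgt.split()
    frags = []
    for i in range(k):
        s_frag = s[len(s) * i // k : len(s) * (i + 1) // k]
        t_frag = t[len(t) * i // k : len(t) * (i + 1) // k]
        frags.append((" ".join(s_frag), " ".join(t_frag)))
    return frags

def enrich(corpus, k=2):
    """Turn each sentence pair into k training examples. Each fragment's
    context is the preceding source fragments of the same original
    sentence, so resolving the broken intra-sentential links requires
    attending to the context encoder's input."""
    examples = []
    for src, tgt in corpus:
        frags = split_pair(src, tgt, k)
        for i, (s_frag, t_frag) in enumerate(frags):
            ctx = " ".join(f[0] for f in frags[:i])
            examples.append({"context": ctx, "source": s_frag, "target": t_frag})
    return examples
```

With k=2, a corpus of N sentence pairs yields 2N sub-sentential examples, densifying the contextual training signal without any additional document-level data.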


