Focused Concatenation for Context-Aware Neural Machine Translation

10/24/2022
by   Lorenzo Lupo, et al.

A straightforward approach to context-aware neural machine translation is to feed the standard encoder-decoder architecture with a window of consecutive sentences, formed by concatenating the current sentence with a number of sentences from its context. In this work, we propose an improved concatenation approach that encourages the model to focus on the translation of the current sentence by discounting the loss generated by the target-side context. We also propose an additional improvement that strengthens the notion of sentence boundaries and of relative sentence distance, facilitating the model's compliance with the context-discounted objective. We evaluate our approach with both average translation-quality metrics and contrastive test sets for the translation of inter-sentential discourse phenomena, showing its superiority over the vanilla concatenation approach and over other, more sophisticated context-aware systems.
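The core idea of the context-discounted objective can be sketched as a weighted per-token loss over the concatenated target window: tokens belonging to context sentences contribute with a weight smaller than one, while tokens of the current sentence keep full weight. The sketch below is illustrative only; the function name, the discount value, and the mask construction are assumptions, not the paper's exact recipe.

```python
def context_discounted_loss(token_nlls, context_mask, discount=0.5):
    """Weighted mean of per-token negative log-likelihoods over a
    concatenated target window.

    token_nlls   -- per-token NLL values for the whole target window
    context_mask -- True for tokens in context sentences, False for
                    tokens in the current sentence
    discount     -- weight applied to context-token losses (< 1);
                    the value 0.5 is an illustrative assumption
    """
    weights = [discount if is_ctx else 1.0 for is_ctx in context_mask]
    total = sum(w * nll for w, nll in zip(weights, token_nlls))
    return total / sum(weights)
```

With `discount=1.0` this reduces to the vanilla concatenation objective, and with `discount=0.0` only the current sentence contributes to the loss, so the hyperparameter interpolates between the two regimes.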


research
09/02/2019

Improving Context-aware Neural Machine Translation with Target-side Context

In recent years, several studies on neural machine translation (NMT) hav...
research
02/13/2023

Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation

Context-aware translation can be achieved by processing a concatenation ...
research
10/04/2018

A Large-Scale Test Set for the Evaluation of Context-Aware Pronoun Translation in Neural Machine Translation

The translation of pronouns presents a special challenge to machine tran...
research
11/01/2017

Evaluating Discourse Phenomena in Neural Machine Translation

For machine translation to tackle discourse phenomena, models must have ...
research
05/25/2018

Context-Aware Neural Machine Translation Learns Anaphora Resolution

Standard machine translation systems process sentences in isolation and ...
research
08/16/2019

Bidirectional Context-Aware Hierarchical Attention Network for Document Understanding

The Hierarchical Attention Network (HAN) has made great strides, but it ...
research
07/14/2021

Surgical Instruction Generation with Transformers

Automatic surgical instruction generation is a prerequisite towards intr...
