Neural Machine Translation with Extended Context

08/20/2017
by Jörg Tiedemann, et al.

We investigate the use of extended context in attention-based neural machine translation. We base our experiments on translated movie subtitles and discuss the effect of increasing the segments beyond single translation units. We study the use of extended source language context as well as bilingual context extensions. The models learn to distinguish between information from different segments and are surprisingly robust with respect to translation quality. In this pilot study, we observe interesting cross-sentential attention patterns that improve textual coherence in translation at least in some selected cases.
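The extended-context setup described above can be illustrated with a small data-preparation sketch. This is a minimal illustration, not the authors' code: the `<BREAK>` separator token and the function names are assumptions chosen for clarity. In the source-context variant, previous source sentences are concatenated to the current one while the target stays a single sentence; in the bilingual variant, both sides are extended.

```python
# Illustrative sketch of extended-context training pairs for NMT.
# The "<BREAK>" segment-boundary token and function names are
# hypothetical; the paper's exact preprocessing may differ.

BREAK = "<BREAK>"

def extend_source(src_sents, tgt_sents, n_prev=1):
    """Source-context extension: prepend up to n_prev previous
    source sentences to each source segment, separated by the
    boundary token; targets remain single sentences."""
    pairs = []
    for i, (src, tgt) in enumerate(zip(src_sents, tgt_sents)):
        context = src_sents[max(0, i - n_prev):i]
        extended_src = f" {BREAK} ".join(context + [src])
        pairs.append((extended_src, tgt))
    return pairs

def extend_bilingual(src_sents, tgt_sents, n_prev=1):
    """Bilingual context extension: extend both source and target
    segments, so the model also (re)generates the previous target
    sentence before the current one."""
    pairs = []
    for i, (src, tgt) in enumerate(zip(src_sents, tgt_sents)):
        src_ctx = src_sents[max(0, i - n_prev):i]
        tgt_ctx = tgt_sents[max(0, i - n_prev):i]
        pairs.append((f" {BREAK} ".join(src_ctx + [src]),
                      f" {BREAK} ".join(tgt_ctx + [tgt])))
    return pairs
```

Because the boundary token marks where segments meet, the model can learn to separate information coming from the context sentence from the current translation unit, which is what makes the cross-sentential attention patterns mentioned above observable.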


Related research

06/07/2016  Can neural machine translation do simultaneous translation?
08/10/2017  Neural and Statistical Methods for Leveraging Meta-information in Machine Translation
07/05/2016  Target-Side Context for Discriminative Models in Statistical Machine Translation
05/25/2018  Context-Aware Neural Machine Translation Learns Anaphora Resolution
11/25/2016  Neural Machine Translation with Latent Semantic of Image and Text
02/20/2021  Understanding and Enhancing the Use of Context for Machine Translation
12/19/2016  An Empirical Study of Adequate Vision Span for Attention-Based Neural Machine Translation
