Does Neural Machine Translation Benefit from Larger Context?

04/17/2017
by Sébastien Jean et al.

We propose a neural machine translation architecture that models the surrounding text in addition to the source sentence. This model leads to better performance, both in terms of general translation quality and pronoun prediction, when trained on small corpora, although the improvement largely disappears when training on a larger corpus. We also discover that attention-based neural machine translation is well suited to pronoun prediction and compares favorably with approaches designed specifically for that task.
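The abstract describes the architecture only at a high level. As a concrete illustration, here is a minimal PyTorch sketch of one way to condition an attention-based decoder on surrounding text as well as on the source sentence. All names (DualAttention, LargerContextDecoderStep, ctx_annot) and design details (two Bahdanau-style additive attention modules over separate encoders, a GRU decoder) are assumptions for illustration, not the paper's exact model.

```python
# Sketch: a decoder step that attends over both the source sentence and
# its surrounding context. Hypothetical design, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualAttention(nn.Module):
    """Additive (Bahdanau-style) attention over one set of annotations."""
    def __init__(self, enc_dim, dec_dim, attn_dim):
        super().__init__()
        self.w_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.w_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, annotations):
        # dec_state: (batch, dec_dim); annotations: (batch, len, enc_dim)
        scores = self.v(torch.tanh(self.w_enc(annotations)
                                   + self.w_dec(dec_state).unsqueeze(1)))
        weights = F.softmax(scores.squeeze(-1), dim=-1)  # (batch, len)
        # Weighted sum of annotations -> one context vector per example.
        return torch.bmm(weights.unsqueeze(1), annotations).squeeze(1)

class LargerContextDecoderStep(nn.Module):
    """One decoder step conditioned on the source AND the surrounding text."""
    def __init__(self, emb_dim, enc_dim, dec_dim, attn_dim, vocab):
        super().__init__()
        self.src_attn = DualAttention(enc_dim, dec_dim, attn_dim)
        self.ctx_attn = DualAttention(enc_dim, dec_dim, attn_dim)
        self.rnn = nn.GRUCell(emb_dim + 2 * enc_dim, dec_dim)
        self.out = nn.Linear(dec_dim, vocab)

    def forward(self, prev_emb, dec_state, src_annot, ctx_annot):
        c_src = self.src_attn(dec_state, src_annot)  # source context vector
        c_ctx = self.ctx_attn(dec_state, ctx_annot)  # surrounding-text vector
        dec_state = self.rnn(torch.cat([prev_emb, c_src, c_ctx], -1), dec_state)
        return F.log_softmax(self.out(dec_state), dim=-1), dec_state

# Toy usage: batch of 2, source of 7 tokens, context of 21 tokens.
step = LargerContextDecoderStep(emb_dim=32, enc_dim=64, dec_dim=128,
                                attn_dim=64, vocab=1000)
log_probs, new_state = step(torch.randn(2, 32), torch.randn(2, 128),
                            torch.randn(2, 7, 64), torch.randn(2, 21, 64))
```

Keeping a separate attention module for the context lets the decoder weight the source sentence and the surrounding text independently at each step, so it can largely ignore the context when it is uninformative.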
