Exploiting Sentential Context for Neural Machine Translation

06/04/2019
by Xing Wang et al.

In this work, we present novel approaches to exploit sentential context for neural machine translation (NMT). Specifically, we first show that a shallow sentential context, extracted from the top encoder layer alone, can improve translation performance by contextualizing the encoding representations of individual words. Next, we introduce a deep sentential context, which aggregates the sentential context representations from all internal layers of the encoder to form a more comprehensive context representation. Experimental results on the WMT14 English-to-German and English-to-French benchmarks show that our model consistently improves performance over the strong TRANSFORMER model (Vaswani et al., 2017), demonstrating the necessity and effectiveness of exploiting sentential context for NMT.
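The two variants described above can be sketched in PyTorch. This is a minimal illustration, not the paper's exact architecture: it assumes mean pooling to form each layer's sentence vector, a learned softmax mix over layers for the deep variant, and a simple gated fusion with the top-layer word states (the paper's actual aggregation and fusion functions may differ).

```python
import torch
import torch.nn as nn


class SententialContext(nn.Module):
    """Sketch of shallow vs. deep sentential context for a Transformer encoder.

    Assumptions (illustrative, not from the paper): mean pooling builds each
    layer's sentence vector; the deep variant mixes layers with learned
    softmax weights; a sigmoid gate fuses context into the word states.
    """

    def __init__(self, d_model: int, n_layers: int, deep: bool = True):
        super().__init__()
        self.deep = deep
        # Learned per-layer mixing weights (used only in the deep variant).
        self.layer_weights = nn.Parameter(torch.zeros(n_layers))
        # Gate deciding how much context to inject into each word state.
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, layer_states):
        # layer_states: list of [batch, seq, d_model], one tensor per encoder layer.
        # Mean-pool each layer into a sentence vector: [n_layers, batch, d_model].
        sent_vecs = torch.stack([h.mean(dim=1) for h in layer_states])
        if self.deep:
            # Deep context: weighted aggregation over all encoder layers.
            w = torch.softmax(self.layer_weights, dim=0)
            context = (w[:, None, None] * sent_vecs).sum(dim=0)
        else:
            # Shallow context: sentence vector from the top layer only.
            context = sent_vecs[-1]
        top = layer_states[-1]                       # [batch, seq, d_model]
        ctx = context.unsqueeze(1).expand_as(top)    # broadcast over positions
        g = torch.sigmoid(self.gate(torch.cat([top, ctx], dim=-1)))
        # Contextualized word representations handed to the decoder.
        return g * top + (1 - g) * ctx
```

In this sketch, setting `deep=False` reproduces the shallow variant (top-layer context only), while `deep=True` aggregates sentence vectors from every encoder layer before fusing them into the word representations.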


Related research

05/16/2018 · Towards Robust Neural Machine Translation
Small perturbations in the input can severely distort intermediate repre...

10/24/2018 · Exploiting Deep Representations for Neural Machine Translation
Advanced neural machine translation (NMT) models generally implement enc...

10/06/2020 · Data Rejuvenation: Exploiting Inactive Training Examples for Neural Machine Translation
Large-scale training datasets lie at the core of the recent success of n...

11/03/2020 · Layer-Wise Multi-View Learning for Neural Machine Translation
Traditional neural machine translation is limited to the topmost encoder...

11/01/2017 · Evaluating Discourse Phenomena in Neural Machine Translation
For machine translation to tackle discourse phenomena, models must have ...

11/22/2019 · Neuron Interaction Based Representation Composition for Neural Machine Translation
Recent NLP studies reveal that substantial linguistic information can be...

01/26/2022 · Learning to Recommend Method Names with Global Context
In programming, the names for the program entities, especially for the m...
