
Modeling Future Cost for Neural Machine Translation

by Chaoqun Duan et al.

Existing neural machine translation (NMT) systems use sequence-to-sequence neural networks to generate the target translation word by word, training the model to make each generated word as consistent as possible with its counterpart in the reference. However, the trained translation model tends to focus on the accuracy of the target word generated at the current time-step and does not consider its future cost, i.e., the expected cost of generating the subsequent target translation (the next target word). To address this issue, we propose a simple and effective method to model the future cost of each target word for NMT systems. Specifically, a time-dependent future cost is estimated from the currently generated target word and its contextual information, and this estimate is used to boost the training of the NMT model. Furthermore, the future context representation learned at the current time-step is used to help generate the next target word during decoding. Experimental results on three widely used translation datasets, WMT14 German-to-English, WMT14 English-to-French, and WMT17 Chinese-to-English, show that the proposed approach achieves significant improvements over strong Transformer-based NMT baselines.
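The idea in the abstract can be sketched with a toy auxiliary loss. This is a minimal illustration, not the paper's actual architecture: the random decoder states, the embeddings, the linear map `W`, and the `future_context` helper are all assumptions made for the example. It shows the two ingredients the abstract describes: a time-dependent "future context" vector computed from the current step, and an auxiliary cost that ties it to the next target word during training.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8   # toy hidden/embedding size
T = 5   # target sentence length

# Stand-ins for a trained NMT decoder's hidden states and target embeddings.
hidden = rng.standard_normal((T, d))
tgt_emb = rng.standard_normal((T, d))

# Hypothetical future-cost estimator: a single linear map (plus tanh) from the
# current decoder state and current target embedding to a future context c_t.
W = rng.standard_normal((2 * d, d)) * 0.1

def future_context(h_t, y_t):
    """Estimate a future context vector from the current time-step."""
    return np.tanh(np.concatenate([h_t, y_t]) @ W)

# Auxiliary training loss: how poorly c_t predicts the *next* target word's
# embedding -- a simple proxy for the "future cost" of the current word.
contexts = []
future_loss = 0.0
for t in range(T - 1):
    c_t = future_context(hidden[t], tgt_emb[t])
    contexts.append(c_t)
    future_loss += np.mean((c_t - tgt_emb[t + 1]) ** 2)
future_loss /= (T - 1)

print(float(future_loss))
```

In the paper's setting this auxiliary term would be added to the usual cross-entropy loss, and at decoding time `c_t` would be fed back (e.g., concatenated with the decoder state before the output softmax) to help generate word t+1.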



