Future-Prediction-Based Model for Neural Machine Translation

09/02/2018
by   Bingzhen Wei, et al.

We propose a novel model for Neural Machine Translation (NMT). Unlike conventional approaches, our model predicts the length and words of the yet-untranslated text at each decoding time step, so that generation is guided by information from this future prediction. With this information, the model avoids stopping generation before it has translated enough of the source content. Experimental results demonstrate that our model significantly outperforms the baseline models. Furthermore, our analysis shows that the model predicts the length and words of the untranslated content effectively.
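The idea of letting a future-prediction signal veto a premature stop can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: `decode`, `step_fn`, `remaining_len_fn`, and the `block_eos` flag are all hypothetical names, and the toy functions stand in for the real decoder and the auxiliary future-length head.

```python
def decode(step_fn, remaining_len_fn, eos, max_len=20):
    """Greedy decoding where a (hypothetical) future-length head can
    veto an end-of-sequence token while untranslated content remains."""
    tokens = []
    for _ in range(max_len):
        tok = step_fn(tokens)
        # If the decoder wants to stop but the future-prediction head
        # still estimates untranslated content, block EOS and continue.
        if tok == eos and remaining_len_fn(tokens) > 0:
            tok = step_fn(tokens, block_eos=True)
        tokens.append(tok)
        if tok == eos:
            break
    return tokens


# Toy stand-ins for the decoder and the future-length predictor.
target = ["the", "cat", "sat", "<eos>"]

def toy_step(tokens, block_eos=False):
    i = len(tokens)
    if i == 2 and not block_eos:
        return "<eos>"  # a premature stop the future head should veto
    return target[i]

def toy_remaining(tokens):
    # Predicted number of words still to be generated (excluding EOS).
    return len(target) - 1 - len(tokens)


print(decode(toy_step, toy_remaining, "<eos>"))
# → ['the', 'cat', 'sat', '<eos>']
```

In this sketch the decoder attempts to emit `<eos>` after only two words, but the future-length head predicts one more word of untranslated content, so the stop is blocked and the full sequence is produced.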

