Boosting Neural Machine Translation

12/19/2016
by Dakun Zhang, et al.

Training efficiency is one of the main challenges for Neural Machine Translation (NMT). Deep networks require very large amounts of data as well as many training iterations to achieve state-of-the-art performance. This results in very high computation cost, slowing down research and industrialisation. In this paper, we propose to alleviate this problem with several training methods based on data boosting and bootstrapping, with no modifications to the neural network. The approach imitates the learning process of humans, who typically spend more time on "difficult" concepts than on easier ones. We experiment on an English-French translation task, showing accuracy improvements of up to 1.63 BLEU while saving 20
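The boosting idea described above — spending more training effort on "difficult" examples — can be sketched as loss-weighted resampling of the corpus between epochs. This is a minimal illustrative sketch, not the paper's exact recipe; the function name `boosted_epoch` and the proportional-to-loss weighting are assumptions:

```python
import random

def boosted_epoch(examples, losses, rng=None):
    """Build one epoch's training corpus by sampling with replacement,
    weighting each sentence pair by its current per-sentence loss so that
    "difficult" (high-loss) pairs are seen more often.

    Illustrative boosting-style sketch; the exact selection rule in the
    paper may differ.
    """
    rng = rng or random.Random(42)
    total = sum(losses)
    # Normalised weights: harder examples get a larger sampling probability.
    weights = [loss / total for loss in losses]
    return rng.choices(examples, weights=weights, k=len(examples))
```

Because the network itself is untouched, such a scheme plugs in front of any NMT trainer: after each epoch, score the corpus with the current model and rebuild the next epoch's data with `boosted_epoch`.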


