A Preordered RNN Layer Boosts Neural Machine Translation in Low Resource Settings

12/28/2021
by Mohaddeseh Bastan, et al.

Neural Machine Translation (NMT) models are strong enough to convey semantic and syntactic information from the source language to the target language. However, these models suffer from needing a large amount of data to learn their parameters. As a result, for languages with scarce data, these models are at risk of underperforming. We propose to augment an attention-based neural network with reordering information to alleviate the lack of data. This augmentation improves the translation quality for both English to Persian and Persian to English by up to 6% BLEU.
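To make the idea concrete, here is a minimal PyTorch sketch of one way such an augmentation could look: a standard recurrent encoder over the source sentence, plus a second RNN run over a preordered (target-like) permutation of the same tokens, with both sets of hidden states exposed to the decoder's attention. The class name, the use of bidirectional GRUs, the concatenation of the two encodings, and all layer sizes are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class PreorderedEncoder(nn.Module):
    """Illustrative sketch (not the paper's exact model): encode the source
    sentence twice, once in its original order and once in a preordered
    (target-like) order supplied by an external reorderer, and concatenate
    the two state sequences as the attention memory."""

    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Two separate RNNs: one over the source order, one over the preorder.
        self.src_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.reord_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src_ids, order):
        # src_ids: (batch, seq_len) token ids in source-language order
        # order:   (batch, seq_len) permutation giving the preordered word
        #          order, e.g. produced by an external preordering model
        emb = self.embed(src_ids)
        # Gather embeddings into the preordered sequence.
        preordered = torch.gather(emb, 1, order.unsqueeze(-1).expand_as(emb))
        src_states, _ = self.src_rnn(emb)
        reord_states, _ = self.reord_rnn(preordered)
        # Joint memory over which the decoder's attention would operate.
        return torch.cat([src_states, reord_states], dim=-1)

# Tiny usage example; a random permutation stands in for a real preorderer.
enc = PreorderedEncoder(vocab_size=1000)
src = torch.randint(0, 1000, (2, 7))
order = torch.stack([torch.randperm(7) for _ in range(2)])
memory = enc(src, order)
print(memory.shape)  # torch.Size([2, 7, 2048])
```

The design intuition is that the preordered pass gives the attention mechanism a view of the source already arranged roughly in target order, so the model needs less parallel data to learn long-distance reorderings on its own.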
