A Preordered RNN Layer Boosts Neural Machine Translation in Low Resource Settings

12/28/2021
by Mohaddeseh Bastan, et al.

Neural Machine Translation (NMT) models are expressive enough to convey semantic and syntactic information from the source language to the target language. However, these models suffer from needing a large amount of data to learn their parameters, so for languages with scarce data they risk underperforming. We propose to augment an attention-based neural network with reordering information to alleviate the lack of data. This augmentation improves the translation quality for both English to Persian and Persian to English by up to 6%.
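To make the idea concrete, below is a minimal sketch (not the authors' implementation) of one way an attention-based encoder could be augmented with an extra RNN layer that reads the source in a preordered, target-like word order. The class name, layer sizes, and the assumption that the preordering is supplied as a per-sentence permutation of source indices are all illustrative.

```python
# Hypothetical sketch: encoder states combine the source read in its original
# order with the same source read in a preordered (target-like) order.
import torch
import torch.nn as nn

class PreorderedEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Standard bidirectional RNN over the source in its original order.
        self.base_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Extra RNN over the preordered source (the illustrative addition).
        self.preorder_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(4 * hid_dim, 2 * hid_dim)

    def forward(self, src_ids, permutation):
        # src_ids: (B, L) token ids; permutation: (B, L) gives, for each
        # preordered position, the original source index placed there
        # (assumed to come from an external preordering model).
        emb = self.embed(src_ids)                        # (B, L, E)
        base_states, _ = self.base_rnn(emb)              # (B, L, 2H)
        # Reorder embeddings into target-like order for the second RNN.
        idx = permutation.unsqueeze(-1).expand_as(emb)   # (B, L, E)
        pre_states, _ = self.preorder_rnn(torch.gather(emb, 1, idx))
        # Scatter the preordered states back to original positions so both
        # views of each source token align at the same index for attention.
        back = permutation.unsqueeze(-1).expand_as(pre_states)
        realigned = torch.zeros_like(pre_states).scatter_(1, back, pre_states)
        return self.proj(torch.cat([base_states, realigned], dim=-1))
```

A quick usage example with a random stand-in permutation:

```python
enc = PreorderedEncoder(vocab_size=1000)
src = torch.randint(0, 1000, (2, 7))
perm = torch.stack([torch.randperm(7) for _ in range(2)])  # stand-in preordering
states = enc(src, perm)  # (2, 7, 512), fed to the attention decoder
```

Re-aligning the preordered states to the original token positions is one design choice that keeps a standard attention decoder unchanged; the paper's actual integration may differ.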

Related research:

08/30/2019 · Handling Syntactic Divergence in Low-resource Machine Translation
Despite impressive empirical successes of neural machine translation (NM...

01/26/2018 · Context Models for OOV Word Translation in Low-Resource Languages
Out-of-vocabulary word translation is a major problem for the translatio...

05/18/2022 · Data Augmentation to Address Out-of-Vocabulary Problem in Low-Resource Sinhala-English Neural Machine Translation
Out-of-Vocabulary (OOV) is a problem for Neural Machine Translation (NMT...

04/21/2021 · Should we Stop Training More Monolingual Models, and Simply Use Machine Translation Instead?
Most work in NLP makes the assumption that it is desirable to develop so...

12/30/2020 · Synthetic Source Language Augmentation for Colloquial Neural Machine Translation
Neural machine translation (NMT) is typically domain-dependent and style...

06/07/2021 · Lexicon Learning for Few-Shot Neural Sequence Modeling
Sequence-to-sequence transduction is the core problem in language proces...

02/03/2017 · Predicting Target Language CCG Supertags Improves Neural Machine Translation
Neural machine translation (NMT) models are able to partially learn synt...
