Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation

11/02/2018
by Xing Niu, et al.

We aim to better exploit the limited amounts of parallel text available in low-resource settings by introducing a differentiable reconstruction loss for neural machine translation (NMT). We reconstruct the input from sampled translations and leverage differentiable sampling and bi-directional NMT to build a compact model that can be trained end-to-end. This approach achieves small but consistent BLEU improvements on four language pairs in both translation directions, and outperforms an alternative differentiable reconstruction strategy based on hidden states.
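The key ingredient described above is differentiable sampling: translations are sampled from the forward model in a way that keeps gradients flowing back through the reconstruction loss. A common relaxation for this is Gumbel-softmax, which replaces hard one-hot token samples with soft probability vectors. The sketch below illustrates the idea in NumPy; the sizes, names, and the use of Gumbel-softmax specifically are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=1.0):
    # Differentiable approximation of sampling a one-hot token:
    # add Gumbel noise, then apply a temperature-scaled softmax.
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = np.exp(y - y.max(axis=-1, keepdims=True))
    return y / y.sum(axis=-1, keepdims=True)

# Toy setup (all sizes hypothetical): 5-token target vocab, 4-dim embeddings.
vocab, dim = 5, 4
logits = rng.normal(size=(3, vocab))           # forward model's per-step logits
emb = rng.normal(size=(vocab, dim))            # embedding table shared by both directions

soft_tokens = gumbel_softmax(logits, tau=0.5)  # (3, vocab); each row sums to 1
soft_embeds = soft_tokens @ emb                # expected embeddings, still differentiable

print(soft_tokens.shape, np.allclose(soft_tokens.sum(axis=-1), 1.0))
```

The soft embeddings `soft_embeds` would then be fed to the reverse-direction model, whose cross-entropy against the original input serves as the reconstruction loss; because every step is differentiable, both directions can be trained end-to-end.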


Related research

- 08/18/2017: Neural machine translation for low-resource languages. "Neural machine translation (NMT) approaches have improved the state of t..."
- 08/25/2023: Ngambay-French Neural Machine Translation (sba-Fr). "In Africa, and the world at large, there is an increasing focus on devel..."
- 05/29/2018: Bi-Directional Neural Machine Translation with Synthetic Parallel Data. "Despite impressive progress in high-resource settings, Neural Machine Tr..."
- 11/12/2021: BitextEdit: Automatic Bitext Editing for Improved Low-Resource Machine Translation. "Mined bitexts can contain imperfect translations that yield unreliable t..."
- 07/24/2023: Joint Dropout: Improving Generalizability in Low-Resource Neural Machine Translation through Phrase Pair Variables. "Despite the tremendous success of Neural Machine Translation (NMT), its ..."
- 05/13/2018: Triangular Architecture for Rare Language Translation. "Neural Machine Translation (NMT) performs poor on the low-resource langu..."
- 06/06/2022: Bi-SimCut: A Simple Strategy for Boosting Neural Machine Translation. "We introduce Bi-SimCut: a simple but effective training strategy to boos..."
