Meta-Learning for Low-Resource Neural Machine Translation

08/25/2018
by Jiatao Gu, et al.

In this paper, we propose to extend the recently introduced model-agnostic meta-learning algorithm (MAML) to low-resource neural machine translation (NMT). We frame low-resource translation as a meta-learning problem, and we learn to adapt to low-resource languages based on multilingual high-resource language tasks. We use the universal lexical representation (Gu et al., 2018) to overcome the input-output mismatch across different languages. We evaluate the proposed meta-learning strategy using eighteen European languages (Bg, Cs, Da, De, El, Es, Et, Fr, Hu, It, Lt, Nl, Pl, Pt, Sk, Sl, Sv and Ru) as source tasks and five diverse languages (Ro, Lv, Fi, Tr and Ko) as target tasks. We show that the proposed approach significantly outperforms the multilingual, transfer-learning based approach (Zoph et al., 2016) and enables us to train a competitive NMT system with only a fraction of training examples. For instance, the proposed approach can achieve as high as 22.04 BLEU on Romanian-English WMT'16 by seeing only 16,000 translated words (~600 parallel sentences).
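The meta-learning recipe summarized above (an inner loop that adapts to one task, and an outer loop that updates the shared initialization so adaptation works well) can be illustrated with a minimal first-order MAML sketch. This is not the paper's MetaNMT system: the toy sine-regression tasks, the two-layer network, and all step sizes below are illustrative assumptions standing in for language-pair tasks and a full NMT model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    """A 'language pair' stand-in: regress y = a * sin(x + b)."""
    a, b = rng.uniform(0.5, 2.0), rng.uniform(0, np.pi)
    def data(n):
        x = rng.uniform(-3, 3, (n, 1))
        return x, a * np.sin(x + b)
    return data

def forward(params, x):
    w1, b1, w2, b2 = params
    return np.tanh(x @ w1 + b1) @ w2 + b2

def grads(params, x, y):
    # Manual backprop through the 2-layer tanh net under mean squared error.
    w1, b1, w2, b2 = params
    h = np.tanh(x @ w1 + b1)
    pred = h @ w2 + b2
    d_pred = 2 * (pred - y) / len(x)
    d_w2, d_b2 = h.T @ d_pred, d_pred.sum(0)
    d_h = d_pred @ w2.T * (1 - h ** 2)
    d_w1, d_b1 = x.T @ d_h, d_h.sum(0)
    return [d_w1, d_b1, d_w2, d_b2]

def loss(params, x, y):
    return float(np.mean((forward(params, x) - y) ** 2))

def sgd(params, g, lr):
    return [p - lr * gi for p, gi in zip(params, g)]

# Shared initialization that meta-learning will shape.
H = 32
params = [rng.normal(0, 0.5, (1, H)), np.zeros(H),
          rng.normal(0, 0.5, (H, 1)), np.zeros(1)]

inner_lr, meta_lr = 0.05, 0.01  # assumed hyper-parameters
for step in range(2000):
    task = sample_task()                      # a "high-resource" source task
    x_s, y_s = task(10)                       # support set: inner-loop adaptation
    x_q, y_q = task(10)                       # query set: meta-objective
    fast = sgd(params, grads(params, x_s, y_s), inner_lr)
    # First-order MAML: use the query gradient at the adapted parameters
    # as the meta-gradient (second-order terms dropped).
    params = sgd(params, grads(fast, x_q, y_q), meta_lr)

# Simulated "low-resource" target task: adapt with only 10 examples.
new_task = sample_task()
x_few, y_few = new_task(10)
x_test, y_test = new_task(100)
before = loss(params, x_test, y_test)
adapted = params
for _ in range(5):
    adapted = sgd(adapted, grads(adapted, x_few, y_few), inner_lr)
after = loss(adapted, x_test, y_test)
print(before, after)
```

The point of the sketch is the structure, not the numbers: a few inner-loop gradient steps from the meta-learned initialization should lower the held-out loss on an unseen task, mirroring how the paper fine-tunes on a handful of parallel sentences for a new language pair.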

Related research:

- Universal Neural Machine Translation for Extremely Low Resource Languages (02/15/2018): In this paper, we propose a new universal machine translation approach f...
- Meta-Learning for Low-Resource Unsupervised Neural Machine Translation (10/18/2020): Unsupervised machine translation, which utilizes unpaired monolingual co...
- Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation (07/06/2019): This paper proposes a novel multilingual multistage fine-tuning approach...
- Towards Reliable Neural Machine Translation with Consistency-Aware Meta-Learning (03/20/2023): Neural machine translation (NMT) has achieved remarkable success in prod...
- Sicilian Translator: A Recipe for Low-Resource NMT (10/05/2021): With 17,000 pairs of Sicilian-English translated sentences, Arba Sicula ...
- SPEC: Summary Preference Decomposition for Low-Resource Abstractive Summarization (03/24/2023): Neural abstractive summarization has been widely studied and achieved gr...
- ConsistTL: Modeling Consistency in Transfer Learning for Low-Resource Neural Machine Translation (12/08/2022): Transfer learning is a simple and powerful method that can be used to bo...
