Machine Translation of Low-Resource Indo-European Languages

08/08/2021
by Wei-Rui Chen, et al.

Transfer learning has been an important technique for low-resource neural machine translation. In this work, we build two systems to study how language relatedness affects translation performance. The primary system adopts a machine translation model pre-trained on a related language pair, while the contrastive system adopts one pre-trained on an unrelated language pair. We show that relatedness is not required for transfer learning to work, but it does benefit performance.
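The parent→child transfer recipe described above can be sketched in a toy form: a "parent" model trained on a high-resource pair warm-starts a "child" model for the low-resource pair, reusing parameters for shared vocabulary and randomly initializing the rest. This is a minimal illustrative sketch, not the authors' implementation; the function names, toy embedding tables, and vocabularies are hypothetical.

```python
import random

def init_params(vocab, dim=4, seed=0):
    """Randomly initialize a toy embedding table: token -> vector."""
    rng = random.Random(seed)
    return {tok: [rng.uniform(-1, 1) for _ in range(dim)] for tok in vocab}

def transfer(parent_params, child_vocab, dim=4, seed=1):
    """Warm-start a child model from a parent model (hypothetical helper):
    copy the parent's embeddings for tokens the two vocabularies share,
    and randomly initialize embeddings for child-only tokens. The child
    would then be fine-tuned on the low-resource parallel data."""
    rng = random.Random(seed)
    child = {}
    for tok in child_vocab:
        if tok in parent_params:
            # Shared token: reuse the parent's learned weights.
            child[tok] = list(parent_params[tok])
        else:
            # Child-only token: fresh random initialization.
            child[tok] = [rng.uniform(-1, 1) for _ in range(dim)]
    return child

# Toy usage: pretend "the" is shared between the two language pairs.
parent = init_params(["the", "cat", "sat"])
child = transfer(parent, ["the", "dog"])
```

Relatedness matters here because a related language pair shares more vocabulary (and, in a real model, more useful encoder/decoder structure) with the child pair, so more of the parent's parameters transfer meaningfully; the abstract's finding is that even an unrelated parent still helps.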


