ConsistTL: Modeling Consistency in Transfer Learning for Low-Resource Neural Machine Translation

12/08/2022
by Zhaocong Li, et al.

Transfer learning is a simple and powerful method for boosting the performance of low-resource neural machine translation (NMT). Existing transfer learning methods for NMT are static: they transfer knowledge from a parent model to a child model only once, via parameter initialization. In this paper, we propose a novel transfer learning method for NMT, namely ConsistTL, which continuously transfers knowledge from the parent model throughout the training of the child model. Specifically, for each training instance of the child model, ConsistTL constructs a semantically equivalent instance for the parent model and encourages prediction consistency between the parent and child on this instance, which amounts to the child model learning each instance under the guidance of the parent model. Experimental results on five low-resource NMT tasks demonstrate that ConsistTL yields significant improvements over strong transfer learning baselines, with a gain of up to 1.7 BLEU over an existing back-translation model on the widely used WMT17 Turkish-English benchmark. Further analysis reveals that ConsistTL can improve the inference calibration of the child model. Code and scripts are freely available at https://github.com/NLP2CT/ConsistTL.
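The consistency objective described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the consistency term is a KL divergence that pulls the child model's next-token distribution toward the frozen parent model's distribution, interpolated with the usual cross-entropy on the gold token via a hypothetical mixing weight `alpha`.

```python
import math

def kl_div(p, q):
    """KL(p || q) between two discrete next-token distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def consist_loss(child_probs, parent_probs, gold_index, alpha=0.5):
    """Interpolate cross-entropy on the gold token with a consistency
    term encouraging the child to match the (frozen) parent prediction.
    `alpha` is an assumed hyperparameter, not taken from the paper."""
    ce = -math.log(child_probs[gold_index])       # standard NMT loss
    kl = kl_div(parent_probs, child_probs)        # parent-child consistency
    return (1 - alpha) * ce + alpha * kl

# Toy next-token distributions over a 3-word vocabulary:
parent = [0.7, 0.2, 0.1]   # parent model's prediction on the
                           # semantically equivalent parent-side instance
child = [0.5, 0.3, 0.2]    # child model's prediction
loss = consist_loss(child, parent, gold_index=0)
```

In training, the parent's distribution would be computed without gradient tracking, so only the child model is updated toward agreement with its parent.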


Related research

04/08/2016  Transfer Learning for Low-Resource Neural Machine Translation
The encoder-decoder framework for neural machine translation (NMT) has b...

07/06/2019  Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation
This paper proposes a novel multilingual multistage fine-tuning approach...

03/03/2021  Meta-Curriculum Learning for Domain Adaptation in Neural Machine Translation
Meta-learning has been sufficiently validated to be beneficial for low-r...

08/08/2021  Machine Translation of Low-Resource Indo-European Languages
Transfer learning has been an important technique for low-resource neura...

08/25/2018  Meta-Learning for Low-Resource Neural Machine Translation
In this paper, we propose to extend the recently introduced model-agnost...

05/09/2022  Sub-Word Alignment Is Still Useful: A Vest-Pocket Method for Enhancing Low-Resource Machine Translation
We leverage embedding duplication between aligned sub-words to extend th...

01/06/2020  Exploring Benefits of Transfer Learning in Neural Machine Translation
Neural machine translation is known to require large numbers of parallel...
