Zero-Shot Language Transfer vs Iterative Back Translation for Unsupervised Machine Translation

03/31/2021
by Aviral Joshi, et al.

This work compares two approaches to machine translation for low-resource language pairs: zero-shot transfer learning and unsupervised machine translation via iterative back-translation. We discuss how data size affects the performance of both unsupervised MT and transfer learning, and we additionally examine how the domain of the data affects unsupervised MT results. The code for all experiments performed in this project is accessible on GitHub.
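To make the iterative back-translation procedure named in the title concrete: each translation direction is repeatedly retrained on synthetic parallel data produced by the opposite direction. Below is a minimal Python sketch of that loop; the model interface (translate, train_on_pairs) and the function name are hypothetical placeholders for illustration, not the code from the project's GitHub repository.

# Minimal sketch of iterative back-translation for unsupervised MT.
# The model interface (translate, train_on_pairs) is a hypothetical
# placeholder, not the authors' actual implementation.

def iterative_back_translation(model_src2tgt, model_tgt2src,
                               mono_src, mono_tgt, rounds=3):
    """Alternately generate synthetic parallel data and retrain each direction."""
    for _ in range(rounds):
        # Back-translate target monolingual data to get
        # (synthetic source, real target) training pairs.
        synthetic_src = [model_tgt2src.translate(t) for t in mono_tgt]
        model_src2tgt.train_on_pairs(list(zip(synthetic_src, mono_tgt)))

        # Symmetrically, back-translate source monolingual data to get
        # (synthetic target, real source) training pairs.
        synthetic_tgt = [model_src2tgt.translate(s) for s in mono_src]
        model_tgt2src.train_on_pairs(list(zip(synthetic_tgt, mono_src)))

    return model_src2tgt, model_tgt2src

Each round should improve both directions, since better back-translations yield cleaner synthetic training pairs for the next round.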

Related research:

- 06/30/2021: What Can Unsupervised Machine Translation Contribute to High-Resource Language Pairs?
  Whereas existing literature on unsupervised machine translation (MT) foc...

- 02/22/2022: RuCLIP – new models and experiments: a technical report
  In the report we propose six new implementations of ruCLIP model trained...

- 11/14/2022: Findings of the Covid-19 MLIA Machine Translation Task
  This work presents the results of the machine translation (MT) task from...

- 07/31/2022: Mismatching-Aware Unsupervised Translation Quality Estimation For Low-Resource Languages
  Translation Quality Estimation (QE) is the task of predicting the qualit...

- 04/18/2021: On the Strengths of Cross-Attention in Pretrained Transformers for Machine Translation
  We study the power of cross-attention in the Transformer architecture wi...

- 06/01/2023: Improving Polish to English Neural Machine Translation with Transfer Learning: Effects of Data Volume and Language Similarity
  This paper investigates the impact of data volume and the use of similar...

- 09/28/2022: From Zero to Production: Baltic-Ukrainian Machine Translation Systems to Aid Refugees
  In this paper, we examine the development and usage of six low-resource ...
