A Teacher-Student Framework for Zero-Resource Neural Machine Translation

05/02/2017 · by Yun Chen, et al.

While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model ("student") without parallel corpora available, guided by an existing pivot-to-target NMT model ("teacher") on a source-pivot parallel corpus. Experimental results show that the proposed method significantly improves over a baseline pivot-based model by +3.0 BLEU points across various language pairs.
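To make the teacher-student idea concrete, below is a minimal word-level knowledge-distillation sketch in PyTorch: a frozen pivot-to-target "teacher" scores target tokens given the pivot sentence, and the source-to-target "student" is trained to match those next-token distributions given the corresponding source sentence from the source-pivot corpus. The model sizes, tensor shapes, and helper names are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of a word-level teacher-student objective for zero-resource NMT.
# Assumes a pre-trained pivot-to-target teacher and a source-pivot corpus;
# all architectures and dimensions here are toy placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinySeq2Seq(nn.Module):
    """Toy encoder-decoder: encodes an input sentence, then predicts a
    distribution over target-vocabulary tokens at each target position."""

    def __init__(self, src_vocab, tgt_vocab, hidden=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_in_ids):
        _, h = self.encoder(self.src_emb(src_ids))      # encode input sentence
        dec_out, _ = self.decoder(self.tgt_emb(tgt_in_ids), h)
        return self.out(dec_out)                        # (batch, tgt_len, tgt_vocab)


def word_level_kd_loss(student_logits, teacher_logits):
    """KL(teacher || student) over target positions: the student is pushed to
    match the teacher's next-token distributions."""
    teacher_probs = F.softmax(teacher_logits, dim=-1).detach()  # teacher is frozen
    student_logp = F.log_softmax(student_logits, dim=-1)
    return F.kl_div(student_logp, teacher_probs, reduction="batchmean")


# --- Illustrative usage on random token ids standing in for a source-pivot batch ---
SRC_V, PIV_V, TGT_V = 100, 100, 100
teacher = TinySeq2Seq(PIV_V, TGT_V)   # pivot-to-target model, assumed pre-trained
student = TinySeq2Seq(SRC_V, TGT_V)   # source-to-target model to be trained

src = torch.randint(0, SRC_V, (8, 10))     # source sentences (batch, src_len)
piv = torch.randint(0, PIV_V, (8, 12))     # their pivot translations (batch, piv_len)
tgt_in = torch.randint(0, TGT_V, (8, 11))  # decoder inputs, e.g. teacher-decoded prefix

with torch.no_grad():
    t_logits = teacher(piv, tgt_in)        # teacher scores targets given the pivot
s_logits = student(src, tgt_in)            # student scores the same targets given the source

loss = word_level_kd_loss(s_logits, t_logits)
loss.backward()                            # gradients flow only into the student
print(float(loss))
```

The key point the sketch captures is that no source-target parallel data is touched: supervision for the student comes entirely from the teacher's predictive distributions on the source-pivot corpus, following the assumption that a sentence and its pivot translation assign close probabilities to the same target sentence.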


