UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation

07/11/2022
by Jian Yang, et al.

Most translation directions between languages are zero-resource: no parallel corpora are available. Multilingual neural machine translation (MNMT) enables one-pass translation through a shared semantic space for all languages, but it often underperforms two-pass pivot translation. In this paper, we propose a novel method, the Unified Multilingual Multiple teacher-student Model for NMT (UM4). Our method unifies source-teacher, target-teacher, and pivot-teacher models to guide the student model for zero-resource translation. The source teacher and target teacher force the student to learn direct source-to-target translation through knowledge distilled on both the source and target sides. A monolingual corpus is further leveraged via the pivot-teacher model to enhance the student. Experimental results on 72 translation directions of the WMT benchmark demonstrate that our model significantly outperforms previous methods.
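The abstract describes a student model trained against several teacher distributions at once. A minimal sketch of the underlying idea, multi-teacher knowledge distillation, is shown below; the function name, the per-teacher weights, and the KL(teacher || student) loss direction are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the vocabulary axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_teacher_distill_loss(student_logits, teacher_logits_list, weights):
    """Weighted sum of KL(teacher || student) over multiple teachers.

    student_logits:      array of shape (batch, vocab)
    teacher_logits_list: list of arrays, each (batch, vocab), e.g. from
                         source-, target-, and pivot-teacher models
    weights:             per-teacher mixing weights (assumed, not from paper)
    """
    eps = 1e-9  # avoid log(0)
    p_s = softmax(student_logits)
    loss = 0.0
    for w, t_logits in zip(weights, teacher_logits_list):
        p_t = softmax(t_logits)
        kl = np.sum(p_t * (np.log(p_t + eps) - np.log(p_s + eps)), axis=-1)
        loss += w * kl.mean()
    return loss

# Usage: three teachers over a toy 4-word vocabulary.
student = np.array([[2.0, 0.5, -1.0, 0.0]])
teachers = [
    np.array([[2.1, 0.4, -1.0, 0.1]]),  # stand-in for source teacher
    np.array([[1.8, 0.6, -0.9, 0.0]]),  # stand-in for target teacher
    np.array([[2.0, 0.5, -1.1, 0.2]]),  # stand-in for pivot teacher
]
loss = multi_teacher_distill_loss(student, teachers, [1/3, 1/3, 1/3])
# Loss is zero only when the student matches every teacher exactly.
```

In practice each teacher would be a full NMT model and the KL terms would be combined with the regular cross-entropy objective; this sketch only shows how several teacher signals can be merged into one training loss.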


Related research

05/02/2017  A Teacher-Student Framework for Zero-Resource Neural Machine Translation
12/31/2020  Exploring Monolingual Data for Neural Machine Translation with Knowledge Distillation
04/08/2020  Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
06/08/2018  Multilingual Neural Machine Translation with Task-Specific Attention
09/28/2022  Multilingual Transitivity and Bidirectional Multilingual Agreement for Multilingual Document-level Machine Translation
06/13/2016  Zero-Resource Translation with Multi-Lingual Neural Machine Translation
01/07/2022  Compressing Models with Few Samples: Mimicking then Replacing
