Multilingual Neural Machine Translation with Task-Specific Attention

06/08/2018
by Graeme Blackwood, et al.

Multilingual machine translation addresses the task of translating between multiple source and target languages. We propose task-specific attention models, a simple but effective technique for improving the quality of sequence-to-sequence neural multilingual translation. Our approach seeks to retain as much of the parameter-sharing generalization of NMT models as possible, while still allowing the attention model to specialize to a particular language pair or task. Our experiments on four languages of the Europarl corpus show that a target-specific attention model provides consistent gains in translation quality in all translation directions, compared to a model in which all parameters are shared. We observe improved translation quality even in the (extreme) low-resource zero-shot translation directions, for which the model never saw explicitly paired parallel data.
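The abstract does not spell out the architecture, but the core idea of keeping the encoder and decoder fully shared while giving each target language its own attention parameters can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: it assumes a Bahdanau-style additive attention in PyTorch, and the class names, dimensions, and language codes are all hypothetical.

```python
# Illustrative sketch only: the abstract does not specify the exact model,
# so this assumes Bahdanau-style additive attention with one set of
# attention parameters per target language, while all other parameters
# (encoder, decoder, embeddings) remain shared across languages.
import torch
import torch.nn as nn


class BahdanauAttention(nn.Module):
    """Additive attention: score(s, h) = v^T tanh(W_s s + W_h h)."""

    def __init__(self, dec_dim: int, enc_dim: int, attn_dim: int):
        super().__init__()
        self.w_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.w_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(
            self.w_dec(dec_state).unsqueeze(1) + self.w_enc(enc_outputs)
        )).squeeze(-1)                                   # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)
        # Weighted sum of encoder states gives the context vector.
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights


class TargetSpecificAttention(nn.Module):
    """One attention module per target language; all else stays shared."""

    def __init__(self, target_langs, dec_dim, enc_dim, attn_dim):
        super().__init__()
        self.heads = nn.ModuleDict({
            lang: BahdanauAttention(dec_dim, enc_dim, attn_dim)
            for lang in target_langs
        })

    def forward(self, target_lang: str, dec_state, enc_outputs):
        # Route the batch through the attention parameters of its
        # target language (the "task" in target-specific attention).
        return self.heads[target_lang](dec_state, enc_outputs)


# Hypothetical usage with four target languages and toy dimensions:
attn = TargetSpecificAttention(["de", "en", "es", "fr"], 512, 512, 256)
context, weights = attn("fr", torch.randn(8, 512), torch.randn(8, 20, 512))
```

In this sketch only the attention parameters are duplicated per target language, so the bulk of the model still benefits from multilingual parameter sharing; that balance between sharing and specialization is the trade-off the abstract describes.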

Related research

03/17/2019
The Missing Ingredient in Zero-Shot Neural Machine Translation
Multilingual Neural Machine Translation (NMT) models are capable of tran...

05/31/2021
Do Multilingual Neural Machine Translation Models Contain Language Pair Specific Attention Heads?
Recent studies on the analysis of the multilingual representations focus...

12/27/2021
Parameter Differentiation based Multilingual Neural Machine Translation
Multilingual neural machine translation (MNMT) aims to translate multipl...

04/30/2020
Automatic Machine Translation Evaluation in Many Languages via Zero-Shot Paraphrasing
We propose the use of a sequence-to-sequence paraphraser for automatic m...

06/29/2019
Empirical Evaluation of Sequence-to-Sequence Models for Word Discovery in Low-resource Settings
Since Bahdanau et al. [1] first introduced attention for neural machine ...

06/02/2023
Multilingual Conceptual Coverage in Text-to-Image Models
We propose "Conceptual Coverage Across Languages" (CoCo-CroLa), a techni...

07/11/2022
UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation
Most translation tasks among languages belong to the zero-resource trans...