Improving Zero-Shot Translation by Disentangling Positional Information

12/30/2020
by Danni Liu, et al.

Multilingual neural machine translation has shown the capability of directly translating between language pairs unseen in training, i.e. zero-shot translation. Despite being conceptually attractive, it often suffers from low output quality. The difficulty of generalizing to new translation directions suggests that the model representations are highly specific to the language pairs seen in training. We demonstrate that a major factor behind these language-specific representations is the positional correspondence to input tokens. We show that this can be easily alleviated by removing the residual connections in an encoder layer. With this modification, we gain up to 18.5 BLEU points on zero-shot translation while retaining quality on supervised directions. The improvements are particularly prominent between related languages, where our proposed model outperforms pivot-based translation. Moreover, our approach allows easy integration of new languages, which substantially expands translation coverage. Through thorough inspection of the hidden layer outputs, we show that our approach indeed leads to more language-independent representations.
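The core modification described in the abstract, removing the residual connections in one encoder layer, can be illustrated with a short sketch. The code below is a hypothetical PyTorch rendering, not the authors' implementation: it drops the residual add around the self-attention sublayer (keeping the feed-forward residual), so token representations must flow through attention and cannot carry their position forward directly. The class name and exact hyperparameters are assumptions for illustration.

```python
import torch
import torch.nn as nn


class EncoderLayerNoResidual(nn.Module):
    """Transformer encoder layer whose self-attention sublayer has NO
    residual connection (illustrative sketch of the paper's idea)."""

    def __init__(self, d_model=512, nhead=8, dim_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(
            d_model, nhead, dropout=dropout, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, dim_ff),
            nn.ReLU(),
            nn.Linear(dim_ff, d_model),
        )
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # A standard layer would compute norm(x + attn(x)); here the
        # "x +" residual is dropped for the attention sublayer only.
        attn_out, _ = self.self_attn(
            x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm1(self.dropout(attn_out))        # no residual add
        x = self.norm2(x + self.dropout(self.ff(x)))  # FFN residual kept
        return x


layer = EncoderLayerNoResidual()
out = layer(torch.randn(2, 7, 512))   # (batch, seq_len, d_model)
print(out.shape)                      # torch.Size([2, 7, 512])
```

In a full model, a layer like this would replace one of the standard encoder layers; the other layers stay unchanged, so supervised translation quality is largely preserved while positional information stops propagating identically across languages.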


Related research

03/17/2019 · The Missing Ingredient in Zero-Shot Neural Machine Translation
Multilingual Neural Machine Translation (NMT) models are capable of tran...

06/20/2019 · Improving Zero-shot Translation with Language-Independent Constraints
An important concern in training multilingual neural machine translation...

05/16/2023 · Exploring the Impact of Layer Normalization for Zero-shot Neural Machine Translation
This paper studies the impact of layer normalization (LayerNorm) on zero...

11/04/2018 · Improving Zero-Shot Translation of Low-Resource Languages
Recent work on multilingual neural machine translation reported competit...

09/10/2021 · Rethinking Zero-shot Neural Machine Translation: From a Perspective of Latent Variables
Zero-shot translation, directly translating between language pairs unsee...

05/17/2023 · Searching for Needles in a Haystack: On the Role of Incidental Bilingualism in PaLM's Translation Capability
Large, multilingual language models exhibit surprisingly good zero- or f...

10/28/2022 · Improving Zero-Shot Multilingual Translation with Universal Representations and Cross-Mappings
The many-to-many multilingual neural machine translation can translate b...
