Zero-Shot Translation using Diffusion Models

11/02/2021
by Eliya Nachmani, et al.

In this work, we present a novel method for neural machine translation (NMT) using a denoising diffusion probabilistic model (DDPM), adapted for textual data, following recent advances in the field. We show that it is possible to translate sentences non-autoregressively using a diffusion model conditioned on the source sentence. We also show that our model is able to translate between pairs of languages unseen during training (zero-shot learning).
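As a rough illustration of the idea (not the authors' actual architecture), the reverse process can be sketched as a standard DDPM sampling loop in which the denoiser is conditioned on an encoding of the source sentence, and all target positions are refined in parallel rather than left to right. The toy denoiser, noise schedule, and dimensions below are assumptions chosen only to make the sketch runnable; in the real model the denoiser is a learned network.

```python
import numpy as np

# Toy sketch of conditional denoising diffusion over sequence embeddings.
# Everything here (schedule, denoiser, shapes) is illustrative, not the
# paper's actual model.

T = 50                                  # number of diffusion steps
betas = np.linspace(1e-4, 0.05, T)      # linear noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def q_sample(x0, t, rng):
    """Forward process: noise clean target embeddings x0 up to step t."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return xt, eps

def toy_denoiser(xt, t, src_encoding):
    """Stand-in for a learned noise predictor conditioned on the source.
    Here it just mixes the noisy target with the source encoding."""
    return 0.5 * xt - 0.5 * src_encoding

def p_sample_loop(src_encoding, shape, rng):
    """Reverse process: start from Gaussian noise and iteratively denoise,
    conditioning every step on the source-sentence encoding."""
    x = rng.standard_normal(shape)
    for t in reversed(range(T)):
        eps_hat = toy_denoiser(x, t, src_encoding)
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps_hat) / np.sqrt(alphas[t])
        noise = rng.standard_normal(shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    # every position is produced in parallel: non-autoregressive decoding
    return x

rng = np.random.default_rng(0)
src = rng.standard_normal((8, 16))      # source encoding: 8 tokens, dim 16
xt, eps = q_sample(src, T - 1, rng)     # forward noising example
out = p_sample_loop(src, src.shape, rng)
print(out.shape)                        # (8, 16)
```

For zero-shot translation, only the source encoding changes between language pairs; the sampling loop itself is language-agnostic.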


Related research

- Improving Zero-shot Translation with Language-Independent Constraints (06/20/2019)
  An important concern in training multilingual neural machine translation...

- Improving Zero-shot Multilingual Neural Machine Translation by Leveraging Cross-lingual Consistency Regularization (05/12/2023)
  The multilingual neural machine translation (NMT) model has a promising ...

- Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations (06/04/2019)
  Zero-shot translation, translating between language pairs on which a Neu...

- Towards User-Driven Neural Machine Translation (06/11/2021)
  A good translation should not only translate the original content semant...

- On (Emergent) Systematic Generalisation and Compositionality in Visual Referential Games with Straight-Through Gumbel-Softmax Estimator (12/19/2020)
  The drivers of compositionality in artificial languages that emerge when...

- Zero-shot-Learning Cross-Modality Data Translation Through Mutual Information Guided Stochastic Diffusion (01/31/2023)
  Cross-modality data translation has attracted great interest in image co...

- Continuous Learning in Neural Machine Translation using Bilingual Dictionaries (02/12/2021)
  While recent advances in deep learning led to significant improvements i...
