Variational Recurrent Neural Machine Translation

01/16/2018
by   Jinsong Su, et al.

Partially inspired by successful applications of variational recurrent neural networks, we propose a novel variational recurrent neural machine translation (VRNMT) model in this paper. Unlike variational NMT, VRNMT introduces a series of latent random variables, rather than a single latent variable, to model the translation procedure of a sentence in a generative way. Specifically, the latent random variables are incorporated into the hidden states of the NMT decoder using elements from the variational autoencoder. In this way, the variables are generated recurrently, which enables them to capture strong and complex dependencies among the output translations at different timesteps. To address the challenges of efficient posterior inference and large-scale training when incorporating latent variables, we build a neural posterior approximator and equip it with a reparameterization technique to estimate the variational lower bound. Experiments on Chinese-English and English-German translation tasks demonstrate that the proposed model achieves significant improvements over both the conventional and variational NMT models.
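The reparameterization technique the abstract refers to is commonly realized by sampling a diagonal-Gaussian latent variable as z = mu + sigma * eps, so that the stochasticity is isolated in eps and gradients can flow through the posterior parameters; the KL term of the variational lower bound then has a closed form against a standard-normal prior. The sketch below is a minimal NumPy illustration of that generic mechanism, not the paper's actual model; all function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    """Draw z ~ N(mu, diag(exp(log_var))) as z = mu + sigma * eps.
    The noise eps carries all randomness, so mu and log_var stay
    differentiable (the reparameterization trick)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_std_normal(mu, log_var):
    """Closed-form KL( N(mu, diag(exp(log_var))) || N(0, I) ),
    the regularization term of the variational lower bound."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# Toy posterior parameters for a 4-dimensional latent variable
# (in VRNMT these would be produced per timestep by the posterior
# approximator network).
mu = np.array([0.1, -0.2, 0.0, 0.3])
log_var = np.array([-1.0, -0.5, 0.0, -2.0])

z = reparameterize(mu, log_var, rng)   # one Monte Carlo sample of z
kl = kl_to_std_normal(mu, log_var)     # analytic KL penalty, >= 0
```

During training, the ELBO would combine the reconstruction log-likelihood of the decoder (evaluated with the sampled z) minus this KL term, averaged over the minibatch.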


Related research

- 05/25/2016 · Variational Neural Machine Translation
  Models of neural machine translation are often from a discriminative fam...

- 05/28/2020 · Variational Neural Machine Translation with Normalizing Flows
  Variational Neural Machine Translation (VNMT) is an attractive framework...

- 09/19/2019 · Improved Variational Neural Machine Translation by Promoting Mutual Information
  Posterior collapse plagues VAEs for text, especially for conditional tex...

- 10/16/2020 · Generating Diverse Translation from Model Distribution with Dropout
  Despite the improvement of translation quality, neural machine translati...

- 08/10/2021 · Regularized Sequential Latent Variable Models with Adversarial Neural Networks
  The recurrent neural networks (RNN) with richly distributed internal sta...

- 05/28/2018 · A Stochastic Decoder for Neural Machine Translation
  The process of translation is ambiguous, in that there are typically man...

- 08/30/2019 · Latent Part-of-Speech Sequences for Neural Machine Translation
  Learning target side syntactic structure has been shown to improve Neura...
