(Self-Attentive) Autoencoder-based Universal Language Representation for Machine Translation

10/15/2018
by Carlos Escolano, et al.

A universal language representation is the holy grail of machine translation (MT), and the neural MT paradigm offers promising prospects toward this goal. In this paper, we propose a new architecture that combines variational autoencoders with encoder-decoders and introduces an interlingual loss as an additional training objective. By adding and enforcing this interlingual loss, we are able to train multiple encoders and decoders, one pair per language, that share a common universal representation. Since the final objective of this universal representation is to produce close results for similar input sentences (in any language), we propose to evaluate it by encoding the same sentence in two different languages, decoding both latent representations into the same language, and comparing the two outputs. Preliminary results on the WMT 2017 Turkish/English task show that the proposed architecture is capable of learning a universal language representation while simultaneously training both translation directions with state-of-the-art results.
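To make the training objective concrete, below is a minimal PyTorch sketch of the interlingual-loss idea: per-language encoders map into a shared latent space, per-language decoders translate out of it, and a distance penalty between the two latent codes of a parallel sentence pair is added to the translation losses. The mean-pooling encoder, single-token decoder, module names, and the weight `lam` are illustrative assumptions made for brevity, not the paper's architecture.

```python
# Minimal sketch (not the authors' code) of the interlingual loss idea:
# per-language encoders map into a shared latent space, and an L2 penalty
# pulls the two latent codes of a parallel sentence pair together.
import torch
import torch.nn as nn

class LangEncoder(nn.Module):
    """Toy per-language encoder: embed tokens, mean-pool, project to the latent."""
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, tokens):  # tokens: (batch, seq_len) int64
        return self.proj(self.embed(tokens).mean(dim=1))  # (batch, dim)

class LangDecoder(nn.Module):
    """Toy per-language decoder: single-token prediction from the latent
    (stands in for a full sequence decoder to keep the sketch short)."""
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, z):
        return self.out(z)  # (batch, vocab_size) logits

dim, vocab_en, vocab_tr = 64, 1000, 1200
enc_en, dec_en = LangEncoder(vocab_en, dim), LangDecoder(vocab_en, dim)
enc_tr, dec_tr = LangEncoder(vocab_tr, dim), LangDecoder(vocab_tr, dim)
ce = nn.CrossEntropyLoss()

def training_loss(src_en, src_tr, tgt_en, tgt_tr, lam=1.0):
    """Translation losses in both directions plus the interlingual term."""
    z_en, z_tr = enc_en(src_en), enc_tr(src_tr)
    # Both translation directions are trained simultaneously.
    trans = ce(dec_tr(z_en), tgt_tr) + ce(dec_en(z_tr), tgt_en)
    # Interlingual loss: distance between the two sentence representations.
    inter = torch.dist(z_en, z_tr, p=2)
    return trans + lam * inter

# Toy batch of parallel data (random ids, single-token targets).
src_en = torch.randint(0, vocab_en, (8, 12))
src_tr = torch.randint(0, vocab_tr, (8, 14))
tgt_en = torch.randint(0, vocab_en, (8,))
tgt_tr = torch.randint(0, vocab_tr, (8,))
training_loss(src_en, src_tr, tgt_en, tgt_tr).backward()

# Evaluation idea from the abstract: encode the same sentence from both
# languages, decode both latents into English, and compare the outputs.
with torch.no_grad():
    from_en = dec_en(enc_en(src_en)).argmax(-1)
    from_tr = dec_en(enc_tr(src_tr)).argmax(-1)
    agreement = (from_en == from_tr).float().mean()
```

The paper's actual encoders and decoders are (self-attentive) variational autoencoder-based sequence models rather than the pooled-embedding stand-ins above; the sketch only isolates the structure of the combined objective.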


Related research

05/15/2019
Towards Interlingua Neural Machine Translation
A common intermediate language representation or an interlingua is the h...

05/29/2020
Training Multilingual Machine Translation by Alternately Freezing Language-Specific Encoders-Decoders
We propose a modular architecture of language-specific encoder-decoders ...

02/15/2018
Universal Neural Machine Translation for Extremely Low Resource Languages
In this paper, we propose a new universal machine translation approach f...

10/07/2020
Pre-training Multilingual Neural Machine Translation by Leveraging Alignment Information
We investigate the following question for machine translation (MT): can ...

04/26/2022
Flow-Adapter Architecture for Unsupervised Machine Translation
In this work, we propose a flow-adapter architecture for unsupervised NM...

08/11/2020
On Learning Language-Invariant Representations for Universal Machine Translation
The goal of universal machine translation is to learn to translate betwe...

06/15/2016
A Correlational Encoder Decoder Architecture for Pivot Based Sequence Generation
Interlingua based Machine Translation (MT) aims to encode multiple langu...
