Bridging Source and Target Word Embeddings for Neural Machine Translation

11/15/2017
by Shaohui Kuang, et al.

Neural machine translation (NMT) systems encode a source sequence into a vector from which a decoder generates the target sequence. Unlike traditional statistical machine translation, where translation rules map source words directly to target words, NMT places source and target words at the two ends of a long information channel in the encoder-decoder network, separated by source and target hidden states. This distance can produce translations with implausible word alignments. In this paper, we bridge source and target word embeddings so as to shorten that distance. We propose three bridging strategies: 1) a source state bridging model that moves source word embeddings one step closer to their target counterparts, 2) a target state bridging model that exploits relevant source word embeddings when predicting the target state, and 3) a direct link bridging model that directly connects source and target word embeddings so as to minimize their discrepancy. Experiments and analysis demonstrate that the proposed bridging models significantly improve the quality of both translations and word alignments.
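To make the second strategy concrete, here is a minimal sketch of what a target state bridging step might look like: the decoder's attention weights over the encoder states are reused to form a weighted sum of the raw source word embeddings, and that "bridged" embedding is fed into the recurrent cell that predicts the next target state. All class and variable names, the use of a GRU cell, additive attention, and the layer sizes are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class BridgedDecoderStep(nn.Module):
    """Hypothetical sketch of target state bridging: reuse attention
    weights to pool source *word embeddings* (not only encoder hidden
    states) into the decoder's state update."""

    def __init__(self, emb_dim, hid_dim):
        super().__init__()
        # Additive attention scorer over [decoder state; encoder state]
        self.attn_score = nn.Linear(hid_dim * 2, 1)
        # GRU input = previous target embedding + context + bridged source embedding
        self.gru = nn.GRUCell(emb_dim * 2 + hid_dim, hid_dim)

    def forward(self, prev_emb, prev_state, enc_states, src_embs):
        # prev_emb:   (batch, emb_dim)          previous target word embedding
        # prev_state: (batch, hid_dim)          previous decoder hidden state
        # enc_states: (batch, src_len, hid_dim) encoder hidden states
        # src_embs:   (batch, src_len, emb_dim) raw source word embeddings
        src_len = enc_states.size(1)
        query = prev_state.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.attn_score(torch.tanh(torch.cat([query, enc_states], dim=-1)))
        alpha = torch.softmax(scores, dim=1)          # (batch, src_len, 1)
        context = (alpha * enc_states).sum(dim=1)     # standard attention context
        bridged = (alpha * src_embs).sum(dim=1)       # bridged source embedding
        state = self.gru(torch.cat([prev_emb, context, bridged], dim=-1), prev_state)
        return state, alpha.squeeze(-1)
```

Under this reading, the same attention distribution serves double duty: it selects encoder states for the context vector and simultaneously pulls the corresponding source word embeddings directly into target state prediction, shortening the path between source and target words.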


