Shared-Private Bilingual Word Embeddings for Neural Machine Translation

06/07/2019
by Xuebo Liu, et al.

Word embedding is central to neural machine translation (NMT) and has attracted intensive research interest in recent years. In NMT, the source embedding plays the role of the entrance while the target embedding acts as the terminal. These layers occupy most of the model parameters used for representation learning. Moreover, the two embeddings interact only indirectly, through the soft-attention mechanism, which leaves them comparatively isolated. In this paper, we propose shared-private bilingual word embeddings, which establish a closer relationship between the source and target embeddings while also reducing the number of model parameters. For similar source and target words, the embeddings tend to share a subset of their features, and the two words cooperatively learn these common representation units. Experiments on 5 language pairs belonging to 6 different language families and written in 5 different alphabets demonstrate that the proposed model yields a significant performance boost over strong baselines with dramatically fewer model parameters.
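
To make the sharing idea concrete, the sketch below shows one way it could be wired up in PyTorch: source and target words judged similar index the same rows of a shared feature table, while the rest of each embedding stays language-private. This is only an illustration under assumed inputs; the pairing dictionary `pairs`, the `shared_dim` split, and all names are hypothetical and are not taken from the paper or its released code.

```python
# Minimal sketch (not the authors' implementation) of shared-private
# bilingual embeddings: paired source/target words share the first
# `shared_dim` features; the remaining features are language-private.
import torch
import torch.nn as nn


class SharedPrivateEmbedding(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim, shared_dim, pairs):
        """
        src_vocab, tgt_vocab: vocabulary sizes
        emb_dim:              full embedding size
        shared_dim:           number of features shared by paired words
        pairs:                dict {src_id: tgt_id} of similar word pairs
                              (e.g. from a bilingual lexicon); assumed input
        """
        super().__init__()
        # Private features: one table per language for the non-shared part.
        self.src_private = nn.Embedding(src_vocab, emb_dim - shared_dim)
        self.tgt_private = nn.Embedding(tgt_vocab, emb_dim - shared_dim)
        # Shared features: a joint table; paired words map to the same row.
        # (Rows reserved for paired target words go unused here, so the
        # parameter saving in this sketch is only illustrative.)
        self.shared = nn.Embedding(src_vocab + tgt_vocab, shared_dim)

        # Map every source/target id to a row of the shared table.
        src_map = torch.arange(src_vocab)
        tgt_map = torch.arange(tgt_vocab) + src_vocab
        for s, t in pairs.items():
            tgt_map[t] = src_map[s]  # paired words share one row
        self.register_buffer("src_map", src_map)
        self.register_buffer("tgt_map", tgt_map)

    def forward(self, ids, side):
        # Concatenate the shared slice and the private slice.
        if side == "src":
            shared = self.shared(self.src_map[ids])
            private = self.src_private(ids)
        else:
            shared = self.shared(self.tgt_map[ids])
            private = self.tgt_private(ids)
        return torch.cat([shared, private], dim=-1)


# Usage: assume source word 7 and target word 3 form a similar pair.
emb = SharedPrivateEmbedding(src_vocab=100, tgt_vocab=100, emb_dim=512,
                             shared_dim=128, pairs={7: 3})
src_vec = emb(torch.tensor([7]), side="src")   # shape (1, 512)
tgt_vec = emb(torch.tensor([3]), side="tgt")   # first 128 dims identical
```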

Related research

11/15/2017
Bridging Source and Target Word Embeddings for Neural Machine Translation
Neural machine translation systems encode a source sequence into a vecto...

07/18/2019
Understanding Neural Machine Translation by Simplification: The Case of Encoder-free Models
In this paper, we try to understand neural machine translation (NMT) via...

11/01/2018
Addressing word-order Divergence in Multilingual Neural Machine Translation for extremely Low Resource Languages
Transfer learning approaches for Neural Machine Translation (NMT) train ...

06/05/2018
How Do Source-side Monolingual Word Embeddings Impact Neural Machine Translation?
Using pre-trained word embeddings as input layer is a common practice in...

06/05/2020
Filtered Inner Product Projection for Multilingual Embedding Alignment
Due to widespread interest in machine translation and transfer learning,...

09/02/2017
Challenging Language-Dependent Segmentation for Arabic: An Application to Machine Translation and Part-of-Speech Tagging
Word segmentation plays a pivotal role in improving any Arabic NLP appli...

10/27/2019
Multitask Learning For Different Subword Segmentations In Neural Machine Translation
In Neural Machine Translation (NMT) the usage of subwords and characters...
