When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?

04/17/2018
by Ye Qi, et al.

The performance of Neural Machine Translation (NMT) systems often suffers in low-resource scenarios where sufficiently large-scale parallel corpora cannot be obtained. Pre-trained word embeddings have proven to be invaluable for improving performance in natural language analysis tasks, which often suffer from a paucity of data. However, their utility for NMT has not been extensively explored. In this work, we perform five sets of experiments that analyze when we can expect pre-trained word embeddings to help in NMT tasks. We show that such embeddings can be surprisingly effective in some cases, providing gains of up to 20 BLEU points in the most favorable setting.
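The standard way to apply pre-trained embeddings in NMT is to initialize the model's source (and/or target) embedding matrix with the pre-trained vectors before training. The sketch below illustrates that initialization step with NumPy; the function name, vocabulary, and toy vectors are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def build_embedding_matrix(vocab, pretrained, dim, seed=0):
    """Initialize an embedding matrix from pre-trained word vectors.

    Words present in `pretrained` copy their vector; out-of-vocabulary
    words receive a small random initialization, a common practice.
    Returns the matrix and the number of words covered by pre-training.
    """
    rng = np.random.default_rng(seed)
    matrix = rng.normal(scale=0.1, size=(len(vocab), dim))
    covered = 0
    for idx, word in enumerate(vocab):
        if word in pretrained:
            matrix[idx] = pretrained[word]
            covered += 1
    return matrix, covered

# Toy pre-trained vectors (hypothetical, for illustration only).
pretrained = {"cat": np.ones(4), "dog": np.full(4, 2.0)}
vocab = ["<unk>", "cat", "dog", "house"]
emb, covered = build_embedding_matrix(vocab, pretrained, dim=4)
```

In a real system this matrix would become the initial weights of the NMT encoder's embedding layer, which may then be fine-tuned or kept frozen during training.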

