Examining Structure of Word Embeddings with PCA

05/31/2019
by   Tomáš Musil, et al.

In this paper we compare the structure of Czech word embeddings trained for English-Czech neural machine translation (NMT), for word2vec, and for sentiment analysis. We show that although part-of-speech (POS) tags can be successfully predicted from the embeddings of word2vec and of various translation models, not all of the embedding spaces exhibit the same structure. The information about POS is present in word2vec embeddings, but the high degree of organization by POS in the NMT decoder suggests that this information is more important for machine translation, and the NMT model therefore represents it in a more direct way. Our method is based on correlating principal component analysis (PCA) dimensions with categorical linguistic data. We also show that examining histograms of classes along the principal components is important for understanding how information is represented in the embeddings.
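The core of the method described above can be sketched in a few lines: project word embeddings with PCA, then measure how strongly each principal component correlates with a categorical label such as a binary POS indicator. The following is a minimal illustration, not the authors' code; the embeddings and POS labels are synthetic stand-ins (a POS signal is deliberately planted along one axis so the correlation is visible).

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_words, dim = 1000, 50

# Synthetic "embeddings": nouns are shifted along one latent direction,
# mimicking an embedding space that partially organizes words by POS.
is_noun = rng.random(n_words) < 0.4           # binary POS label per word
embeddings = rng.normal(size=(n_words, dim))
embeddings[:, 0] += 3.0 * is_noun             # plant a POS signal in one axis

# Project onto the first 10 principal components.
pca = PCA(n_components=10)
projected = pca.fit_transform(embeddings)     # shape (n_words, 10)

# Absolute Pearson correlation of each PCA dimension with the POS indicator.
correlations = [abs(np.corrcoef(projected[:, k], is_noun)[0, 1])
                for k in range(10)]
best = int(np.argmax(correlations))
print(f"most POS-correlated component: {best}, |r| = {correlations[best]:.2f}")
```

To follow the paper's second point, one would then plot a histogram of each class (e.g. nouns vs. non-nouns) along `projected[:, best]`: a strongly correlated component shows the classes as separated modes rather than an undifferentiated blob, which is what distinguishes a direct representation of POS from a merely recoverable one.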

Related research
05/24/2019

Debiasing Word Embeddings Improves Multimodal Machine Translation

In recent years, pretrained word embeddings have proved useful for multi...
07/18/2019

Understanding Neural Machine Translation by Simplification: The Case of Encoder-free Models

In this paper, we try to understand neural machine translation (NMT) via...
11/09/2021

A Computational Approach to Walt Whitman's Stylistic Changes in Leaves of Grass

This study analyzes Walt Whitman's stylistic changes in his phenomenal w...
08/30/2019

Single Training Dimension Selection for Word Embedding with PCA

In this paper, we present a fast and reliable method based on PCA to sel...
10/07/2019

On Leveraging the Visual Modality for Neural Machine Translation

Leveraging the visual modality effectively for Neural Machine Translatio...
08/31/2018

Beyond Weight Tying: Learning Joint Input-Output Embeddings for Neural Machine Translation

Tying the weights of the target word embeddings with the target word cla...