Analyzing Structures in the Semantic Vector Space: A Framework for Decomposing Word Embeddings

12/17/2019
by Andreas Hanselowski et al.

Word embeddings are rich word representations that, in combination with deep neural networks, lead to large performance gains on many NLP tasks. However, word embeddings are dense, real-valued vectors and are therefore not directly interpretable; as a consequence, computational operations based on them are also not well understood. In this paper, we present an approach for analyzing structures in the semantic vector space in order to gain a better understanding of the underlying semantic encoding principles. We present a framework for decomposing word embeddings into smaller meaningful units, which we call sub-vectors. The framework opens up a wide range of possibilities for analyzing phenomena in vector space semantics, as well as for solving concrete NLP problems: we introduce the category completion task and show that a sub-vector-based approach is superior to supervised techniques, and we present a sub-vector-based method for the word analogy task that substantially outperforms different variants of the traditional vector-offset method.
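The abstract does not spell out the sub-vector decomposition itself, so that method is not reconstructable here. What can be sketched are the two baselines it names: the traditional vector-offset method for word analogies, and a simple unsupervised centroid approach to category completion (the latter is an assumption of a natural baseline, not the paper's technique). The sketch below uses gensim's downloadable GloVe vectors; any static embedding model would do.

```python
# Minimal sketch of the baselines named in the abstract, assuming
# gensim's pretrained "glove-wiki-gigaword-50" vectors (downloaded
# on first use). This is NOT the paper's sub-vector method.
import numpy as np
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-50")  # returns KeyedVectors

# --- Word analogy via the traditional vector-offset method ---
# "man is to king as woman is to ?"  ->  king - man + woman ~ queen
print(model.most_similar(positive=["king", "woman"],
                         negative=["man"], topn=1))

# --- Category completion, simple unsupervised baseline ---
# Average the vectors of a few known category members and take the
# nearest neighbors of the centroid as candidate new members.
members = ["red", "green", "blue"]
centroid = np.mean([model[w] for w in members], axis=0)
for word, score in model.similar_by_vector(centroid, topn=10):
    if word not in members:
        print(word, round(score, 3))
```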


