Revisiting Additive Compositionality: AND, OR and NOT Operations with Word Embeddings

05/18/2021
by   Masahiro Naito, et al.

It is well known that typical word embedding methods, such as Word2Vec and GloVe, have the property that word meanings can be composed by adding up embeddings (additive compositionality). Several theories have been proposed to explain additive compositionality, but the following questions remain open: (Q1) the assumptions of those theories do not hold for word embeddings obtained in practice; (Q2) ordinary additive compositionality can be seen as an AND operation on word meanings, but it is not well understood how other operations, such as OR and NOT, can be computed with the embeddings. We address these issues with the idea of frequency-weighted centering at its core. As an answer to (Q1), this paper proposes a post-processing method that bridges the gap between practical word embeddings and the assumptions of theories of additive compositionality. As an answer to (Q2), it also gives a method for taking the OR or NOT of meanings by linear operations on word embeddings. Moreover, we confirm experimentally that the accuracy of the AND operation, i.e., ordinary additive compositionality, is improved by our post-processing method (a 3.5x improvement in top-100 accuracy), and that the OR and NOT operations are performed correctly.
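The two ideas named in the abstract can be sketched briefly: frequency-weighted centering subtracts the unigram-frequency-weighted mean vector from every embedding, after which the ordinary AND of two meanings is vector addition. The vocabulary, frequencies, and embedding values below are hypothetical toy data; this is a minimal illustration of the centering step, not the paper's implementation.

```python
import numpy as np

# Toy embedding matrix: each row is a word vector (hypothetical values).
vocab = ["king", "queen", "man", "woman"]
E = np.random.default_rng(0).normal(size=(4, 5))

# Hypothetical unigram frequencies for the vocabulary (sum to 1).
freq = np.array([0.4, 0.1, 0.3, 0.2])

# Frequency-weighted centering: subtract the frequency-weighted
# mean vector from every embedding.
mu = freq @ E                # weighted average over the vocabulary
E_centered = E - mu          # post-processed embeddings

# AND (ordinary additive compositionality): compose "king" AND "woman"
# by adding the centered vectors.
and_vec = E_centered[vocab.index("king")] + E_centered[vocab.index("woman")]

# After centering, the frequency-weighted mean of the vocabulary is ~0.
assert np.allclose(freq @ E_centered, 0.0)
```

The OR and NOT operations proposed in the paper are also linear operations on the centered embeddings, but their exact form is not given in the abstract, so they are omitted here.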


