Geometry of Compositionality

11/29/2016
by   Hongyu Gong, et al.

This paper proposes a simple test for the compositionality (i.e., literal usage) of a word or phrase in a context-specific way. The test is computationally lightweight: it relies on no external resources and uses only a set of trained word vectors. Experiments show that the proposed method is competitive with the state of the art and achieves high accuracy in context-specific compositionality detection across a variety of natural-language phenomena (idiomaticity, sarcasm, metaphor) on datasets in multiple languages. The key insight is to connect compositionality to a curious geometric property of word embeddings, which is of independent interest.
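The abstract does not spell out the test, but one plausible geometric reading is: a phrase is used compositionally when its vector lies close to the linear subspace spanned by the vectors of its context words. The sketch below illustrates that idea only; the scoring function, toy embeddings, and threshold behavior are assumptions for illustration, not the paper's exact method.

```python
# Hedged sketch of a subspace-based compositionality score.
# Assumption: "compositional" means the phrase vector is well approximated
# by a linear combination of its context word vectors. The embeddings
# below are random toy vectors, not trained word embeddings.
import numpy as np

def compositionality_score(phrase_vec, context_vecs):
    """Cosine similarity between phrase_vec and its orthogonal projection
    onto span(context_vecs); a value near 1.0 means the phrase vector
    lies almost entirely in the context subspace."""
    C = np.asarray(context_vecs).T                      # d x k context matrix
    coeffs, *_ = np.linalg.lstsq(C, phrase_vec, rcond=None)
    proj = C @ coeffs                                   # projection onto the span
    denom = np.linalg.norm(phrase_vec) * np.linalg.norm(proj)
    return float(proj @ phrase_vec / denom) if denom else 0.0

rng = np.random.default_rng(0)
context = rng.normal(size=(3, 50))                      # three 50-d context vectors
literal = 0.5 * context[0] + 0.3 * context[1]           # lies in the span
idiomatic = rng.normal(size=50)                         # unrelated direction

print(compositionality_score(literal, context))         # near 1.0 (in the span)
print(compositionality_score(idiomatic, context))       # noticeably lower
```

A score near 1.0 flags literal usage; an idiomatic phrase, whose meaning is not built from its parts, would be expected to point away from the context subspace and score lower.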

Related research

03/24/2016 · Part-of-Speech Relevance Weights for Learning Word Embeddings
This paper proposes a model to learn word embeddings with weighted conte...

11/15/2017 · Unsupervised Morphological Expansion of Small Datasets for Improving Word Embeddings
We present a language-independent, unsupervised method for building word...

10/06/2017 · Learning Word Embeddings for Hyponymy with Entailment-Based Distributional Semantics
Lexical entailment, such as hyponymy, is a fundamental issue in the sema...

04/30/2020 · Analyzing the Surprising Variability in Word Embedding Stability Across Languages
Word embeddings are powerful representations that form the foundation of...

04/21/2018 · Context-Attentive Embeddings for Improved Sentence Representations
While one of the first steps in many NLP systems is selecting what embed...

05/08/2017 · Ontology-Aware Token Embeddings for Prepositional Phrase Attachment
Type-level word embeddings use the same set of parameters to represent a...

02/26/2019 · Context Vectors are Reflections of Word Vectors in Half the Dimensions
This paper takes a step towards theoretical analysis of the relationship...
