Learning Word Embeddings for Hyponymy with Entailment-Based Distributional Semantics

10/06/2017
by James Henderson et al.

Lexical entailment, such as hyponymy, is a fundamental issue in the semantics of natural language. This paper proposes distributional semantic models which efficiently learn word embeddings for entailment, using a recently proposed framework for modelling entailment in a vector space. These models postulate a latent vector for a pseudo-phrase containing two neighbouring word vectors. We investigate modelling words either as the evidence they contribute about this phrase vector or as the posterior distribution of a one-word phrase vector, and find that the posterior vectors perform better. The resulting word embeddings outperform the best previous results on predicting hyponymy between words, in both unsupervised and semi-supervised experiments.
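The abstract's core idea is that word vectors can be read as distributions over latent features, so that one word entails another when every feature the entailed word "knows" is also known by the entailing word. The toy sketch below illustrates this kind of factorised vector-space entailment score; it is an assumption-laden approximation in the spirit of that framework, not the paper's exact operator, and the `entails_score` function and the `poodle`/`dog` vectors are hypothetical.

```python
import numpy as np

def entails_score(hypo, hyper):
    """Illustrative factorised entailment score (a sketch, not the
    paper's operator).  Each embedding dimension is read as the
    log-odds that a latent binary feature is known.  Entailment
    hypo => hyper fails on a feature that the hypernym has
    (probability sigmoid(hyper_i)) but the hyponym lacks
    (probability sigmoid(-hypo_i)); the score sums the log
    probabilities that no feature breaks entailment."""
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    return float(np.sum(np.log(1.0 - sig(hyper) * sig(-hypo))))

# Hypothetical embeddings: the hyponym "knows" strictly more features.
rng = np.random.default_rng(0)
dog = rng.normal(size=8)
poodle = dog + np.abs(rng.normal(size=8))

# "poodle is-a dog" should score higher than the reverse direction.
forward = entails_score(poodle, dog)
backward = entails_score(dog, poodle)
print(forward, backward)
```

Because the score is asymmetric, it can predict the direction of hyponymy, not just relatedness, which is what distinguishes entailment-based embeddings from ordinary similarity-based ones.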


Related research

- 07/13/2016, A Vector Space for Distributional Semantics for Entailment: Distributional semantics creates vector-space representations that captu...
- 01/01/2021, Key Phrase Extraction Applause Prediction: With the increase in content availability over the internet it is very d...
- 07/23/2017, Hierarchical Embeddings for Hypernymy Detection and Directionality: We present a novel neural model HyperVec to learn hierarchical embeddin...
- 10/02/2017, Unsupervised Hypernym Detection by Distributional Inclusion Vector Embedding: Modeling hypernymy, such as poodle is-a dog, is an important generalizat...
- 11/29/2016, Geometry of Compositionality: This paper proposes a simple test for compositionality (i.e., literal us...
- 05/25/2016, Integrating Distributional Lexical Contrast into Word Embeddings for Antonym-Synonym Distinction: We propose a novel vector representation that integrates lexical contras...
- 09/09/2017, Semi-Supervised Instance Population of an Ontology using Word Vector Embeddings: In many modern day systems such as information extraction and knowledge ...
