Word Representations via Gaussian Embedding

12/20/2014
by Luke Vilnis et al.

Current work in lexical distributed representations maps each word to a point vector in low-dimensional space. Mapping instead to a density provides many interesting advantages, including better capturing uncertainty about a representation and its relationships, expressing asymmetries more naturally than dot product or cosine similarity, and enabling more expressive parameterization of decision boundaries. This paper advocates for density-based distributed embeddings and presents a method for learning representations in the space of Gaussian distributions. We compare performance on various word embedding benchmarks, investigate the ability of these embeddings to model entailment and other asymmetric relationships, and explore novel properties of the representation.
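To make the asymmetry claim concrete: unlike a dot product or cosine similarity, the KL divergence between two Gaussian densities is directional, which is one of the asymmetric measures the paper considers for entailment-like relations. Below is a minimal sketch of that idea for diagonal-covariance Gaussians, using the standard closed form for KL divergence between Gaussians; the example means and variances are invented for illustration and are not taken from the paper.

```python
import numpy as np

def kl_diag_gaussian(mu0, var0, mu1, var1):
    """KL( N(mu0, diag(var0)) || N(mu1, diag(var1)) ), closed form.

    Asymmetric: KL(p || q) != KL(q || p) in general, which is what
    lets a density embedding express directional relations that a
    dot product or cosine similarity cannot.
    """
    mu0, var0, mu1, var1 = map(np.asarray, (mu0, var0, mu1, var1))
    return 0.5 * np.sum(
        var0 / var1                      # trace term
        + (mu1 - mu0) ** 2 / var1        # Mahalanobis term
        - 1.0                            # dimensionality offset
        + np.log(var1 / var0)            # log-determinant ratio
    )

# Hypothetical embeddings: a specific word with a tight density, and a
# general word with a broad density that roughly covers it.
mu_specific, var_specific = np.array([1.0, 0.5]), np.array([0.1, 0.1])
mu_general,  var_general  = np.array([0.8, 0.6]), np.array([1.0, 1.0])

# Small divergence: the specific density sits inside the general one.
print(kl_diag_gaussian(mu_specific, var_specific, mu_general, var_general))
# Much larger in the reverse direction.
print(kl_diag_gaussian(mu_general, var_general, mu_specific, var_specific))
```

The two printed values differ substantially: the tight "specific" density diverges little from the broad "general" one, but not vice versa, giving a natural directional score of the kind the abstract alludes to.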

Related research

Embedding Words as Distributions with a Bayesian Skip-gram Model (11/29/2017)
We introduce a method for embedding words as probability densities in a …

Elliptical Ordinal Embedding (05/21/2021)
Ordinal embedding aims at finding a low dimensional representation of …

Multimodal Word Distributions (04/27/2017)
Word embeddings provide point representations of words containing useful …

Hierarchical Density Order Embeddings (04/26/2018)
By representing words with probability densities rather than point …

Continuous Representation of Location for Geolocation and Lexical Dialectology using Mixture Density Networks (08/14/2017)
We propose a method for embedding two-dimensional locations in a …

Consistent Alignment of Word Embedding Models (02/24/2017)
Word embedding models offer continuous vector representations that can …

WordRank: Learning Word Embeddings via Robust Ranking (06/09/2015)
Embedding words in a vector space has gained a lot of attention in …
