Desiderata for Vector-Space Word Representations

08/06/2016
by Leon Derczynski, et al.

A plethora of vector-space representations for words is currently available, and the collection keeps growing. Each such representation assigns a word a fixed-length vector of real values. The result is a representation to which the power of many conventional information processing and data mining techniques can be brought to bear, as long as the representations are designed with some forethought and fit certain constraints. This paper details desiderata for the design of vector-space representations of words.
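To make the setting concrete, the short sketch below (not from the paper; the vocabulary and toy vectors are invented for illustration) treats each word as a fixed-length real-valued vector and applies one such conventional technique, cosine-similarity nearest-neighbour ranking, using only NumPy.

import numpy as np

# Toy 4-dimensional word vectors, invented for illustration only;
# real embeddings (e.g. word2vec, GloVe) have hundreds of dimensions
# learned from large corpora.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.6]),
}

def cosine_similarity(u, v):
    # Standard cosine similarity between two real-valued vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def nearest(word, k=2):
    # Rank all other vocabulary words by similarity to `word`.
    scores = {
        other: cosine_similarity(embeddings[word], vec)
        for other, vec in embeddings.items()
        if other != word
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])[:k]

print(nearest("king"))  # "queen" ranks above "apple"

Because every word maps to a point in the same fixed-dimensional space, any technique defined on real vectors, such as clustering, nearest-neighbour search, or classification, applies directly.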


Related research

04/21/2018 · Extrofitting: Enriching Word Representation and its Vector Space with Semantic Lexicons
We propose a post-processing method for enriching not only word representa...

08/25/2016 · Learning Latent Vector Spaces for Product Search
We introduce a novel latent vector space model that jointly learns the l...

11/19/2015 · Compressing Word Embeddings
Recent methods for learning vector space representations of words have s...

07/12/2018 · Tracking the Evolution of Words with Time-reflective Text Representations
More than 80 unstructured datasets evolving over time. A large part of t...

03/02/2016 · Counter-fitting Word Vectors to Linguistic Constraints
In this work, we present a novel counter-fitting method which injects an...

11/02/2016 · Fuzzy paraphrases in learning word representations with a lexicon
A synonym of a polysemous word is usually only the paraphrase of one sen...

07/28/2015 · Reasoning about Linguistic Regularities in Word Embeddings using Matrix Manifolds
Recent work has explored methods for learning continuous vector space wo...
