High-Dimensional Vector Semantics

02/23/2018
by M. Andrecut, et al.

In this paper we explore the "vector semantics" problem from the perspective of the "almost orthogonal" property of high-dimensional random vectors. We show that this intriguing property can be used to "memorize" random vectors simply by adding them together, and we provide an efficient probabilistic solution to the set membership problem. We also discuss several applications: word context vector embeddings, document and sentence similarity, and spam filtering.
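To make the idea concrete, here is a minimal sketch, assuming random ±1 vectors (the exact construction in the paper may differ): in high dimension d, independent random vectors are almost orthogonal, so a whole set can be "memorized" as the plain sum of its members, and membership can be tested with an inner-product threshold, at the cost of occasional false positives, much like a Bloom filter.

```python
import numpy as np

# Sketch of memorizing a set of random high-dimensional vectors by summation.
# Assumes +/-1 components; d, n, and the 0.5 threshold are illustrative choices.
rng = np.random.default_rng(0)
d, n = 10_000, 100                        # dimension, number of stored vectors
vectors = rng.choice([-1, 1], size=(n, d))
memory = vectors.sum(axis=0)              # the whole set in one d-dim vector

def maybe_member(y, memory, d, threshold=0.5):
    """Probabilistic membership test: for +/-1 vectors, y @ y = d, while
    cross terms with the other stored vectors concentrate near 0 (std on the
    order of sqrt(n*d)), so (y @ memory)/d is ~1 for members and ~0 for
    non-members; false positives are possible but unlikely."""
    return (y @ memory) / d > threshold

print(maybe_member(vectors[0], memory, d))                    # True: stored vector
print(maybe_member(rng.choice([-1, 1], size=d), memory, d))   # False (w.h.p.): fresh vector
```

With d = 10,000 and n = 100, the noise term in the normalized score has a standard deviation of roughly 0.1, so member scores cluster near 1 and non-member scores near 0, and the 0.5 threshold separates them with high probability.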


Related research

10/16/2018
Exploring Sentence Vector Spaces through Automatic Summarization
Given vector representations for individual words, it is necessary to co...

02/07/2021
Additive Feature Hashing
The hashing trick is a machine learning technique used to encode categor...

08/30/2020
SOLAR: Sparse Orthogonal Learned and Random Embeddings
Dense embedding models are commonly deployed in commercial search engine...

09/24/2018
Representing Sets as Summed Semantic Vectors
Representing meaning in the form of high dimensional vectors is a common...

12/10/2014
Memory vectors for similarity search in high-dimensional spaces
We study an indexing architecture to store and search in a database of h...

12/22/2014
Language Recognition using Random Indexing
Random Indexing is a simple implementation of Random Projections with a ...

02/10/2022
Understanding Hyperdimensional Computing for Parallel Single-Pass Learning
Hyperdimensional computing (HDC) is an emerging learning paradigm that c...
