Bernoulli Embeddings for Graphs

03/25/2018
by Vinith Misra, et al.

Just as semantic hashing can accelerate information retrieval, binary valued embeddings can significantly reduce latency in the retrieval of graphical data. We introduce a simple but effective model for learning such binary vectors for nodes in a graph. By imagining the embeddings as independent coin flips of varying bias, continuous optimization techniques can be applied to the approximate expected loss. Embeddings optimized in this fashion consistently outperform the quantization of both spectral graph embeddings and various learned real-valued embeddings, on both ranking and pre-ranking tasks for a variety of datasets.
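To make the coin-flip idea concrete, here is a minimal sketch, not the authors' implementation: each bit of a node's embedding is treated as an independent Bernoulli variable, the link-prediction loss is replaced by its expectation under those biases (a smooth function of the bias parameters), and the biases are trained with ordinary gradient descent before being rounded to bits. The function name, logistic loss, negative-sampling scheme, and hyperparameters below are illustrative assumptions rather than details taken from the paper.

```python
import torch

def train_bernoulli_embeddings(edges, num_nodes, dim=32, steps=500, lr=0.1):
    """Learn {0,1}-valued node embeddings by optimizing Bernoulli biases."""
    # Unconstrained logits; sigmoid(logits) are the per-bit coin-flip biases.
    logits = torch.nn.Parameter(torch.zeros(num_nodes, dim))
    opt = torch.optim.Adam([logits], lr=lr)
    edges = torch.as_tensor(edges, dtype=torch.long)

    def expected_agreement(p_a, p_b):
        # E[b_a . b_b + (1 - b_a) . (1 - b_b)] under independent coin flips.
        return (p_a * p_b + (1 - p_a) * (1 - p_b)).sum(dim=1)

    for _ in range(steps):
        p = torch.sigmoid(logits)
        u, v = edges[:, 0], edges[:, 1]

        # Negative sampling: random pairs assumed (for illustration) to be non-edges.
        ru = torch.randint(num_nodes, (len(edges),))
        rv = torch.randint(num_nodes, (len(edges),))

        # Center the expected agreement so it acts as a logit, then apply a
        # logistic loss: pull linked pairs together, push random pairs apart.
        pos = expected_agreement(p[u], p[v]) - dim / 2.0
        neg = expected_agreement(p[ru], p[rv]) - dim / 2.0
        loss = (torch.nn.functional.softplus(-pos).mean()
                + torch.nn.functional.softplus(neg).mean())

        opt.zero_grad()
        loss.backward()
        opt.step()

    # Deterministic binary codes: round each bias to its most likely bit.
    return (torch.sigmoid(logits) > 0.5).to(torch.uint8)
```

Given an edge list and node count, the returned {0,1} matrix can be compared with fast Hamming-distance or bitwise routines at query time, which is where the latency savings for ranking and pre-ranking come from.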
