Quantization based Fast Inner Product Search

09/04/2015
by Ruiqi Guo, et al.

We propose a quantization-based approach for fast approximate Maximum Inner Product Search (MIPS). Each database vector is quantized in multiple subspaces via a set of codebooks, learned directly by minimizing the inner product quantization error. The inner product of a query with a database vector is then approximated as the sum of inner products with the subspace quantizers. Unlike recently proposed LSH approaches to MIPS, the database vectors and queries do not need to be augmented into a higher-dimensional feature space. We also provide a theoretical analysis of the proposed approach, consisting of concentration results under mild assumptions. Furthermore, if a small sample of example queries is given at training time, we propose a modified codebook learning procedure that further improves accuracy. Experimental results on a variety of datasets, including those arising from deep neural networks, show that the proposed approach significantly outperforms the existing state of the art.
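The core idea lends itself to a short sketch. Below is a minimal, illustrative Python example, not the authors' implementation: database vectors are split into subspaces, each subspace is assigned a codeword from a codebook, and a query's inner product with any database vector is approximated by summing per-subspace inner products read from a small lookup table. The function names, the random placeholder codebooks, and the nearest-codeword assignment rule are assumptions for illustration; the paper instead learns codebooks by minimizing the inner product quantization error.

import numpy as np

def quantize_database(X, codebooks):
    # X: (n, d) database; codebooks: list of m arrays, each of shape (k, d // m).
    # Assign each vector, subspace by subspace, to its closest codeword.
    # (Illustrative Euclidean assignment only; the paper optimizes the
    # assignments and codebooks for inner product quantization error.)
    n, d = X.shape
    m = len(codebooks)
    ds = d // m
    codes = np.empty((n, m), dtype=np.int32)
    for s, C in enumerate(codebooks):
        sub = X[:, s * ds:(s + 1) * ds]                         # (n, ds)
        d2 = ((sub[:, None, :] - C[None, :, :]) ** 2).sum(-1)   # (n, k)
        codes[:, s] = d2.argmin(axis=1)
    return codes

def approximate_inner_products(q, codebooks, codes):
    # Approximate <q, x> for every database vector as the sum over subspaces
    # of <q_s, codeword assigned to x in subspace s>.
    m = len(codebooks)
    ds = q.shape[0] // m
    # table[s][j] = inner product of query subvector s with codeword j.
    table = [codebooks[s] @ q[s * ds:(s + 1) * ds] for s in range(m)]
    return sum(table[s][codes[:, s]] for s in range(m))

# Toy usage with random data and random (untrained) codebooks.
rng = np.random.default_rng(0)
d, m, k, n = 32, 4, 16, 1000
X = rng.normal(size=(n, d))
codebooks = [rng.normal(size=(k, d // m)) for _ in range(m)]
codes = quantize_database(X, codebooks)
q = rng.normal(size=d)
scores = approximate_inner_products(q, codebooks, codes)
top5 = np.argsort(-scores)[:5]   # approximate top-5 MIPS candidates

Note that the query side is never quantized: only a lookup table of m x k exact inner products is computed per query, so scoring the whole database costs one table lookup and one addition per subspace per vector.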

Related research

New Loss Functions for Fast Maximum Inner Product Search (08/27/2019)
Quantization based methods are popular for solving large scale maximum i...

Norm-Explicit Quantization: Improving Vector Quantization for Maximum Inner Product Search (11/12/2019)
Vector quantization (VQ) techniques are widely used in similarity search...

Inner Product Similarity Search using Compositional Codes (06/19/2014)
This paper addresses the nearest neighbor search problem under inner pro...

Projective Clustering Product Quantization (12/03/2021)
This paper suggests the use of projective clustering based product quant...

Revisiting Wedge Sampling for Budgeted Maximum Inner Product Search (08/23/2019)
Top-k maximum inner product search (MIPS) is a central task in many mach...

A Bandit Approach to Maximum Inner Product Search (12/15/2018)
There has been substantial research on sub-linear time approximate algor...

PR Product: A Substitute for Inner Product in Neural Networks (04/30/2019)
In this paper, we analyze the inner product of weight vector and input v...
