End-to-End Efficient Representation Learning via Cascading Combinatorial Optimization

02/28/2019
by Yeonwoo Jeong, et al.

We develop hierarchically quantized efficient embedding representations for similarity-based search and show that this representation not only achieves state-of-the-art search accuracy but also yields a speedup of several orders of magnitude during inference. The idea is to quantize the representation hierarchically so that the quantization granularity is greatly increased while the accuracy is maintained and the computational complexity is kept low. We also show that the problem of finding the optimal sparse compound hash code respecting the hierarchical structure can be solved in polynomial time via minimum-cost flow in an equivalent flow network. This allows us to train the method end-to-end in a mini-batch stochastic gradient descent setting. Our experiments on the Cifar100 and ImageNet datasets show state-of-the-art search accuracy while providing a search speedup of several orders of magnitude over exhaustive linear search over the dataset.
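The abstract's key computational claim is that the code-assignment subproblem can be cast as a minimum-cost flow and solved in polynomial time. As a rough illustration of that general idea (not the paper's actual formulation; the graph layout, sizes, and costs below are invented for the sketch), the following assigns items to codewords under balanced capacity constraints using networkx's min-cost-flow solver:

```python
# Hedged sketch: balanced assignment of items to codewords posed as a
# min-cost-flow problem, in the spirit of the abstract's claim. All
# names, sizes, and the cost matrix are illustrative assumptions.
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)
n_items, n_codes = 6, 3
cap = n_items // n_codes  # each codeword may absorb at most `cap` items
# integer "quantization costs" (e.g. scaled distances to codewords)
cost = rng.integers(1, 10, size=(n_items, n_codes))

G = nx.DiGraph()
for i in range(n_items):
    # each item must be routed to exactly one codeword
    G.add_edge("s", f"x{i}", capacity=1, weight=0)
    for k in range(n_codes):
        G.add_edge(f"x{i}", f"c{k}", capacity=1, weight=int(cost[i, k]))
for k in range(n_codes):
    # capacity on the codeword-to-sink edge enforces balance
    G.add_edge(f"c{k}", "t", capacity=cap, weight=0)
G.nodes["s"]["demand"] = -n_items
G.nodes["t"]["demand"] = n_items

flow = nx.min_cost_flow(G)
# integral flow => each item pushes its unit to exactly one codeword
assignment = {
    i: next(k for k in range(n_codes) if flow[f"x{i}"][f"c{k}"] == 1)
    for i in range(n_items)
}
print(assignment)
```

Because the constraint matrix of such a flow network is totally unimodular, the LP relaxation has an integral optimum, which is what makes the discrete assignment solvable in polynomial time rather than by combinatorial enumeration.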

Related research

- 05/15/2018: Efficient end-to-end learning for quantizable representations
- 10/30/2016: Accurate Deep Representation Quantization with Gradient Snapping Layer for Similarity Search
- 02/08/2020: BitPruning: Learning Bitlengths for Aggressive and Accurate Quantization
- 11/18/2019: vqSGD: Vector Quantized Stochastic Gradient Descent
- 12/11/2019: End-to-End Learning of Geometrical Shaping Maximizing Generalized Mutual Information
- 07/03/2018: Learning concise representations for regression by evolving networks of trees
- 03/09/2022: Givens Coordinate Descent Methods for Rotation Matrix Learning in Trainable Embedding Indexes
