Semantic Cluster Unary Loss for Efficient Deep Hashing

05/15/2018
by Shifeng Zhang, et al.

Hashing methods map similar data to binary hashcodes with small Hamming distance, and have received broad attention due to their low storage cost and fast retrieval speed. With the rapid development of deep learning, deep hashing methods have achieved promising results in efficient information retrieval. Most existing deep hashing methods adopt pairwise or triplet losses to model the similarities underlying the data, but training is difficult and inefficient because O(n^2) data pairs or O(n^3) triplets are involved. To address these issues, we propose a novel deep hashing algorithm with a unary loss that can be trained very efficiently. We first introduce a Unary Upper Bound of the traditional triplet loss, which reduces the complexity to O(n) and bridges the classification-based unary loss and the triplet loss. Second, we propose a novel Semantic Cluster Deep Hashing (SCDH) algorithm by introducing a modified Unary Upper Bound loss, named the Semantic Cluster Unary Loss (SCUL). The resulting hashcodes form several compact clusters, so hashcodes in the same cluster share similar semantic information. We also demonstrate that the proposed SCDH can easily be extended to semi-supervised settings by incorporating state-of-the-art semi-supervised learning algorithms. Experiments on large-scale datasets show that the proposed method is superior to state-of-the-art hashing algorithms.
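The following is a minimal PyTorch-style sketch, not the authors' implementation, that illustrates the complexity gap the abstract refers to: a naive triplet loss enumerates O(n^3) (anchor, positive, negative) triplets, whereas a classification-style unary loss scores each sample once against a set of learnable class centers, in the spirit of the Unary Upper Bound. The function names, the margin value, and the per-class cluster centers are illustrative assumptions, not quantities defined in the paper.

```python
# Illustrative sketch only: contrasts an O(n^3) triplet loss with an
# O(n * C) classification-style unary loss over hypothetical class centers.
import torch
import torch.nn.functional as F

def triplet_loss(codes, labels, margin=1.0):
    """Naive triplet loss: iterates over all (anchor, positive, negative)
    triplets, which is O(n^3) in the batch size."""
    n = codes.size(0)
    losses = []
    for a in range(n):
        for p in range(n):
            for q in range(n):
                if a != p and labels[a] == labels[p] and labels[a] != labels[q]:
                    d_ap = (codes[a] - codes[p]).pow(2).sum()
                    d_an = (codes[a] - codes[q]).pow(2).sum()
                    losses.append(F.relu(d_ap - d_an + margin))
    return torch.stack(losses).mean() if losses else codes.new_zeros(())

def unary_cluster_loss(codes, labels, centers):
    """Unary loss: each code is pulled toward its own class center and
    pushed from the others via a softmax over negative squared distances,
    giving O(n * C) work per batch instead of O(n^3)."""
    dists = torch.cdist(codes, centers).pow(2)  # shape (n, C)
    logits = -dists                             # closer center -> larger logit
    return F.cross_entropy(logits, labels)

# Toy usage with relaxed (tanh) 16-bit codes and one center per class.
codes = torch.tanh(torch.randn(8, 16))
labels = torch.randint(0, 3, (8,))
centers = torch.tanh(torch.randn(3, 16))
print(triplet_loss(codes, labels), unary_cluster_loss(codes, labels, centers))
```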

Related research

02/02/2019
Joint Cluster Unary Loss for Efficient Cross-Modal Hashing
With the rapid growth of various types of multimodal data, cross-modal d...

02/02/2019
Pairwise Teacher-Student Network for Semi-Supervised Hashing
Hashing method maps similar high-dimensional data to binary hashcodes wi...

09/28/2016
Scalable Discrete Supervised Hash Learning with Asymmetric Matrix Factorization
Hashing method maps similar data to binary hashcodes with smaller hammin...

03/12/2018
Deep Class-Wise Hashing: Semantics-Preserving Hashing via Class-wise Loss
Deep supervised hashing has emerged as an influential solution to large-...

09/21/2016
How should we evaluate supervised hashing?
Hashing produces compact representations for documents, to perform tasks...

09/07/2018
Neurons Merging Layer: Towards Progressive Redundancy Reduction for Deep Supervised Hashing
Deep supervised hashing has become an active topic in web search and inf...

12/17/2019
DeepHashing using TripletLoss
Hashing is one of the most efficient techniques for approximate nearest ...
