One Loss for All: Deep Hashing with a Single Cosine Similarity based Learning Objective

09/29/2021
by Jiun Tian Hoe, et al.

A deep hashing model typically has two main learning objectives: to make the learned binary hash codes discriminative and to minimize a quantization error. With further constraints such as bit balance and code orthogonality, it is not uncommon for existing models to employ a large number (>4) of losses. This leads to difficulties in model training and subsequently impedes their effectiveness. In this work, we propose a novel deep hashing model with only a single learning objective. Specifically, we show that maximizing the cosine similarity between the continuous codes and their corresponding binary orthogonal codes ensures both hash code discriminativeness and quantization error minimization. Further, with this learning objective, code balancing can be achieved by simply using a Batch Normalization (BN) layer, and multi-label classification is also straightforward with label smoothing. The result is a one-loss deep hashing model that removes all the hassle of tuning the weights of multiple losses. Importantly, extensive experiments show that our model is highly effective, outperforming state-of-the-art multi-loss hashing models on three large-scale instance retrieval benchmarks, often by significant margins. Code is available at https://github.com/kamwoh/orthohash.
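The following is a minimal sketch of the single-objective idea described in the abstract, not the authors' exact implementation (see the linked repository for the official code). It assumes PyTorch, uses randomly drawn sign codes as a stand-in for the binary orthogonal class targets, and the layer names, dimensions, and the similarity scale factor are illustrative assumptions.

```python
# Hedged sketch of a one-loss, cosine-similarity-based hashing head.
# Assumptions: PyTorch; random +/-1 class codes stand in for orthogonal targets;
# names, dimensions, and the scale value 8.0 are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class OneLossHashHead(nn.Module):
    """Maps backbone features to continuous codes and trains them against
    fixed binary class targets with a single cosine-based objective."""

    def __init__(self, feat_dim: int, n_bits: int, n_classes: int):
        super().__init__()
        self.fc = nn.Linear(feat_dim, n_bits)
        # BatchNorm over the code dimension keeps activations zero-centred,
        # which encourages balanced +1/-1 bits after sign quantization.
        self.bn = nn.BatchNorm1d(n_bits)
        # Fixed binary (+1/-1) target codes, one per class; drawn at random
        # and sign-quantized here as a placeholder for orthogonal codes.
        self.register_buffer("targets", torch.randn(n_classes, n_bits).sign())

    def forward(self, feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        codes = self.bn(self.fc(feats))  # continuous codes, shape (B, n_bits)
        # Cosine similarity between each code and every class target: (B, C).
        sims = F.cosine_similarity(
            codes.unsqueeze(1), self.targets.unsqueeze(0), dim=-1)
        # Single objective: pull each code towards its own class target while
        # pushing it away from the others, via a scaled softmax cross-entropy
        # over the cosine similarities. A simpler non-discriminative variant
        # would be (1 - sims[torch.arange(len(labels)), labels]).mean().
        return F.cross_entropy(sims * 8.0, labels)


# Usage sketch with dummy features and labels.
if __name__ == "__main__":
    head = OneLossHashHead(feat_dim=512, n_bits=64, n_classes=100)
    feats = torch.randn(32, 512)            # e.g. CNN backbone features
    labels = torch.randint(0, 100, (32,))
    loss = head(feats, labels)
    loss.backward()
    print(float(loss))
```

At retrieval time, the binary hash code would simply be the sign of the continuous code; because the training targets are already binary, maximizing cosine similarity to them also keeps the quantization error small.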


Related research:

12/16/2021  Self-Distilled Hashing for Deep Image Retrieval
In hash-based image retrieval systems, the transformed input from the or...

10/24/2021  Deep Asymmetric Hashing with Dual Semantic Regression and Class Structure Quantization
Recently, deep hashing methods have been widely used in image retrieval ...

05/31/2022  One Loss for Quantization: Deep Hashing with Discrete Wasserstein Distributional Matching
Image hashing is a principled approximate nearest neighbor approach to f...

02/15/2023  Unsupervised Hashing via Similarity Distribution Calibration
Existing unsupervised hashing methods typically adopt a feature similari...

11/27/2018  A Scalable Optimization Mechanism for Pairwise based Discrete Hashing
Maintaining the pair similarity relationship among originally high-dimen...

05/11/2019  Hadamard Matrix Guided Online Hashing
Online image hashing has received increasing research attention recently...

07/17/2020  Self-Supervised Bernoulli Autoencoders for Semi-Supervised Hashing
Semantic hashing is an emerging technique for large-scale similarity sea...
