Mathematical Justification of Hard Negative Mining via Isometric Approximation Theorem

10/20/2022
by Albert Xu, et al.

In deep metric learning, the Triplet Loss has emerged as a popular method for learning many computer vision and natural language processing tasks such as facial recognition, object detection, and visual-semantic embeddings. One issue that plagues the Triplet Loss is network collapse, an undesirable phenomenon in which the network projects the embeddings of all data onto a single point. Researchers predominantly address this problem with triplet mining strategies. While hard negative mining is the most effective of these strategies, existing formulations lack strong theoretical justification for their empirical success. In this paper, we use the mathematical theory of isometric approximation to show an equivalence between the Triplet Loss sampled by hard negative mining and an optimization problem that minimizes a Hausdorff-like distance between the neural network and its ideal counterpart function. This provides the theoretical justification for hard negative mining's empirical efficacy. In addition, our novel application of the isometric approximation theorem provides the groundwork for future forms of hard negative mining that avoid network collapse. Our theory can also be extended to analyze other Euclidean space-based metric learning methods like Ladder Loss or Contrastive Learning.
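The two objects the abstract keeps returning to, the Triplet Loss and hard negative mining, can be illustrated with a minimal NumPy sketch of the common batch-hard formulation (the function name and margin value below are illustrative choices, not the paper's exact setup). For each anchor, the hardest positive is the farthest same-class point and the hardest negative is the closest other-class point:

```python
import numpy as np

def triplet_loss_hard_negative(embeddings, labels, margin=0.2):
    """Batch-hard triplet loss: for each anchor, use the farthest positive
    and the closest negative in the batch. Illustrative sketch only."""
    # Pairwise Euclidean distances via the squared-norm expansion.
    sq = np.sum(embeddings ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * embeddings @ embeddings.T
    d = np.sqrt(np.maximum(d2, 0.0))  # clamp tiny negatives from round-off

    n = len(labels)
    losses = []
    for i in range(n):
        pos_mask = (labels == labels[i]) & (np.arange(n) != i)
        neg_mask = labels != labels[i]
        if not pos_mask.any() or not neg_mask.any():
            continue  # anchor has no valid triplet in this batch
        hardest_pos = d[i][pos_mask].max()  # farthest same-class point
        hardest_neg = d[i][neg_mask].min()  # closest other-class point
        losses.append(max(hardest_pos - hardest_neg + margin, 0.0))
    return float(np.mean(losses)) if losses else 0.0
```

The sketch also makes the collapse phenomenon concrete: if the network maps every input to the same point, all distances vanish and every anchor contributes exactly `margin` to the loss, a flat, non-zero plateau with no triplet-level gradient signal to escape it.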

Related research

- Smart Mining for Deep Metric Learning (04/05/2017)
  To solve deep metric learning problems and produce feature embeddings,...
- Do Lessons from Metric Learning Generalize to Image-Caption Retrieval? (02/14/2022)
  The triplet loss with semi-hard negatives has become the de facto choice...
- AdaSample: Adaptive Sampling of Hard Positives for Descriptor Learning (11/27/2019)
  Triplet loss has been widely employed in a wide range of computer vision...
- No Fuss Distance Metric Learning using Proxies (03/21/2017)
  We address the problem of distance metric learning (DML), defined as lea...
- A Quadruplet Loss for Enforcing Semantically Coherent Embeddings in Multi-output Classification Problems (02/26/2020)
  This paper describes one objective function for learning semantically co...
- No Pairs Left Behind: Improving Metric Learning with Regularized Triplet Objective (10/18/2022)
  We propose a novel formulation of the triplet objective function that im...
- Improving Deep Metric Learning with Virtual Classes and Examples Mining (06/11/2020)
  In deep metric learning, the training procedure relies on sampling infor...
