SoftTriple Loss: Deep Metric Learning Without Triplet Sampling

09/11/2019
by Qi Qian, et al.

Distance metric learning (DML) aims to learn embeddings in which examples from the same class are closer than examples from different classes. It can be cast as an optimization problem with triplet constraints. Due to the vast number of triplet constraints, a sampling strategy is essential for DML. With the tremendous success of deep learning in classification, it has also been applied to DML. When learning embeddings with deep neural networks (DNNs), only a mini-batch of data is available at each iteration, so the triplet constraints have to be sampled within the mini-batch. Since a mini-batch cannot capture the neighborhood structure of the full data set well, the learned embeddings are sub-optimal. In contrast, optimizing the SoftMax loss, which is a classification loss, with a DNN shows superior performance on certain DML tasks. This inspires us to investigate the formulation of SoftMax. Our analysis shows that the SoftMax loss is equivalent to a smoothed triplet loss in which each class has a single center. In real-world data, however, one class can contain several local clusters rather than a single one, e.g., birds in different poses. Therefore, we propose the SoftTriple loss, which extends the SoftMax loss with multiple centers for each class. Compared with conventional deep metric learning algorithms, optimizing the SoftTriple loss learns the embeddings without a sampling phase, at the cost of only mildly increasing the size of the last fully connected layer. Experiments on benchmark fine-grained data sets demonstrate the effectiveness of the proposed loss function.
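
As a concrete illustration of the multi-center idea, the sketch below shows one way the per-class centers and the resulting loss could be written in PyTorch. It is a minimal sketch, assuming cosine similarities between L2-normalized embeddings and centers; the class name SoftTripleLoss and the hyperparameters (centers_per_class, la, gamma, margin) are illustrative placeholders rather than the authors' reference implementation, and the regularizer that merges redundant centers in the paper is omitted.

```python
# Minimal sketch of a SoftTriple-style loss (not the authors' official code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftTripleLoss(nn.Module):
    def __init__(self, dim, num_classes, centers_per_class=10,
                 la=20.0, gamma=0.1, margin=0.01):
        super().__init__()
        self.la = la                              # scaling of the class logits
        self.gamma = gamma                        # temperature of the within-class soft assignment
        self.margin = margin                      # margin applied to the ground-truth class
        self.num_classes = num_classes
        self.centers_per_class = centers_per_class
        # Multiple centers per class, held as one enlarged "fully connected" weight.
        self.centers = nn.Parameter(
            torch.randn(dim, num_classes * centers_per_class))

    def forward(self, embeddings, labels):
        # Normalize embeddings and centers so inner products are cosine similarities.
        x = F.normalize(embeddings, dim=1)
        w = F.normalize(self.centers, dim=0)
        # Similarity of each example to every center: [batch, classes, K].
        sim = (x @ w).view(-1, self.num_classes, self.centers_per_class)
        # Soft assignment over the K centers of each class, then pool to one
        # similarity per class: [batch, classes].
        prob = F.softmax(sim / self.gamma, dim=2)
        class_sim = (prob * sim).sum(dim=2)
        # Subtract the margin from the ground-truth class only.
        delta = torch.zeros_like(class_sim)
        delta[torch.arange(class_sim.size(0)), labels] = self.margin
        # Standard cross-entropy over the scaled, margin-adjusted class logits.
        return F.cross_entropy(self.la * (class_sim - delta), labels)
```

In this form the centers behave like an enlarged last fully connected layer: the loss is computed per example against class-level logits, so no triplet sampling within the mini-batch is needed.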
