A Quadruplet Loss for Enforcing Semantically Coherent Embeddings in Multi-output Classification Problems

02/26/2020
by Hugo Proença, et al.

This paper describes an objective function for learning semantically coherent feature embeddings in multi-output classification problems, i.e., when the response variables have dimension higher than one. In particular, we consider the problems of identity retrieval and soft biometrics in visual surveillance environments, which have been attracting growing interest. Inspired by the triplet loss function, we propose a generalization of that concept: a quadruplet loss that 1) defines a metric based on the number of agreeing labels between pairs of elements; and 2) disregards the notion of an anchor, replacing the d(A1, A2) < d(A1, B) constraints with d(A, B) < d(C, D) constraints, according to the perceived semantic similarity between the elements of each pair. As in the triplet loss formulation, our proposal privileges small distances between positive pairs, but it also explicitly enforces that the distances between negative pairs directly correspond to their similarity in terms of the number of agreeing labels. This typically yields feature embeddings with a strong correspondence between the class centroids and their semantic descriptions, i.e., where elements that share some of the labels lie closer to each other in the embedding space than elements with fully disjoint class memberships. Also, unlike its triplet counterpart, the proposed loss is not particularly sensitive to the way learning pairs are mined, and it does not depend on demanding criteria for selecting learning instances (such as the semi-hard pairs of the triplet loss). Our experiments were carried out on four datasets (BIODI, LFW, Megaface and PETA) and validate our assumptions, showing highly promising results.
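
The full formulation is given in the paper; as a rough illustration of the idea above, the sketch below shows one way the d(A, B) < d(C, D) constraint could be turned into a hinge loss in PyTorch, with the margin scaled by the gap in agreeing labels. The function name, the agreement-count inputs and the margin scheme are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def quadruplet_loss(emb_a, emb_b, emb_c, emb_d,
                    agree_ab, agree_cd, base_margin=0.1):
    """Hinge-style quadruplet loss sketch (hypothetical implementation).

    emb_*: (N, dim) embedding batches forming pairs (A, B) and (C, D).
    agree_ab, agree_cd: (N,) integer counts of labels shared within each
    pair (e.g., how many soft-biometric labels agree).
    """
    d_ab = F.pairwise_distance(emb_a, emb_b)  # d(A, B)
    d_cd = F.pairwise_distance(emb_c, emb_d)  # d(C, D)
    # The constraint d(A, B) < d(C, D) only applies to quadruplets where
    # (A, B) agrees on strictly more labels than (C, D).
    mask = (agree_ab > agree_cd).float()
    # Scale the margin with the gap in agreeing labels, so that pairs that
    # are semantically further apart are pushed proportionally further
    # apart in the embedding space.
    margin = base_margin * (agree_ab - agree_cd).clamp(min=0).float()
    return (mask * F.relu(d_ab - d_cd + margin)).mean()
```

Because the constraint relates two arbitrary pairs rather than being centered on an anchor, quadruplets in such a scheme can be drawn uniformly at random, in line with the abstract's claim that no demanding mining criteria (such as semi-hard pairs) are required.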

Related research

01/12/2021 · Learning Efficient Representations for Keyword Spotting with Triplet Loss
In the past few years, triplet loss-based metric embeddings have become ...

07/24/2020 · Hard negative examples are hard, but useful
Triplet loss is an extremely common approach to distance metric learning...

03/21/2017 · No Fuss Distance Metric Learning using Proxies
We address the problem of distance metric learning (DML), defined as lea...

10/20/2022 · Mathematical Justification of Hard Negative Mining via Isometric Approximation Theorem
In deep metric learning, the Triplet Loss has emerged as a popular metho...

05/19/2017 · Quadruplet Network with One-Shot Learning for Visual Tracking
As a discriminative method of one-shot learning, Siamese deep network al...

11/02/2016 · Learning Deep Embeddings with Histogram Loss
We suggest a loss for learning deep embeddings. The new loss does not in...
