Circle Loss: A Unified Perspective of Pair Similarity Optimization

02/25/2020
by Yifan Sun, et al.

This paper provides a pair similarity optimization viewpoint on deep feature learning, aiming to maximize the within-class similarity s_p and minimize the between-class similarity s_n. We find that a majority of loss functions, including the triplet loss and the softmax plus cross-entropy loss, embed s_n and s_p into similarity pairs and seek to reduce (s_n - s_p). This optimization manner is inflexible, because the penalty strength on every single similarity score is restricted to be equal. Our intuition is that if a similarity score deviates far from its optimum, it should be emphasized. To this end, we simply re-weight each similarity to highlight the less-optimized similarity scores. The result is the Circle loss, named for its circular decision boundary. The Circle loss has a unified formula for the two elemental deep feature learning approaches, i.e., learning with class-level labels and learning with pair-wise labels. Analytically, we show that the Circle loss offers a more flexible optimization approach towards a more definite convergence target, compared with the loss functions optimizing (s_n - s_p). Experimentally, we demonstrate the superiority of the Circle loss on a variety of deep feature learning tasks. On face recognition, person re-identification, as well as several fine-grained image retrieval datasets, the achieved performance is on par with the state of the art.
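The re-weighting idea in the abstract can be sketched concretely. The pair-wise form published in the full paper derives an optimum O and a decision margin Delta for each similarity from a single relaxation margin m, and weights each score by its distance from its optimum. A minimal NumPy sketch follows; the function name and the log-sum-exp helper are our own, and the default m and gamma are the values reported in the paper:

```python
import numpy as np

def circle_loss(sp, sn, m=0.25, gamma=256.0):
    """Circle loss for one anchor, pair-wise form (sketch).

    sp: within-class similarity scores s_p (array-like)
    sn: between-class similarity scores s_n (array-like)
    m:  relaxation margin; gamma: scale factor.
    """
    sp, sn = np.asarray(sp, float), np.asarray(sn, float)

    # Optima O and decision margins Delta, all derived from m:
    Op, On = 1.0 + m, -m
    Dp, Dn = 1.0 - m, m

    # Self-paced re-weighting: a score far from its optimum gets a
    # larger weight, hence a larger gradient -- the emphasis on
    # less-optimized scores described in the abstract.
    ap = np.maximum(Op - sp, 0.0)
    an = np.maximum(sn - On, 0.0)

    logit_p = -gamma * ap * (sp - Dp)
    logit_n = gamma * an * (sn - Dn)

    def lse(x):  # numerically stable log-sum-exp
        c = x.max()
        return c + np.log(np.exp(x - c).sum())

    # L = log(1 + sum_j exp(logit_n_j) * sum_i exp(logit_p_i))
    return np.logaddexp(0.0, lse(logit_n) + lse(logit_p))
```

For a well-separated anchor (s_p near 1, s_n near 0) the loss is close to zero; as scores drift toward the decision boundary, their weights a_p, a_n grow and the loss rises sharply.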


Related research:

- Unified Negative Pair Generation toward Well-discriminative Feature Space for Face Recognition (03/22/2022): The goal of face recognition (FR) can be viewed as a pair similarity opt...
- Unified Loss of Pair Similarity Optimization for Vision-Language Retrieval (09/28/2022): There are two popular loss functions used for vision-language retrieval,...
- Distribution Context Aware Loss for Person Re-identification (11/17/2019): To learn the optimal similarity function between probe and gallery image...
- Instance Similarity Deep Hashing for Multi-Label Image Retrieval (03/08/2018): Hash coding has been widely used in the approximate nearest neighbor sea...
- Exploiting Class Similarity for Machine Learning with Confidence Labels and Projective Loss Functions (03/25/2021): Class labels used for machine learning are relatable to each other, with...
- Multi-Similarity Loss with General Pair Weighting for Deep Metric Learning (04/14/2019): A family of loss functions built on pair-based computation have been pro...
- Learning Deep Embeddings with Histogram Loss (11/02/2016): We suggest a loss for learning deep embeddings. The new loss does not in...
