von Mises-Fisher Loss: An Exploration of Embedding Geometries for Supervised Learning

03/29/2021
by Tyler R. Scott, et al.

Recent work has argued that softmax cross-entropy classification losses are superior not only for fixed-set classification tasks but also for open-set tasks such as few-shot learning and retrieval, outperforming losses developed specifically for those settings. Softmax classifiers have been studied with different embedding geometries (Euclidean, hyperbolic, and spherical), and claims have been made about the superiority of one geometry over another, but the geometries have not been systematically compared with careful controls. We conduct an empirical investigation of embedding geometry in softmax losses across a variety of fixed-set classification and image-retrieval tasks. An interesting property observed for the spherical losses led us to propose a probabilistic classifier based on the von Mises-Fisher distribution, and we show that it is competitive with state-of-the-art methods while producing improved out-of-the-box calibration. We provide guidance on the trade-offs between the losses and how to choose among them.


Related research

04/03/2019
Hyperbolic Image Embeddings
Computer vision tasks such as image classification, image retrieval and ...

10/30/2022
Learning to Defer to Multiple Experts: Consistent Surrogate Losses, Confidence Calibration, and Conformal Ensembles
We study the statistical properties of learning to defer (L2D) to multip...

02/27/2018
Directional Statistics-based Deep Metric Learning for Image Classification and Retrieval
Deep distance metric learning (DDML), which is proposed to learn image s...

02/17/2021
Dissecting Supervised Contrastive Learning
Minimizing cross-entropy over the softmax scores of a linear map compose...

07/27/2021
Energy-Based Open-World Uncertainty Modeling for Confidence Calibration
Confidence calibration is of great importance to the reliability of deci...

10/10/2018
Complementary-Label Learning for Arbitrary Losses and Models
In contrast to the standard classification paradigm where the true (or p...

07/30/2020
Trade-offs in Top-k Classification Accuracies on Losses for Deep Learning
This paper presents an experimental analysis about trade-offs in top-k c...
