von Mises-Fisher Loss: An Exploration of Embedding Geometries for Supervised Learning

by Tyler R. Scott, et al.

Recent work has argued that classification losses utilizing softmax cross-entropy are superior not only for fixed-set classification tasks, but also outperform losses developed specifically for open-set tasks, including few-shot learning and retrieval. Softmax classifiers have been studied using different embedding geometries – Euclidean, hyperbolic, and spherical – and claims have been made about the superiority of one or another, but they have not been systematically compared with careful controls. We conduct an empirical investigation of embedding geometry on softmax losses for a variety of fixed-set classification and image retrieval tasks. An interesting property observed for the spherical losses led us to propose a probabilistic classifier based on the von Mises-Fisher distribution, and we show that it is competitive with state-of-the-art methods while producing improved out-of-the-box calibration. We provide guidance regarding the trade-offs between losses and how to choose among them.
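The von Mises-Fisher classifier named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes class posteriors of the form p(y | z) ∝ exp(κ · μ_y·z) with unit-norm embedding z and class mean directions μ_y; the function name `vmf_softmax` and the concentration parameter `kappa` are illustrative choices.

```python
import numpy as np

def vmf_softmax(embedding, class_means, kappa=10.0):
    """Class posteriors under a von Mises-Fisher-style model:
    p(y | z) proportional to exp(kappa * mu_y . z), where both the
    embedding z and each class mean mu_y are projected onto the unit
    sphere. `kappa` controls how peaked the distribution is; this
    sketch treats it as a fixed hyperparameter.
    """
    # Project embedding and class means onto the unit hypersphere.
    z = embedding / np.linalg.norm(embedding)
    mus = class_means / np.linalg.norm(class_means, axis=1, keepdims=True)
    # Logits are scaled cosine similarities to each class direction.
    logits = kappa * (mus @ z)
    # Standard log-sum-exp stabilization before normalizing.
    logits -= logits.max()
    p = np.exp(logits)
    return p / p.sum()
```

Because the logits are bounded cosine similarities scaled by a single concentration parameter, the resulting probabilities cannot become arbitrarily peaked, which is one intuition for the improved out-of-the-box calibration the abstract reports.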






