von Mises-Fisher Loss: An Exploration of Embedding Geometries for Supervised Learning

03/29/2021
by Tyler R. Scott, et al.

Recent work has argued that classification losses utilizing softmax cross-entropy are superior not only for fixed-set classification tasks, but also outperform losses developed specifically for open-set tasks, including few-shot learning and retrieval. Softmax classifiers have been studied using different embedding geometries – Euclidean, hyperbolic, and spherical – and claims have been made about the superiority of one or another, but they have not been systematically compared with careful controls. We conduct an empirical investigation of embedding geometry on softmax losses for a variety of fixed-set classification and image-retrieval tasks. An interesting property observed for the spherical losses led us to propose a probabilistic classifier based on the von Mises-Fisher distribution, and we show that it is competitive with state-of-the-art methods while producing improved out-of-the-box calibration. We provide guidance regarding the trade-offs between losses and how to choose among them.
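To make the spherical framing concrete, below is a minimal, hypothetical PyTorch sketch of a vMF-style classifier head. It is not the authors' exact formulation: it simply places learnable class mean directions on the unit hypersphere and scales cosine similarities by a shared concentration parameter kappa before standard softmax cross-entropy. The name SphericalVMFHead and all hyperparameters are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SphericalVMFHead(nn.Module):
        """Cosine-softmax head with a learnable concentration parameter (kappa)."""

        def __init__(self, embed_dim: int, num_classes: int, init_kappa: float = 16.0):
            super().__init__()
            # Class "mean directions"; normalized onto the unit hypersphere in forward().
            self.class_dirs = nn.Parameter(torch.randn(num_classes, embed_dim))
            # Shared concentration parameter; larger kappa gives sharper posteriors.
            self.log_kappa = nn.Parameter(torch.tensor(init_kappa).log())

        def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
            # Project both embeddings and class directions onto the unit sphere.
            z = F.normalize(embeddings, dim=-1)
            mu = F.normalize(self.class_dirs, dim=-1)
            kappa = self.log_kappa.exp()
            # Logits are kappa-scaled cosine similarities.
            return kappa * (z @ mu.t())

    # Usage: cross-entropy over the kappa-scaled cosine logits.
    head = SphericalVMFHead(embed_dim=128, num_classes=10)
    z = torch.randn(32, 128)               # batch of embeddings from a backbone
    labels = torch.randint(0, 10, (32,))
    loss = F.cross_entropy(head(z), labels)

This sketch captures only the geometric idea (unit-norm embeddings, directional class prototypes, a concentration that controls calibration sharpness); the paper's full probabilistic treatment of the von Mises-Fisher likelihood is described in the text itself.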


