
The Fisher-Rao Loss for Learning under Label Noise

by Henrique K. Miyamoto, et al., University of Campinas

Choosing a suitable loss function is essential when learning by empirical risk minimisation. In many practical cases, the datasets used for training a classifier may contain incorrect labels, which motivates the use of loss functions that are inherently robust to label noise. In this paper, we study the Fisher-Rao loss function, which emerges from the Fisher-Rao distance in the statistical manifold of discrete distributions. We derive an upper bound for the performance degradation in the presence of label noise, and analyse the learning speed of this loss. Comparing with other commonly used losses, we argue that the Fisher-Rao loss provides a natural trade-off between robustness and training dynamics. Numerical experiments with synthetic and MNIST datasets illustrate this performance.
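To make the construction concrete, the Fisher-Rao distance between two discrete distributions has a well-known closed form on the probability simplex, d(p, q) = 2 arccos(Σ_i √(p_i q_i)); against a one-hot label it reduces to 2 arccos(√(p_y)). The sketch below, which is an illustration of that closed form rather than the authors' implementation, shows how such a loss could be computed:

```python
import math

def fisher_rao_distance(p, q):
    """Fisher-Rao distance between discrete distributions p and q.

    Closed form on the probability simplex:
    d(p, q) = 2 * arccos(sum_i sqrt(p_i * q_i)).
    """
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    # Clamp the Bhattacharyya coefficient to guard against
    # floating-point drift outside acos's domain [-1, 1].
    return 2.0 * math.acos(min(1.0, max(-1.0, bc)))

def fisher_rao_loss(probs, label):
    """Loss against a one-hot target: reduces to 2 * arccos(sqrt(p_label))."""
    return 2.0 * math.acos(min(1.0, math.sqrt(probs[label])))

# A confident correct prediction yields a small loss; the loss is
# bounded above by pi (attained as the true-class probability -> 0),
# which is one source of its robustness to mislabelled examples.
p = [0.8, 0.1, 0.1]
print(fisher_rao_loss(p, 0))  # ~0.927
```

Note the loss is bounded, unlike cross-entropy, which diverges as the predicted probability of the labelled class goes to zero; boundedness is a standard ingredient in noise-robust losses.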



Related research:

- Robust Loss Functions under Label Noise for Deep Neural Networks: In many applications of classifier learning, training data suffers from ...
- An Exploration into why Output Regularization Mitigates Label Noise: Label noise presents a real challenge for supervised learning algorithms...
- Learning Not to Learn in the Presence of Noisy Labels: Learning in the presence of label noise is a challenging yet important t...
- Classification with Noisy Labels by Importance Reweighting: In this paper, we study a classification problem in which sample labels ...
- A Study of Deep CNN Model with Labeling Noise Based on Granular-ball Computing: In supervised learning, the presence of noise can have a significant imp...
- Noise tolerance of learning to rank under class-conditional label noise: Often, the data used to train ranking models is subject to label noise. ...
- Probabilistic orientation estimation with matrix Fisher distributions: This paper focuses on estimating probability distributions over the set ...