On Symmetric Losses for Learning from Corrupted Labels

01/27/2019
by Nontawat Charoenphakdee, et al.

This paper aims to provide a better understanding of symmetric losses. First, we show that using a symmetric loss is advantageous for balanced error rate (BER) minimization and area under the receiver operating characteristic curve (AUC) maximization from corrupted labels. Second, we prove general theoretical properties of symmetric losses, including a classification-calibration condition, an excess risk bound, the conditional risk minimizer, and an AUC-consistency condition. Third, since all nonnegative symmetric losses are non-convex, we propose a convex barrier hinge loss that benefits significantly from the symmetric condition, although it is not symmetric everywhere. Finally, we conduct experiments on BER and AUC optimization from corrupted labels to validate the relevance of the symmetric condition.
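
In this line of work, a margin-based loss ℓ is called symmetric when ℓ(z) + ℓ(−z) equals the same constant for every margin z; the sigmoid loss satisfies this with constant 1, while the convex hinge loss does not. The short Python sketch below (illustrative only, not taken from the paper or its code) checks that condition numerically for both losses.

    import numpy as np

    def sigmoid_loss(z):
        # Sigmoid loss: a standard nonnegative symmetric loss.
        return 1.0 / (1.0 + np.exp(z))

    def hinge_loss(z):
        # Hinge loss: convex, but it violates the symmetric condition.
        return np.maximum(0.0, 1.0 - z)

    def symmetry_gap(loss, margins):
        # A loss ell is symmetric if ell(z) + ell(-z) is the same constant K
        # for every margin z; the gap below is 0 exactly in that case.
        sums = loss(margins) + loss(-margins)
        return float(sums.max() - sums.min())

    margins = np.linspace(-5.0, 5.0, 1001)
    print("sigmoid loss gap:", symmetry_gap(sigmoid_loss, margins))  # approx. 0 (K = 1)
    print("hinge loss gap:  ", symmetry_gap(hinge_loss, margins))    # clearly > 0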

