Loss Functions for Classification using Structured Entropy

06/14/2022
by Brian Lucena, et al.

Cross-entropy loss is the standard metric used to train classification models in deep learning and gradient boosting. It is well-known that this loss function fails to account for similarities between the different values of the target. We propose a generalization of entropy called structured entropy which uses a random partition to incorporate the structure of the target variable in a manner which retains many theoretical properties of standard entropy. We show that a structured cross-entropy loss yields better results on several classification problems where the target variable has an a priori known structure. The approach is simple, flexible, easily computable, and does not rely on a hierarchically defined notion of structure.
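The paper's exact construction is not reproduced on this page, but the idea lends itself to a short sketch. The snippet below is a minimal, illustrative interpretation in PyTorch: it assumes that a structured cross-entropy can be approximated by averaging ordinary cross-entropy over a small collection of coarsenings (partitions) of the label set. The partition list, the per-partition weights, and the helper names (`coarsen`, `structured_cross_entropy`) are hypothetical and not taken from the paper.

```python
# Illustrative sketch only, not the authors' reference implementation.
import torch
import torch.nn.functional as F

def coarsen(probs: torch.Tensor, partition: list) -> torch.Tensor:
    """Sum class probabilities within each block of the partition."""
    return torch.stack([probs[:, block].sum(dim=1) for block in partition], dim=1)

def structured_cross_entropy(logits: torch.Tensor,
                             targets: torch.Tensor,
                             partitions: list,
                             weights: list) -> torch.Tensor:
    """Weighted average of cross-entropy losses on coarsened label spaces.

    logits:     (batch, n_classes) raw model outputs
    targets:    (batch,) integer class labels
    partitions: each partition is a list of blocks, each block a list of class indices
    weights:    one non-negative weight per partition
    """
    probs = F.softmax(logits, dim=1)
    loss = torch.zeros((), device=logits.device)
    for partition, w in zip(partitions, weights):
        block_probs = coarsen(probs, partition).clamp_min(1e-12)
        # Map each fine-grained label to the index of the block containing it.
        block_of = torch.empty(probs.shape[1], dtype=torch.long, device=logits.device)
        for b, block in enumerate(partition):
            block_of[block] = b
        coarse_targets = block_of[targets]
        loss = loss + w * F.nll_loss(block_probs.log(), coarse_targets)
    return loss / sum(weights)

# Example: 4 classes where {0, 1} and {2, 3} are semantically similar.
logits = torch.randn(8, 4, requires_grad=True)
targets = torch.randint(0, 4, (8,))
partitions = [[[0], [1], [2], [3]],   # finest partition (standard cross-entropy)
              [[0, 1], [2, 3]]]       # coarse grouping of similar classes
loss = structured_cross_entropy(logits, targets, partitions, weights=[1.0, 1.0])
loss.backward()
```

Under this interpretation, using only the finest partition recovers standard cross-entropy, while adding a coarse partition over semantically similar classes penalizes confusions across groups more heavily than confusions within them.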

Related research

10/11/2018 · Taming the Cross Entropy Loss
We present the Tamed Cross Entropy (TCE) loss function, a robust derivat...

10/29/2022 · Reformulating van Rijsbergen's F_β metric for weighted binary cross-entropy
The separation of performance metrics from gradient based loss functions...

04/16/2022 · The Tree Loss: Improving Generalization with Many Classes
Multi-class classification problems often have many semantically similar...

07/16/2020 · Amended Cross Entropy Cost: Framework For Explicit Diversity Encouragement
Cross Entropy (CE) has an important role in machine learning and, in par...

11/10/2020 · Uses and Abuses of the Cross-Entropy Loss: Case Studies in Modern Deep Learning
Modern deep learning is primarily an experimental science, in which empi...

04/26/2022 · PolyLoss: A Polynomial Expansion Perspective of Classification Loss Functions
Cross-entropy loss and focal loss are the most common choices when train...

09/22/2020 · Role of Orthogonality Constraints in Improving Properties of Deep Networks for Image Classification
Standard deep learning models that employ the categorical cross-entropy ...
