Amended Cross Entropy Cost: Framework For Explicit Diversity Encouragement

07/16/2020
by   Ron Shoham, et al.

Cross Entropy (CE) plays an important role in machine learning and, in particular, in neural networks, where it is commonly used as the cost between the known label distribution and the Softmax/Sigmoid output. In this paper we present a new cost function called the Amended Cross Entropy (ACE). Its novelty lies in enabling the training of multiple classifiers while explicitly controlling the diversity between them. We derived the new cost through mathematical analysis and "reverse engineering" of the way we wish the gradients to behave, producing a tailor-made, elegant and intuitive cost function that achieves the desired result. This process is analogous to the way the CE cost is chosen for Softmax/Sigmoid classifiers in order to obtain linear derivatives. By choosing the optimal diversity factor we produce an ensemble that yields better results than the vanilla one. We demonstrate two potential usages of this outcome and present empirical results. Our method works for classification problems analogously to Negative Correlation Learning (NCL) for regression problems.
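The abstract does not give the ACE formula itself, but a minimal sketch can illustrate the general idea of a diversity-controlled ensemble cost. The formulation below is an assumption modeled on NCL: each classifier pays its own cross entropy with the target, minus a diversity term (weighted by a hypothetical factor `gamma`) that penalizes agreement with the other classifiers' Softmax outputs. The function names and the exact averaging scheme are illustrative, not the paper's definition.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, q, eps=1e-12):
    # H(p, q) = -sum_k p_k * log(q_k); eps guards against log(0).
    return -np.sum(p * np.log(q + eps), axis=-1)

def ace_style_loss(logits_list, target, gamma=0.1):
    """Hypothetical ACE-style ensemble cost (NCL-like sketch, not the
    paper's exact formula): for each classifier i,
        H(target, q_i) - gamma/(M-1) * sum_{j != i} H(q_j, q_i),
    summed over the M classifiers. gamma = 0 recovers the plain sum
    of cross entropies; gamma > 0 rewards disagreement (diversity).
    """
    probs = [softmax(z) for z in logits_list]
    m = len(probs)
    total = 0.0
    for i, q_i in enumerate(probs):
        ce = cross_entropy(target, q_i)
        div = sum(cross_entropy(q_j, q_i)
                  for j, q_j in enumerate(probs) if j != i)
        total += ce - (gamma / (m - 1)) * div
    return total
```

Since the diversity term is non-negative, any `gamma > 0` lowers the cost relative to the vanilla sum of cross entropies, which is the sense in which the diversity factor trades off individual accuracy against ensemble disagreement.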


