On Expected Accuracy

05/01/2019
by Ozan Irsoy, et al.

We empirically investigate the (negative) expected accuracy as an alternative loss function to cross entropy (negative log likelihood) for classification tasks. Coupled with softmax activation, it has small derivatives over most of its domain, and is therefore hard to optimize. A modified, leaky version is evaluated on a variety of classification tasks, including digit recognition, image classification, sequence tagging and tree tagging, using a variety of neural architectures such as logistic regression, multilayer perceptron, CNN, LSTM and Tree-LSTM. We show that it yields accuracy comparable to or better than cross entropy. Furthermore, the proposed objective is shown to be more robust to label noise.
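For intuition, here is a minimal PyTorch sketch of the plain negative expected accuracy objective described above: the loss is minus the mean softmax probability assigned to the true class. The function name and toy data below are illustrative, not from the paper, and the leaky modification the authors evaluate is not detailed in this abstract, so it is omitted.

```python
import torch
import torch.nn.functional as F

def neg_expected_accuracy(logits, targets):
    """Negative expected accuracy: minus the mean softmax probability
    assigned to the true class. (Illustrative sketch, not the paper's code.)"""
    probs = F.softmax(logits, dim=-1)                          # (batch, num_classes)
    p_true = probs.gather(1, targets.unsqueeze(1)).squeeze(1)  # probability of the correct label
    return -p_true.mean()                                      # negate so minimizing maximizes expected accuracy

# Toy usage: drop-in replacement for F.cross_entropy in a training step.
logits = torch.randn(4, 10, requires_grad=True)   # batch of 4 examples, 10 classes
targets = torch.randint(0, 10, (4,))              # random true labels
loss = neg_expected_accuracy(logits, targets)
loss.backward()
```

Note that the gradient of -p_true with respect to the true-class logit is -p_true * (1 - p_true), which vanishes when the model is confidently right or confidently wrong; this is the small-derivative issue the abstract points to, and what motivates the leaky variant studied in the paper.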
