DeepAI

A Heaviside Function Approximation for Neural Network Binary Classification

09/02/2020
by   Nathan Tsoi, et al.

Neural network binary classifiers are often evaluated on metrics such as accuracy and F1 score, which are computed from confusion matrix values (True Positives, False Positives, False Negatives, and True Negatives). However, these classifiers are commonly trained with a different loss, e.g. log loss. While it is preferable to train on the same loss used as the evaluation metric, this is difficult for confusion-matrix-based metrics because set membership is a step function, which lacks a derivative useful for backpropagation. To address this challenge, we propose an approximation of the step function that satisfies the properties necessary for effective training of binary classification networks on confusion-matrix-based metrics. This approach allows end-to-end training of binary deep neural classifiers via batch gradient descent. We demonstrate the flexibility of this approach in several applications with varying levels of class imbalance. We also show how the approximation allows precision and recall to be balanced in the ratio appropriate for the task at hand.
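The core idea above can be sketched in a few lines: replace the hard step (Heaviside) function used for set membership with a smooth surrogate, so that each prediction contributes fractionally to the confusion matrix entries and a metric like F1 becomes differentiable. The sketch below uses a temperature-scaled sigmoid as the smooth surrogate; this is a common choice used here for illustration and is not necessarily the exact approximation proposed in the paper.

```python
import numpy as np

def soft_heaviside(x, tau=0.1):
    """Smooth approximation of the Heaviside step function.

    A temperature-scaled sigmoid: as tau -> 0 this approaches the
    hard step. (Illustrative surrogate, assumed for this sketch.)
    """
    return 1.0 / (1.0 + np.exp(-x / tau))

def soft_confusion_matrix(y_true, scores, tau=0.1):
    """Differentiable ("soft") confusion matrix entries.

    Each example contributes fractionally to TP/FP/FN/TN according
    to its soft membership, so the entries are smooth in `scores`.
    """
    p = soft_heaviside(scores, tau)
    tp = np.sum(p * y_true)
    fp = np.sum(p * (1.0 - y_true))
    fn = np.sum((1.0 - p) * y_true)
    tn = np.sum((1.0 - p) * (1.0 - y_true))
    return tp, fp, fn, tn

def soft_f1_loss(y_true, scores, tau=0.1, eps=1e-8):
    """1 - soft F1, usable directly as a training loss."""
    tp, fp, fn, _ = soft_confusion_matrix(y_true, scores, tau)
    f1 = 2.0 * tp / (2.0 * tp + fp + fn + eps)
    return 1.0 - f1
```

Because every operation is smooth, gradients of this loss with respect to the network's raw scores exist everywhere, which is what makes end-to-end training on a confusion-matrix-based metric possible. Weighting the `fp` and `fn` terms differently in the soft F1 would tilt the trade-off toward precision or recall, in the spirit of the balancing the abstract describes.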

