A unifying approach on bias and variance analysis for classification

01/05/2021
by   Cemre Zor, et al.

Standard bias and variance (B&V) terminologies were originally defined for the regression setting, and their extensions to classification have led to several different models/definitions in the literature. In this paper, we aim to provide the link between the commonly used frameworks of Tumer & Ghosh (T&G) and James. By unifying the two approaches, we relate the B&V defined for the 0/1 loss to the standard B&V of the boundary distributions given for the squared error loss. The closed-form relationships provide a deeper understanding of classification performance, and their use is demonstrated in two case studies.
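The squared-error decomposition that the paper takes as its starting point can be illustrated with a small Monte Carlo sketch. This is not the authors' method, just the standard regression identity MSE = bias² + variance (for noise-free targets), estimated from simulated "training sets"; the toy estimator and its constant offset are hypothetical choices made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

TRUE_VALUE = np.sin(1.0)  # target f(x0) at a fixed query point x0 = 1.0


def sample_estimator(rng):
    """Hypothetical estimator output for one random training set:
    a constant offset (0.2) induces bias, a per-dataset random
    shift induces variance."""
    shift = rng.normal(0.0, 0.3)
    return TRUE_VALUE + 0.2 + shift


# Draw the estimator's prediction over many simulated training sets.
preds = np.array([sample_estimator(rng) for _ in range(100_000)])

bias = preds.mean() - TRUE_VALUE            # E[f_hat] - f
variance = preds.var()                      # E[(f_hat - E[f_hat])^2]
mse = ((preds - TRUE_VALUE) ** 2).mean()    # E[(f_hat - f)^2]

# For sample moments this decomposition holds exactly:
print(f"bias^2 + variance = {bias**2 + variance:.6f}, MSE = {mse:.6f}")
```

The paper's contribution is precisely that this clean additive picture does not carry over directly to the 0/1 loss, where several competing definitions of classification bias and variance exist; the T&G and James frameworks are two ways of bridging that gap.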


Related research

02/08/2022
Understanding the bias-variance tradeoff of Bregman divergences
This paper builds upon the work of Pfau (2013), which generalized the bi...

06/21/2020
Asymptotic properties of Bernstein estimators on the simplex. Part 2: the boundary case
In this paper, we study the asymptotic properties (bias, variance, mean ...

05/29/2023
On the Variance, Admissibility, and Stability of Empirical Risk Minimization
It is well known that Empirical Risk Minimization (ERM) with squared los...

10/24/2021
Learning to Estimate Without Bias
We consider the use of deep learning for parameter estimation. We propos...

08/01/2020
Vulnerability Under Adversarial Machine Learning: Bias or Variance?
Prior studies have unveiled the vulnerability of the deep neural network...

09/24/2011
Bias Plus Variance Decomposition for Survival Analysis Problems
Bias-variance decomposition of the expected error defined for regressi...

06/29/2018
LTL Store: Repository of LTL formulae from literature and case studies
This continuously extended technical report collects and compares common...
