On the relation between accuracy and fairness in binary classification

05/21/2015
by Indre Zliobaite, et al.

Our study revisits the problem of the accuracy-fairness tradeoff in binary classification. We argue that comparisons of non-discriminatory classifiers need to account for different rates of positive predictions; otherwise, conclusions about performance may be misleading, because the accuracy and discrimination of naive baselines on the same dataset vary with the rate of positive predictions. We provide methodological recommendations for the sound comparison of non-discriminatory classifiers, and present a brief theoretical and empirical analysis of the tradeoffs between accuracy and non-discrimination.
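
To illustrate the point about naive baselines, the sketch below (not the paper's code; the synthetic data and all names are illustrative assumptions) evaluates a simple score-thresholding baseline at several rates of positive predictions. It reports accuracy and one common discrimination measure, the difference in positive prediction rates between two groups. Both numbers shift as the positive rate changes, which is why comparing classifiers with different positive rates on raw accuracy and discrimination alone can mislead.

```python
# Minimal sketch (illustrative only): how accuracy and a simple discrimination
# measure of a naive thresholding baseline change with the positive prediction rate.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic data: binary protected group and a score shifted per group.
group = rng.integers(0, 2, size=n)              # 0 = protected, 1 = favoured
score = rng.normal(loc=0.5 * group, scale=1.0)  # favoured group scores higher
label = (score + rng.normal(scale=1.0, size=n) > 0.5).astype(int)

def evaluate(pos_rate: float):
    """Predict positive for the top `pos_rate` fraction of instances by score."""
    threshold = np.quantile(score, 1.0 - pos_rate)
    pred = (score > threshold).astype(int)
    accuracy = (pred == label).mean()
    # Statistical parity difference: gap in positive prediction rates per group.
    discrimination = pred[group == 1].mean() - pred[group == 0].mean()
    return accuracy, discrimination

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    acc, disc = evaluate(p)
    print(f"positive rate {p:.1f}: accuracy {acc:.3f}, discrimination {disc:.3f}")
```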

Related research

10/21/2020
How to Control the Error Rates of Binary Classifiers
The traditional binary classification framework constructs classifiers w...

06/30/2017
Learning Fair Classifiers: A Regularization-Inspired Approach
We present a regularization-inspired approach for reducing bias in learn...

02/25/2013
Phoneme discrimination using KS-algebra II
KS-algebra consists of expressions constructed with four kinds operation...

08/23/2020
A critical assessment of conformal prediction methods applied in binary classification settings
In recent years there has been an increase in the number of scientific p...

07/27/2021
Statistical Guarantees for Fairness Aware Plug-In Algorithms
A plug-in algorithm to estimate Bayes Optimal Classifiers for fairness-a...

05/30/2018
Why Is My Classifier Discriminatory?
Recent attempts to achieve fairness in predictive models focus on the ba...
