Equalizing Financial Impact in Supervised Learning

06/24/2018
by Govind Ramnarayan, et al.

Notions of "fair classification" that have arisen in computer science generally revolve around equalizing certain statistics across protected groups. This approach has been criticized for ignoring societal issues, including how errors can hurt certain groups disproportionately. We propose a modification of one of the fairness criteria from Hardt, Price, and Srebro [NIPS, 2016] that makes a small step towards addressing this issue in the case of financial decisions like giving loans. We call this new notion "equalized financial impact."
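To make the baseline concrete: the criterion from Hardt, Price, and Srebro ("equalized odds") asks that a classifier's false positive and false negative rates match across protected groups. The abstract does not state the paper's exact definition of "equalized financial impact," so the sketch below only illustrates the kind of per-group error statistics such criteria compare; the toy loan data and the `group_rates` helper are assumptions for illustration, not the paper's method.

```python
# A minimal sketch of the per-group error rates that equalized-odds-style
# fairness criteria compare. Data and function names are hypothetical.

def group_rates(y_true, y_pred, groups):
    """Per-group false positive rate (fpr) and false negative rate (fnr).

    y_true: true outcomes (1 = would repay the loan)
    y_pred: classifier decisions (1 = loan approved)
    groups: protected-group label for each applicant
    """
    rates = {}
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        fp = sum(1 for i in idx if y_pred[i] == 1 and y_true[i] == 0)
        fn = sum(1 for i in idx if y_pred[i] == 0 and y_true[i] == 1)
        neg = sum(1 for i in idx if y_true[i] == 0)
        pos = sum(1 for i in idx if y_true[i] == 1)
        rates[g] = {
            "fpr": fp / neg if neg else 0.0,
            "fnr": fn / pos if pos else 0.0,
        }
    return rates

# Toy loan-decision data for two groups "a" and "b".
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 1, 0, 1, 0, 1, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

print(group_rates(y_true, y_pred, groups))
```

Equalized odds holds exactly when these rates coincide across groups; critiques like the one this paper responds to note that equal error *rates* can still translate into unequal financial *harm*, since a wrongful loan denial may cost one group far more than another.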


