Fairness Testing: Testing Software for Discrimination

09/11/2017
by Sainyam Galhotra, et al.

This paper defines software fairness and discrimination and develops a testing-based method for measuring if and how much software discriminates, focusing on causality in discriminatory behavior. Evidence of software discrimination has been found in modern software systems that recommend criminal sentences, grant access to financial products, and determine who is allowed to participate in promotions. Our approach, Themis, generates efficient test suites to measure discrimination. Given a schema describing valid system inputs, Themis generates discrimination tests automatically and does not require an oracle. We evaluate Themis on 20 software systems, 12 of which come from prior work with an explicit focus on avoiding discrimination. We find that (1) Themis is effective at discovering software discrimination, (2) state-of-the-art techniques for removing discrimination from algorithms fail in many situations, at times discriminating against as much as 98% of an input subdomain, (3) Themis optimizations are effective at producing efficient test suites for measuring discrimination, and (4) Themis is more efficient on systems that exhibit more discrimination. We thus demonstrate that fairness testing is a critical aspect of the software development cycle in domains with possible discrimination and provide initial tools for measuring software discrimination.
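The causal notion of discrimination the abstract describes can be sketched as a black-box test: draw random inputs from the schema, alter only a sensitive attribute while holding everything else fixed, and count how often the decision flips. The sketch below is a minimal illustration of that idea, not Themis's actual implementation; the schema, `loan_decision`, and all names are assumptions made up for the example.

```python
import random

# Hypothetical input schema: attribute name -> list of valid values.
SCHEMA = {
    "age": list(range(18, 90)),
    "income": list(range(0, 200_000, 1000)),
    "gender": ["male", "female"],
}

def causal_discrimination_score(decide, schema, sensitive, n_samples=1000, seed=0):
    """Estimate causal discrimination w.r.t. `sensitive`: the fraction of
    randomly generated inputs whose decision changes when only the
    sensitive attribute is altered (all other attributes held fixed)."""
    rng = random.Random(seed)
    flipped = 0
    for _ in range(n_samples):
        inp = {attr: rng.choice(vals) for attr, vals in schema.items()}
        baseline = decide(inp)
        # Check every alternative value of the sensitive attribute.
        if any(decide({**inp, sensitive: v}) != baseline
               for v in schema[sensitive] if v != inp[sensitive]):
            flipped += 1
    return flipped / n_samples

# Toy decision procedure that discriminates on gender for mid-range incomes.
def loan_decision(inp):
    if inp["income"] > 100_000:
        return True
    return inp["gender"] == "male" and inp["income"] > 50_000

score = causal_discrimination_score(loan_decision, SCHEMA, "gender")
```

Here the score approximates the probability that flipping `gender` alone changes the loan decision; a fairness-aware test suite would flag any score above a chosen threshold.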


Related papers

08/24/2022 | TESTSGD: Interpretable Testing of Neural Networks Against Subtle Group Discrimination
Discrimination has been shown in many machine learning applications, whi...

07/17/2021 | Automatic Fairness Testing of Neural Classifiers through Adversarial Sampling
Although deep learning has demonstrated astonishing performance in many ...

05/14/2019 | Software Engineering for Fairness: A Case Study with Hyperparameter Optimization
We assert that it is the ethical duty of software engineers to strive to...

07/20/2022 | Fairness Testing: A Comprehensive Survey and Analysis of Trends
Software systems are vulnerable to fairness bugs and frequently exhibit ...

09/17/2022 | Enhanced Fairness Testing via Generating Effective Initial Individual Discriminatory Instances
Fairness testing aims at mitigating unintended discrimination in the dec...

09/10/2018 | Automated Test Generation to Detect Individual Discrimination in AI Models
Dependability on AI models is of utmost importance to ensure full accept...

07/14/2020 | A Normative approach to Attest Digital Discrimination
Digital discrimination is a form of discrimination whereby users are aut...