Differentially Private and Fair Deep Learning: A Lagrangian Dual Approach

09/26/2020
by Cuong Tran, et al.

A critical concern in data-driven decision making is to build models whose outcomes do not discriminate against demographic groups defined by sensitive attributes such as gender, ethnicity, or age. Ensuring non-discrimination in learning tasks typically requires knowledge of these sensitive attributes, yet in practice they may be unavailable due to legal and ethical requirements. To address this challenge, this paper studies a model that protects the privacy of individuals' sensitive attributes while still learning non-discriminatory predictors. The method relies on the notion of differential privacy and on Lagrangian duality to design neural networks that accommodate fairness constraints while guaranteeing the privacy of the sensitive attributes. The paper analyses the tension between accuracy, privacy, and fairness, and the experimental evaluation illustrates the benefits of the proposed model on several prediction tasks.
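To make the general idea concrete, below is a minimal PyTorch-style sketch (not the authors' implementation) of Lagrangian-dual training with a demographic-parity constraint, where the group-dependent fairness statistic is perturbed with Gaussian noise as a stand-in for a calibrated differentially private mechanism on the sensitive attribute. The synthetic data, network, slack alpha, and noise scale sigma are illustrative assumptions.

```python
# Minimal sketch: primal-dual (Lagrangian) training of a fair classifier where
# the sensitive attribute enters only through a noise-perturbed group statistic.
# The noise scale below is NOT calibrated to a formal (eps, delta)-DP guarantee.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic data: features x, labels y, binary sensitive attribute a (assumed).
n, d = 1000, 10
x = torch.randn(n, d)
a = (torch.rand(n) < 0.5).float()                  # sensitive attribute to protect
y = ((x[:, 0] + 0.5 * a + 0.1 * torch.randn(n)) > 0).float()

model = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
bce = nn.BCEWithLogitsLoss()

lam = torch.zeros(1)                               # Lagrange multiplier (>= 0)
primal_opt = torch.optim.Adam(model.parameters(), lr=1e-2)
dual_lr, sigma, alpha = 0.05, 0.5, 0.05            # dual step, noise scale, slack

for epoch in range(50):
    # Primal step: minimize prediction loss + lam * (noisy fairness violation).
    logits = model(x).squeeze(1)
    p = torch.sigmoid(logits)

    # Demographic-parity gap: difference in mean predicted positive rate by group.
    gap = p[a == 1].mean() - p[a == 0].mean()

    # Gaussian noise on the group-dependent statistic; a stand-in for the DP
    # mechanism that hides the contribution of any individual's sensitive value.
    noise = torch.randn(1).item() * sigma / n
    noisy_violation = gap.abs() - alpha + noise

    loss = bce(logits, y) + lam.item() * noisy_violation
    primal_opt.zero_grad()
    loss.backward()
    primal_opt.step()

    # Dual step: gradient ascent on the multiplier, projected onto lam >= 0.
    lam = torch.clamp(lam + dual_lr * noisy_violation.detach(), min=0.0)

print(f"final multiplier: {lam.item():.3f}")
```

The dual ascent step raises the multiplier whenever the (noisy) fairness constraint is violated, so the fairness penalty is tuned automatically rather than fixed by hand; only the noise calibration differs from an accuracy-fairness trade-off without privacy.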


Related research

10/17/2022  Stochastic Differentially Private and Fair Learning
Machine learning models are increasingly used in high-stakes decision-ma...

05/12/2022  Fair NLP Models with Differentially Private Text Encoders
Encoded text representations often capture sensitive attributes about in...

07/18/2022  On Fair Classification with Mostly Private Sensitive Attributes
Machine learning models have demonstrated promising performance in many ...

05/29/2019  Fair Decision Making using Privacy-Protected Data
Data collected about individuals is regularly used to make decisions tha...

12/07/2020  Improving Fairness and Privacy in Selection Problems
Supervised learning models have been increasingly used for making decisi...

10/18/2021  Fair Tree Learning
When dealing with sensitive data in automated data-driven decision-makin...

02/26/2020  Fair Learning with Private Demographic Data
Sensitive attributes such as race are rarely available to learners in re...
