The Impact of Differential Privacy on Group Disparity Mitigation

The performance cost of differential privacy has, for some applications, been shown to be higher for minority groups; fairness, conversely, has been shown to disproportionately compromise the privacy of members of such groups. Most work in this area has been restricted to computer vision and risk assessment. In this paper, we evaluate the impact of differential privacy on fairness across four tasks, focusing on how attempts to mitigate privacy violations and between-group performance differences interact: does privacy inhibit attempts to ensure fairness? To this end, we train (ε, δ)-differentially private models with empirical risk minimization and group distributionally robust training objectives. Consistent with previous work, we find that differential privacy increases between-group performance differences in the baseline setting; more interestingly, however, it reduces between-group performance differences in the robust setting. We explain this by reinterpreting differential privacy as a form of regularization.
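
The abstract describes training (ε, δ)-differentially private models under both standard empirical risk minimization and group distributionally robust (group-DRO) objectives. The sketch below is not the authors' code; it is a minimal illustration, assuming PyTorch, of how DP-SGD (per-example gradient clipping plus Gaussian noise) can be combined with the exponentiated-gradient group weighting of group DRO (Sagawa et al., 2020). The function name dp_group_dro_step, the toy hyperparameters, and the way the group weight multiplies each example's loss are all assumptions for illustration, and the group-loss bookkeeping shown here is not itself privatized.

```python
import torch

def dp_group_dro_step(model, batch_x, batch_y, batch_g, q, opt,
                      clip_norm=1.0, noise_mult=1.0, eta_q=0.01):
    """One DP-SGD step with group-DRO example weights (illustrative sketch).

    q       : tensor of per-group weights on the simplex
    batch_g : integer group label for each example
    Per-example gradients are clipped to `clip_norm`; Gaussian noise with
    std `noise_mult * clip_norm` is added to their sum before averaging.
    """
    params = [p for p in model.parameters() if p.requires_grad]
    grad_sum = [torch.zeros_like(p) for p in params]
    group_losses = torch.zeros_like(q)

    for x, y, g in zip(batch_x, batch_y, batch_g):
        model.zero_grad()
        loss = torch.nn.functional.cross_entropy(
            model(x.unsqueeze(0)), y.unsqueeze(0))
        group_losses[g] += loss.detach()
        # Group-DRO: scale this example's gradient by its group weight.
        (q[g] * loss).backward()
        # Per-example clipping, required for the (ε, δ)-DP guarantee.
        norm = torch.sqrt(sum(p.grad.norm() ** 2 for p in params))
        scale = min(1.0, (clip_norm / (norm + 1e-12)).item())
        for s, p in zip(grad_sum, params):
            s.add_(p.grad, alpha=scale)

    # Add Gaussian noise to the summed, clipped gradients, then average.
    with torch.no_grad():
        for s, p in zip(grad_sum, params):
            noise = torch.normal(0.0, noise_mult * clip_norm, size=s.shape)
            p.grad = (s + noise) / len(batch_x)
    opt.step()

    # Exponentiated-gradient update of the group weights (worst-group focus).
    q = q * torch.exp(eta_q * group_losses)
    return q / q.sum()
```

A training loop would call this once per minibatch, carrying q across steps and initializing it uniformly (e.g. q = torch.ones(n_groups) / n_groups). The privacy accounting that converts the clipping and noise parameters into an (ε, δ) guarantee is omitted here.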


