Removing Disparate Impact of Differentially Private Stochastic Gradient Descent on Model Accuracy

03/08/2020
by   Depeng Xu, et al.

When we enforce differential privacy in machine learning, the utility-privacy trade-off differs across groups. Gradient clipping and random noise addition disproportionately affect underrepresented and complex classes and subgroups, resulting in unequal utility loss. In this work, we analyze the inequality in utility loss caused by differential privacy and propose a modified differentially private stochastic gradient descent (DPSGD), called DPSGD-F, to remove the potential disparate impact of differential privacy on the protected group. DPSGD-F adjusts the contribution of samples in a group based on the group's clipping bias so that differential privacy has no disparate impact on group utility. Our experimental evaluation shows how group sample size and group clipping bias affect the impact of differential privacy in DPSGD, and how adaptive clipping for each group helps to mitigate the disparate impact caused by differential privacy in DPSGD-F.
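As a rough illustration of the mechanism described in the abstract, the sketch below shows a per-sample-clipped, noise-added gradient step in the style of DPSGD, together with a hypothetical per-group clipping-bound adjustment driven by how often each group's gradients are clipped. The function names (dpsgd_step, group_clip_bounds) and the specific adjustment rule are illustrative assumptions, not the exact DPSGD-F algorithm from the paper.

    import numpy as np

    def dpsgd_step(per_sample_grads, clip_bound, noise_multiplier, lr, params, rng):
        """One DPSGD-style step: clip each per-sample gradient to L2 norm <= clip_bound,
        sum, add Gaussian noise scaled to the clipping bound, average, then update."""
        clipped = []
        for g in per_sample_grads:
            norm = np.linalg.norm(g)
            clipped.append(g * min(1.0, clip_bound / (norm + 1e-12)))
        summed = np.sum(clipped, axis=0)
        noise = rng.normal(0.0, noise_multiplier * clip_bound, size=summed.shape)
        noisy_mean = (summed + noise) / len(per_sample_grads)
        return params - lr * noisy_mean

    def group_clip_bounds(per_sample_grads, group_ids, base_bound):
        """Hypothetical per-group bound adjustment (not the paper's exact rule):
        groups whose gradients exceed the base bound more often (higher clipping
        bias) get a larger bound so their contribution is less attenuated."""
        bounds = {}
        for gid in set(group_ids):
            norms = [np.linalg.norm(g) for g, i in zip(per_sample_grads, group_ids) if i == gid]
            clip_frac = np.mean([n > base_bound for n in norms])  # fraction of clipped samples as a proxy for group clipping bias
            bounds[gid] = base_bound * (1.0 + clip_frac)
        return bounds

    # Toy usage with synthetic per-sample gradients and two groups.
    rng = np.random.default_rng(0)
    grads = [rng.normal(size=4) * (3.0 if i % 2 else 1.0) for i in range(8)]
    groups = [i % 2 for i in range(8)]
    params = np.zeros(4)
    print(group_clip_bounds(grads, groups, base_bound=1.0))
    print(dpsgd_step(grads, clip_bound=1.0, noise_multiplier=1.1, lr=0.1, params=params, rng=rng))

The per-group bounds could then be used in place of the single clip_bound when clipping samples from each group, which is the spirit (though not necessarily the letter) of the adaptive clipping the abstract describes.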


