Evaluating the Impact of Local Differential Privacy on Utility Loss via Influence Functions

09/15/2023
by Alycia N. Carey, et al.

How to properly set the privacy parameter in differential privacy (DP) has been an open question in DP research since DP was first proposed in 2006. In this work, we demonstrate the ability of influence functions to offer insight into how a specific privacy-parameter value will affect a model's test loss in the randomized-response-based local DP setting. Our proposed method allows a data curator to select the privacy parameter best aligned with their allowed privacy-utility trade-off without heavy computation such as extensive model retraining and data privatization. We consider several common randomization scenarios, such as performing randomized response over the features and/or the labels, as well as the more complex case of applying a class-dependent label-noise correction method to offset the noise incurred by randomization. Further, we provide a detailed discussion of the computational complexity of our proposed approach, including an empirical analysis. Through empirical evaluations, we show that for both binary and multi-class settings, influence functions approximate the true change in test loss incurred when randomized response is applied over features and/or labels with small mean absolute error, especially in cases where noise correction methods are applied.
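
As a concrete illustration of the two ingredients the abstract refers to, the sketch below implements k-ary randomized response under epsilon-local DP and a standard first-order influence-function estimate (in the style of Koh and Liang) of how replacing clean training labels with their privatized versions changes the average test loss, here for an L2-regularized logistic regression model. The model choice, function names such as `randomized_response` and `influence_delta_test_loss`, and the exact estimator are illustrative assumptions, not the paper's own code.

```python
# Minimal sketch (assumptions noted above): k-ary randomized response under
# epsilon-local DP, plus a first-order influence-function estimate of the
# resulting change in test loss for L2-regularized logistic regression.
import numpy as np


def randomized_response(labels, epsilon, k, rng=None):
    """Keep each label in {0, ..., k-1} with probability
    e^eps / (e^eps + k - 1); otherwise replace it with one of the other
    k - 1 values uniformly at random (satisfies epsilon-local DP)."""
    rng = rng or np.random.default_rng()
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    keep = rng.random(labels.shape) < p_keep
    shift = rng.integers(1, k, size=labels.shape)   # nonzero offset
    return np.where(keep, labels, (labels + shift) % k)


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def grad_loss(theta, x, y, lam):
    """Gradient of the regularized logistic loss at one point (y in {0, 1})."""
    return (sigmoid(x @ theta) - y) * x + lam * theta


def hessian(theta, X, lam):
    """Hessian of the average regularized logistic loss over the training set."""
    p = sigmoid(X @ theta)
    w = p * (1 - p)
    return (X * w[:, None]).T @ X / len(X) + lam * np.eye(X.shape[1])


def influence_delta_test_loss(theta, X_tr, y_tr, y_priv, X_te, y_te, lam=1e-2):
    """First-order influence-function estimate of the change in average test
    loss when the clean training labels y_tr are replaced by y_priv, with
    theta the minimizer of the regularized loss on the clean data:

        delta_theta ~= -(1/n) H^{-1} sum_i [grad(x_i, y_priv_i) - grad(x_i, y_i)]
        delta_test_loss ~= mean_test grad(x_test, y_test)^T delta_theta
    """
    n = len(X_tr)
    H_inv = np.linalg.inv(hessian(theta, X_tr, lam))
    grad_diff = sum(grad_loss(theta, x, yp, lam) - grad_loss(theta, x, yc, lam)
                    for x, yp, yc in zip(X_tr, y_priv, y_tr))
    delta_theta = -H_inv @ grad_diff / n
    # Test loss is taken without the regularization term.
    test_grads = np.array([grad_loss(theta, x, y, 0.0)
                           for x, y in zip(X_te, y_te)])
    return float(test_grads.mean(axis=0) @ delta_theta)
```

With theta fit once on the clean training data, evaluating this estimate across a grid of epsilon values gives a rough picture of the privacy-utility trade-off without retraining the model or re-privatizing the data for each candidate parameter, which is the use case the abstract describes.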

