Differentially Private Propensity Scores for Bias Correction

10/05/2022
by Liangwei Chen, et al.

In surveys, it is typically up to the individuals to decide whether to participate, which leads to participation bias: the individuals willing to share their data may not be representative of the entire population. Similarly, there are cases where one has no direct access to any data of the target population and must resort to publicly available proxy data sampled from a different distribution. In this paper, we present Differentially Private Propensity Scores for Bias Correction (DiPPS), a method for approximating the true data distribution of interest in both of the above settings. We assume that the data analyst has access to a dataset D̃ that was sampled from the distribution of interest in a biased way. Since individuals may be more willing to share their data when given a privacy guarantee, we further assume that the analyst is allowed locally differentially private access to a set of samples D from the true, unbiased distribution. Each data point in the private, unbiased dataset D is mapped to a probability distribution over clusters (learned from the biased dataset D̃), from which a single cluster is sampled via the exponential mechanism and shared with the data analyst. The analyst thereby gathers a distribution over clusters, from which they compute propensity scores for the points in the biased dataset D̃; these scores are in turn used to reweight the points in D̃ to approximate the true data distribution. Any function can then be computed on the resulting reweighted dataset without further access to the private D. In experiments on datasets from various domains, we show that DiPPS successfully brings the distribution of the available dataset closer to the distribution of interest in terms of Wasserstein distance. We further show that this results in improved estimates for different statistics.
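The pipeline described above — learn clusters from the biased data, let each private point report one cluster through the exponential mechanism, then reweight the biased points by the ratio of cluster frequencies — can be sketched on a one-dimensional toy problem. This is a minimal illustration under stated assumptions, not the paper's implementation: the quantile-based "clusters", the privacy parameter ε = 4, the negative-distance utility, and the unit-sensitivity normalization are all choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: biased proxy data D_tilde vs. private, unbiased samples D.
D_tilde = rng.normal(loc=1.0, scale=1.0, size=500)  # biased, public
D_priv = rng.normal(loc=0.0, scale=1.0, size=200)   # unbiased, private

# 1. "Clusters" learned from the biased data; quantile centers stand in
#    for a proper clustering algorithm in this sketch.
centers = np.quantile(D_tilde, [0.1, 0.3, 0.5, 0.7, 0.9])
k = len(centers)

# 2. Each private point reports a single cluster via the exponential
#    mechanism: utility = negative distance to the cluster center,
#    with sensitivity treated as 1 for illustration.
eps = 4.0
dists = np.abs(D_priv[:, None] - centers[None, :])
logits = -eps * dists / 2
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)
reported = np.array([rng.choice(k, p=p) for p in probs])

# 3. Analyst's estimate of the cluster distribution under the true data,
#    and the cluster distribution of the biased data.
p_true = np.bincount(reported, minlength=k) / len(reported)
labels_tilde = np.abs(D_tilde[:, None] - centers[None, :]).argmin(axis=1)
p_biased = np.bincount(labels_tilde, minlength=k) / len(D_tilde)

# 4. Propensity-style weights shift the biased sample toward the true one.
weights = p_true[labels_tilde] / np.maximum(p_biased[labels_tilde], 1e-12)

# 5. Downstream statistics use only the reweighted biased dataset.
mean_unweighted = D_tilde.mean()
mean_reweighted = np.average(D_tilde, weights=weights)
```

After reweighting, the mean of the biased sample (centered near 1) is pulled toward the true mean (0), mirroring the Wasserstein-distance improvement the abstract reports; no further access to the private D is needed once the cluster counts have been collected.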


