Identification and Formal Privacy Guarantees

06/25/2020
by Tatiana Komarova, et al.

Empirical economic research crucially relies on highly sensitive individual-level datasets. At the same time, the increasing availability of public individual-level data makes it possible for adversaries to de-identify anonymised records in sensitive research datasets. This growing disclosure risk has incentivised large data curators, most notably the US Census Bureau and several large companies, including Apple, Facebook and Microsoft, to look for algorithmic solutions that provide formal non-disclosure guarantees for the data they hold. The most widely accepted formal data security concept in the computer science community is differential privacy. It restricts researchers' interaction with the data: researchers may only issue queries, and the differential privacy mechanism replaces the actual outcome of each query with a randomised one. While differential privacy does provide formal data security guarantees, its impact on the identification of empirical economic models, and on the performance of estimators in those models, has not been sufficiently studied. Since privacy-protection mechanisms are inherently finite-sample procedures, we define identifiability of the parameter of interest as a property of the limit of experiments, linked to the asymptotic behaviour in measure of differentially private estimators. We demonstrate that particular instances of regression discontinuity design and average treatment effect estimation may be problematic for inference under differential privacy: their estimators can only be guaranteed to converge weakly, with the asymptotic limit remaining random, and thus the parameters may not be estimated consistently. Our simulation evidence clearly supports this result. Our analysis suggests that many other estimators that rely on nuisance parameters may exhibit similar behaviour under the requirement of differential privacy.
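To make the query-randomisation idea concrete, below is a minimal Python sketch of the standard Laplace mechanism, one common way to release a noisy query outcome under differential privacy. It is an illustration of the general concept rather than the specific mechanism analysed in the paper; the income data, clipping bounds, and epsilon value are invented for the example.

import numpy as np

def laplace_mechanism(query_value, sensitivity, epsilon, rng=None):
    # Release query_value with epsilon-differential privacy by adding
    # Laplace noise scaled to the query's global sensitivity, i.e. the
    # maximum change in the query across datasets that differ in one
    # individual's record.
    rng = rng if rng is not None else np.random.default_rng()
    return query_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Illustrative query: the mean of incomes clipped to [0, 100000].
incomes = np.array([32000.0, 54000.0, 87000.0, 23000.0, 61000.0])
clipped = np.clip(incomes, 0.0, 100000.0)
# The global sensitivity of a clipped mean over n records is (hi - lo) / n.
sensitivity = (100000.0 - 0.0) / len(clipped)
private_mean = laplace_mechanism(clipped.mean(), sensitivity, epsilon=1.0)
print(f"true mean: {clipped.mean():.0f}, private release: {private_mean:.0f}")

The researcher only ever sees the randomised release, never the underlying records; the paper's point is that this finite-sample randomisation can propagate into the asymptotic behaviour of downstream estimators.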
