Privacy-Preserving Synthetic Datasets Over Weakly Constrained Domains

08/23/2018
by Luke Rodriguez, et al.

Techniques to deliver privacy-preserving synthetic datasets take a sensitive dataset as input and produce a similar dataset as output while maintaining differential privacy. These approaches have the potential to improve data sharing and reuse, but they must be accessible to non-experts and tolerant of realistic data. Existing approaches implicitly assume that the active domain of the dataset is similar to the global domain, an assumption that can violate differential privacy. In this paper, we present an algorithm for generating differentially private synthetic data over the large, weakly constrained domains found in realistic open data situations. Our algorithm models the unrepresented domain analytically as a probability distribution to adjust the output and compute noise, avoiding the need to enumerate the full domain explicitly. We formulate the tradeoff between privacy and utility in terms of a "tolerance for randomness" parameter that users can set without inspecting the data. Finally, we show that the algorithm produces sensible results on real datasets.
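To make the active-domain pitfall concrete, here is a minimal sketch (not the paper's algorithm) of a differentially private histogram release using the standard Laplace mechanism. The key point it illustrates: noise must be added to the count of every value in the global domain, including values absent from the data, because noising only the values that actually appear reveals which values occur at all. The function name, domain, and example values are illustrative assumptions.

```python
import numpy as np

def dp_histogram(records, domain, epsilon, rng=None):
    """Release a differentially private histogram over the *global* domain.

    Illustrative sketch: adding Laplace(1/epsilon) noise to every domain
    value's count, including values with a true count of zero, is what
    makes the release satisfy epsilon-differential privacy. Noising only
    the "active domain" (values present in the data) leaks membership.
    """
    rng = rng or np.random.default_rng()
    counts = {v: 0 for v in domain}            # zero counts for unseen values too
    for r in records:
        counts[r] += 1
    scale = 1.0 / epsilon                      # a histogram has sensitivity 1
    return {v: c + rng.laplace(0.0, scale) for v, c in counts.items()}

# Active domain is just {"flu"}, but the (hypothetical) global domain is larger:
noisy = dp_histogram(["flu", "flu"], domain=["flu", "cold", "covid"], epsilon=1.0)
```

When the global domain is very large or only weakly constrained, materializing `counts` over it is infeasible, which is the gap the paper's analytic modeling of the unrepresented domain is meant to close.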


research 12/10/2019
Privacy-preserving data sharing via probabilistic modelling
Differential privacy allows quantifying privacy loss from computations o...

research 07/04/2023
Approximate, Adapt, Anonymize (3A): a Framework for Privacy Preserving Training Data Release for Machine Learning
The availability of large amounts of informative data is crucial for suc...

research 05/28/2022
MC-GEN: Multi-level Clustering for Private Synthetic Data Generation
Nowadays, machine learning is one of the most common technology to turn ...

research 12/13/2022
Considerations for Differentially Private Learning with Large-Scale Public Pretraining
The performance of differentially private machine learning can be booste...

research 08/21/2020
Privacy Preserving Recalibration under Domain Shift
Classifiers deployed in high-stakes real-world applications must output ...

research 07/13/2023
To share or not to share: What risks would laypeople accept to give sensitive data to differentially-private NLP systems?
Although the NLP community has adopted central differential privacy as a...

research 09/22/2018
Understanding Tor Usage with Privacy-Preserving Measurement
The Tor anonymity network is difficult to measure because, if not done c...
