Differentially Private Data Publication with Multi-level Data Utility

12/13/2021
by   Honglu Jiang, et al.

Conventional private data publication mechanisms aim to retain as much data utility as possible while ensuring sufficient privacy protection for sensitive data. Such schemes implicitly assume that all data analysts and users have the same data access privilege level. However, this assumption does not hold when data users have different levels of access to the same data or different data utility requirements. The multi-level privacy requirements of different authorization levels pose new challenges for private data publication. Traditional PPDP mechanisms publish only one perturbed, privacy-preserving copy of the data satisfying some privacy guarantee in order to provide relatively accurate analysis results. Finding a good tradeoff between the privacy-preservation level and data utility is itself a hard problem, let alone achieving multi-level data utility on top of it. In this paper, we address this challenge by proposing ML-DPCS, a novel framework for data publication with compressive sensing that supports multi-level utility-privacy tradeoffs and provides differential privacy. Specifically, we use compressive sensing (CS) to project an n-dimensional vector representation of users' data into a lower m-dimensional space, and then add deliberately designed noise to satisfy differential privacy. We then selectively obfuscate the measurement vector under compressive sensing by adding linearly encoded noise, and provide different data reconstruction algorithms for users at different authorization levels. Extensive experimental results demonstrate that ML-DPCS yields multiple levels of data utility for specific users at different authorization levels.
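As a rough illustration of the first step described in the abstract, the sketch below shows a compressive-sensing measurement of an n-dimensional record followed by Laplace noise addition. It is a minimal sketch only: the function name, the choice of a random Gaussian sensing matrix, and the simple sensitivity/epsilon noise calibration are assumptions for illustration, not the paper's exact construction or its linearly encoded noise and reconstruction algorithms.

```python
import numpy as np

def dp_compressive_measurement(x, m, epsilon, sensitivity, rng=None):
    """Project an n-dimensional vector x to m < n measurements with a random
    sensing matrix, then add Laplace noise to the measurements.
    Hypothetical sketch; names, calibration, and matrix choice are assumptions."""
    rng = rng or np.random.default_rng()
    n = x.shape[0]
    # Random Gaussian sensing matrix, a common choice in compressive sensing
    Phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
    y = Phi @ x                                    # compressed measurement vector
    noise = rng.laplace(0.0, sensitivity / epsilon, size=m)  # Laplace mechanism
    return Phi, y + noise                          # publish the noisy measurements

# Example usage on a sparse synthetic signal
rng = np.random.default_rng(0)
x = np.zeros(1000)
x[rng.choice(1000, size=20, replace=False)] = 1.0
Phi, y_private = dp_compressive_measurement(x, m=200, epsilon=1.0, sensitivity=1.0)
```

Users at different authorization levels would then receive different side information (for example, knowledge of part of the added noise), letting their CS reconstruction recover the data at different fidelity levels, as the abstract outlines.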

