Noiseless Privacy

10/29/2019
by Farhad Farokhi, et al.

In this paper, we define noiseless privacy as a non-stochastic rival to differential privacy, requiring that the outputs of a mechanism (i.e., the function composition of a privacy-preserving mapping and a query) can attain only a few values as the data of an individual varies; the logarithm of the number of distinct values is bounded by the privacy budget. Therefore, the output of the mechanism is not fully informative of the data of the individuals in the dataset. We prove several guarantees for noiselessly-private mechanisms. The information content of the output about the data of an individual, even if an adversary knows all the other entries of the private dataset, is bounded by the privacy budget. The zero-error capacity of memoryless channels using noiselessly-private mechanisms for transmission is upper bounded by the privacy budget. The performance of a non-stochastic hypothesis-testing adversary is likewise bounded by the privacy budget. Moreover, assuming that an adversary has access to a stochastic prior on the dataset, we prove that the adversary's estimation error for individual entries of the dataset is lower bounded by a decreasing function of the privacy budget; in this case, we also show that the maximal information leakage is bounded by the privacy budget. In addition to these privacy guarantees, we prove that noiselessly-private mechanisms admit a composition theorem and that post-processing does not weaken their privacy guarantees. We prove that quantization operators can ensure noiseless privacy if the number of quantization levels is appropriately selected based on the sensitivity of the query and the privacy budget. Finally, we illustrate the privacy merits of noiseless privacy using multiple datasets from energy and transport.
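
As a rough illustration of the quantization result stated in the abstract, the following Python sketch picks a bin width from the query's sensitivity and the privacy budget so that, as one individual's entry varies, the quantized output can take at most floor(exp(epsilon)) distinct values. The helper names (quantize, noiseless_private_response), the bin-width rule, and the toy dataset are assumptions for illustration only, not the construction used in the paper.

import math

# Minimal illustrative sketch (not the paper's construction): a floor-quantizer whose
# bin width is derived from the query's sensitivity and the privacy budget, so that the
# output can take only a few distinct values as one entry of the dataset varies.

def quantize(value, bin_width):
    """Map a real-valued query output to the lower edge of its quantization bin."""
    return bin_width * math.floor(value / bin_width)

def noiseless_private_response(dataset, query, sensitivity, epsilon):
    """Hypothetical helper: respond with a quantized query output.

    Assumption: changing a single individual's entry moves query(dataset) by at
    most `sensitivity`. With bin width sensitivity / (floor(exp(epsilon)) - 1),
    the quantized output takes at most floor(exp(epsilon)) distinct values as
    that entry varies, so the log of the number of distinct outputs is <= epsilon.
    """
    max_levels = math.floor(math.exp(epsilon))
    if max_levels < 2:
        raise ValueError("privacy budget too small for more than one output level")
    bin_width = sensitivity / (max_levels - 1)
    return quantize(query(dataset), bin_width)

# Toy usage with made-up numbers: an averaging query over four illustrative readings.
def mean_query(d):
    return sum(d) / len(d)

data = [3.2, 4.8, 2.9, 5.1]
sens = (max(data) - min(data)) / len(data)  # illustrative sensitivity bound for the mean
print(noiseless_private_response(data, mean_query, sensitivity=sens, epsilon=1.5))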


Related research

A bounded-noise mechanism for differential privacy (12/07/2020)
Answering multiple counting queries is one of the best-studied problems ...

On the Information Privacy Model: the Group and Composition Privacy (07/22/2019)
How to query a dataset in the way of preserving the privacy of individua...

Non-Stochastic Private Function Evaluation (10/20/2020)
We consider private function evaluation to provide query responses based...

Development and Analysis of Deterministic Privacy-Preserving Policies Using Non-Stochastic Information Theory (10/26/2018)
A non-stochastic privacy metric using non-stochastic information theory ...

Differential Privacy at Risk: Bridging Randomness and Privacy Budget (03/02/2020)
The calibration of noise for a privacy-preserving mechanism depends on t...

Constructing Privacy Channels from Information Channels (10/21/2019)
Data privacy protection studies how to query a dataset while preserving ...

Privacy Amplification by Mixing and Diffusion Mechanisms (05/29/2019)
A fundamental result in differential privacy states that the privacy gua...
