Bisimilarity Distances for Approximate Differential Privacy

07/26/2018
by   Dmitry Chistikov, et al.

Differential privacy is a widely studied notion of privacy for various models of computation. Technically, it is based on measuring differences between probability distributions. We study (ϵ,δ)-differential privacy in the setting of labelled Markov chains. While the exact differences relevant to (ϵ,δ)-differential privacy are not computable in this framework, we propose a computable bisimilarity distance that yields a sound technique for measuring δ, the parameter that quantifies deviation from pure differential privacy. We show this bisimilarity distance is always rational, the associated threshold problem is in NP, and the distance can be computed exactly with polynomially many calls to an NP oracle.
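To illustrate the quantity the abstract refers to (not the paper's bisimilarity distance, which operates on labelled Markov chains), the sketch below computes, for two finite output distributions of a mechanism on neighbouring inputs, the smallest δ that makes the pair (ϵ,δ)-indistinguishable. The function name tilted_distance and the example distributions are illustrative assumptions, not taken from the paper.

```python
import math

def tilted_distance(p: dict, q: dict, eps: float) -> float:
    """Smallest delta such that P(S) <= e^eps * Q(S) + delta for every event S.

    For finite distributions this maximum over events reduces to a sum of
    pointwise positive parts:
        delta = sum_x max(0, P(x) - e^eps * Q(x)).
    This is only the pointwise quantity for two fixed distributions; the
    paper's contribution is a computable bisimilarity distance that soundly
    bounds it on labelled Markov chains, where the exact value is not
    computable.
    """
    support = set(p) | set(q)
    return sum(max(0.0, p.get(x, 0.0) - math.exp(eps) * q.get(x, 0.0))
               for x in support)

# Hypothetical example: outputs of a randomised mechanism on two neighbouring inputs.
P = {"a": 0.6, "b": 0.4}
Q = {"a": 0.5, "b": 0.5}
print(tilted_distance(P, Q, eps=0.1))  # delta needed for (0.1, delta)-indistinguishability
```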

