The Laplace Mechanism has optimal utility for differential privacy over continuous queries

by Natasha Fernandes et al.

Differential Privacy protects individuals' data when statistical queries are published from aggregated databases: applying "obfuscating" mechanisms to the query results makes the released information less specific but, unavoidably, also decreases its utility. Yet it has been shown that, for discrete data (e.g. counting queries), a mandated degree of privacy, and a reasonable interpretation of loss of utility, the Geometric obfuscating mechanism is optimal: it loses as little utility as possible. For continuous query results (e.g. real numbers), however, that optimality result does not hold. Our contribution here is to show that optimality is regained by using the Laplace mechanism for the obfuscation. The technical apparatus involved includes the earlier discrete result of Ghosh et al., recent work on abstract channels and their geometric representation as hyper-distributions, and the dual interpretations of distance between distributions provided by the Kantorovich-Rubinstein Theorem.
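To make the obfuscation step concrete, the following is a minimal sketch of the standard Laplace mechanism the abstract refers to: the true real-valued query answer is perturbed with noise drawn from a Laplace distribution whose scale is the query's sensitivity divided by the privacy parameter ε. The function name and parameters are illustrative, not taken from the paper, and this shows only the mechanism itself, not the paper's optimality argument.

```python
import math
import random

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release true_answer perturbed with Laplace(0, sensitivity/epsilon) noise.

    For a query whose answers on adjacent databases differ by at most
    `sensitivity`, this gives epsilon-differential privacy.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse-CDF: u uniform on (-1/2, 1/2),
    # noise = -scale * sgn(u) * ln(1 - 2|u|).
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))
    return true_answer + noise

# Example: privatising an average-salary query with sensitivity 1 at epsilon = 0.5
noisy = laplace_mechanism(true_answer=52_300.0, sensitivity=1.0, epsilon=0.5)
```

Smaller ε forces a larger noise scale, which is exactly the privacy-utility trade-off the paper analyses: the released value becomes less specific as the privacy guarantee strengthens.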

