
The Privacy-preserving Padding Problem: Non-negative Mechanisms for Conservative Answers with Differential Privacy

by   Benjamin M. Case, et al.

Differentially private noise mechanisms commonly use symmetric noise distributions. This is attractive both for satisfying the differential privacy definition and for producing unbiased expectations in the noised answers. However, there are contexts in which a noisy answer only has utility if it is conservative, that is, has error of known sign; we call such an answer a padded answer. Seemingly, it is paradoxical to satisfy the DP definition with one-sided error, but we show how the paradox can be buried in approximate DP's delta parameter. We develop several one-sided padding mechanisms that always give conservative answers while still achieving approximate differential privacy. We show how these mechanisms can be applied in selected areas, including making the cardinalities of set intersections and unions revealed in Private Set Intersection protocols differentially private, and enabling multiparty computation protocols to compute on sparse data whose exact sizes are made differentially private, rather than performing a fully oblivious, more expensive computation.
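To make the idea concrete, here is a minimal sketch of one way a one-sided padding mechanism can work; this is an illustration of the general technique, not necessarily the paper's exact construction. Nonnegative noise is drawn from a discrete Laplace distribution centered at a shift `s` and truncated at zero, so the released count is always an overestimate. The truncation lets outputs near zero distinguish neighboring datasets; choosing `s` so the tail mass below `-s` is at most `delta` buries that event in approximate DP's delta parameter. The function name and parameter choices are our own.

```python
import math
import random

def padded_count(true_count, eps, delta, sensitivity=1):
    """Release a conservative (never-too-small) noisy count.

    Sketch of a one-sided padding mechanism: add nonnegative noise
    sampled from a discrete Laplace distribution shifted by s and
    clamped at 0, aiming for (eps, delta)-DP under the stated
    assumptions. Not the paper's exact mechanism.
    """
    # Shift s chosen so the discrete-Laplace tail below -s has
    # probability at most roughly delta.
    s = math.ceil(sensitivity / eps * math.log(1.0 / delta))
    # Discrete Laplace noise as the difference of two geometric
    # variables with parameter a = exp(-eps / sensitivity).
    a = math.exp(-eps / sensitivity)
    g1 = int(math.log(1.0 - random.random()) / math.log(a))
    g2 = int(math.log(1.0 - random.random()) / math.log(a))
    noise = max(0, s + g1 - g2)  # clamp: answer stays conservative
    return true_count + noise
```

Because the noise is clamped at zero, the released value never undercounts; the cost is a positive bias of roughly `s`, which shrinks as `eps` grows or `delta` is loosened.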



