Average-Case Averages: Private Algorithms for Smooth Sensitivity and Mean Estimation

06/06/2019
by Mark Bun, et al.

The simplest and most widely applied method for guaranteeing differential privacy is to add instance-independent noise to a statistic of interest, scaled to its global sensitivity. However, global sensitivity is a worst-case notion that is often too conservative for realized dataset instances. We provide methods for scaling noise in an instance-dependent way and demonstrate that they provide greater accuracy under average-case distributional assumptions. Specifically, we consider the basic problem of privately estimating the mean of a real distribution from i.i.d. samples. The standard empirical mean estimator can have arbitrarily high global sensitivity. We propose the trimmed mean estimator, which interpolates between the mean and the median, as a way of attaining much lower sensitivity on average while losing very little in terms of statistical accuracy. To privately estimate the trimmed mean, we revisit the smooth sensitivity framework of Nissim, Raskhodnikova, and Smith (STOC 2007), which provides a general method for using instance-dependent sensitivity. We propose three new additive noise distributions which provide concentrated differential privacy when scaled to smooth sensitivity. We provide theoretical and experimental evidence showing that our noise distributions compare favorably to others in the literature, in particular, when applied to the mean estimation problem.
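The two ingredients described above can be sketched briefly. The snippet below is an illustrative sketch, not the paper's exact algorithm: `trimmed_mean` uses the standard definition (drop the lowest and highest fraction of samples), and `laplace_release` shows only the baseline global-sensitivity mechanism the abstract contrasts against; the function names and parameters are assumptions of this sketch.

```python
import numpy as np

def trimmed_mean(x, trim_frac):
    """Drop the lowest and highest trim_frac fraction of samples and
    average the rest. trim_frac = 0 recovers the empirical mean;
    trim_frac near 0.5 approaches the median."""
    x = np.sort(np.asarray(x, dtype=float))
    k = int(trim_frac * len(x))
    kept = x[k:len(x) - k] if k > 0 else x
    return kept.mean()

def laplace_release(value, sensitivity, epsilon, rng=None):
    """Baseline epsilon-DP release: add Laplace noise scaled to a given
    sensitivity bound. The paper's algorithms instead scale noise to the
    *smooth* sensitivity of the trimmed mean, using new noise
    distributions that yield concentrated differential privacy."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(scale=sensitivity / epsilon)
```

Note how a single outlier barely moves the trimmed mean: `trimmed_mean([1, 2, 3, 4, 100], 0.2)` returns 3.0, whereas the untrimmed mean is 22.0 — the instance-dependent sensitivity of the trimmed statistic is correspondingly much smaller on typical data.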

