Private Algorithms with Private Predictions

10/20/2022
by Kareem Amin, et al.

When applying differential privacy to sensitive data, a common way of getting improved performance is to use external information such as other sensitive data, public data, or human priors. We propose to use the algorithms with predictions framework – previously applied largely to improve time complexity or competitive ratios – as a powerful way of designing and analyzing privacy-preserving methods that can take advantage of such external information to improve utility. For four important tasks – quantile release, its extension to multiple quantiles, covariance estimation, and data release – we construct prediction-dependent differentially private methods whose utility scales with natural measures of prediction quality. The analyses enjoy several advantages, including minimal assumptions about the data, natural ways of adding robustness to noisy predictions, and novel "meta" algorithms that can learn predictions from other (potentially sensitive) data. Overall, our results demonstrate how to enable differentially private algorithms to make use of and learn noisy predictions, which holds great promise for improving utility while preserving privacy across a variety of tasks.
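The prediction-dependent quantile release described above can be illustrated with a toy sketch. This is not the paper's actual algorithm; it is a minimal illustration, assuming a standard exponential-mechanism quantile estimator whose candidate range is narrowed around an externally supplied prediction, so that a more accurate prediction yields a smaller search domain and hence higher utility at the same privacy budget. The function name `dp_quantile_exp_mech` and all parameters are hypothetical.

```python
import numpy as np

def dp_quantile_exp_mech(data, q, lo, hi, eps, bins=1000, rng=None):
    """Release an eps-DP estimate of the q-quantile of `data` using the
    exponential mechanism over a discretized candidate grid on [lo, hi].

    The utility of a candidate c is -|#{x < c} - q*n|, which has
    sensitivity 1 under add/remove-one-record neighboring datasets.
    A prediction enters by shrinking [lo, hi] around the predicted
    quantile: a tighter (accurate) interval concentrates the mechanism's
    probability mass on better candidates.
    """
    rng = np.random.default_rng() if rng is None else rng
    grid = np.linspace(lo, hi, bins)
    n = len(data)
    # Number of data points strictly below each candidate value.
    counts = np.searchsorted(np.sort(data), grid)
    utility = -np.abs(counts - q * n)
    # Exponential mechanism: P(c) proportional to exp(eps * u(c) / 2).
    logits = eps * utility / 2.0
    logits -= logits.max()  # numerical stability before exponentiating
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(grid, p=probs)

# Usage: with an (assumed) external prediction `pred` of the median,
# the candidate interval can be narrowed to [pred - w, pred + w],
# where w reflects how much the prediction is trusted.
data = np.random.default_rng(0).uniform(0.0, 1.0, 2000)
pred, w = 0.5, 0.1
est = dp_quantile_exp_mech(data, q=0.5, lo=pred - w, hi=pred + w, eps=1.0)
```

The design choice mirrors the abstract's theme: the privacy guarantee holds for any choice of interval (it does not depend on the data), while the utility improves with prediction quality, and robustness to a bad prediction can be recovered by widening the interval.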


