How to Use Heuristics for Differential Privacy

11/19/2018
by Seth Neel et al.

We develop theory for using heuristics to solve computationally hard problems in differential privacy. Heuristic approaches have enjoyed tremendous success in machine learning, where performance can be evaluated empirically. Privacy guarantees, however, cannot be evaluated empirically; they must be proven, without recourse to heuristic assumptions. We show that learning problems over broad classes of functions can be solved privately and efficiently, assuming the existence of a non-private oracle for solving the same problem. Our first algorithm yields a privacy guarantee that is contingent on the correctness of the oracle. We then give a reduction that applies to a class of heuristics we call certifiable, which lets us convert oracle-dependent privacy guarantees into worst-case privacy guarantees that hold even when the heuristic standing in for the oracle might fail in adversarial ways. Finally, we consider a broad class of functions that includes most classes of simple boolean functions studied in the PAC learning literature, including conjunctions, disjunctions, parities, and discrete halfspaces. We show that there is an efficient algorithm for privately constructing synthetic data for any such class, given a non-private learning oracle. In particular, this gives the first oracle-efficient algorithm for privately generating synthetic data for contingency tables. The most intriguing question left open by our work is whether every problem that can be solved differentially privately can also be solved privately with an oracle-efficient algorithm. While we do not resolve this, we give a barrier result suggesting that any generic oracle-efficient reduction must fall outside a natural class of algorithms (a class that includes the algorithms given in this paper).
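To make the "certifiable heuristic" idea more concrete, below is a minimal, hypothetical sketch of the wrapper pattern the abstract alludes to: a heuristic oracle is called on a hard subproblem, its answer is checked by an efficient certificate, and on a failed check the wrapper releases only a distinguished failure symbol. This is not the paper's construction or its privacy analysis; the names `HeuristicOracle`, `Certifier`, `certified_call`, and the toy instance are assumptions introduced purely for illustration.

```python
from typing import Callable, Optional, Sequence

# Hypothetical interfaces, for illustration only.
# A "heuristic oracle" proposes a solution to a hard subproblem
# (e.g. a non-private learning/optimization problem); it may fail
# or even misbehave adversarially.
HeuristicOracle = Callable[[Sequence[float]], Optional[Sequence[float]]]

# A certificate check: an efficient verifier that accepts only
# genuinely correct oracle outputs. This efficient verifiability is
# what "certifiable" refers to in the abstract.
Certifier = Callable[[Sequence[float], Sequence[float]], bool]

FAIL = object()  # distinguished failure symbol


def certified_call(problem: Sequence[float],
                   oracle: HeuristicOracle,
                   certify: Certifier):
    """Call a heuristic oracle and release its answer only if it certifies.

    Sketch of the control flow only: answers that fail the certificate are
    never used downstream, so a misbehaving oracle can at worst cause a FAIL
    outcome rather than an unverified release. The paper's actual reduction
    and its worst-case privacy argument are more involved.
    """
    answer = oracle(problem)
    if answer is None or not certify(problem, answer):
        return FAIL
    return answer


if __name__ == "__main__":
    # Toy instance: the "hard problem" is to find a vector close to the
    # input, and the certificate checks closeness. Purely illustrative.
    def toy_oracle(problem):
        return [x + 0.01 for x in problem]  # heuristic guess

    def toy_certify(problem, answer):
        return max(abs(a - b) for a, b in zip(problem, answer)) <= 0.1

    print(certified_call([0.2, 0.5, 0.9], toy_oracle, toy_certify))
```

In the paper, this kind of certification step is what allows an oracle-dependent privacy guarantee to be upgraded to one that holds even when the heuristic fails adversarially; the sketch above shows only the call-and-verify control flow, not the noise addition or the accuracy analysis.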

