Private Hypothesis Selection

05/30/2019
by Mark Bun, et al.

We provide a differentially private algorithm for hypothesis selection. Given samples from an unknown probability distribution P and a set of m probability distributions H, the goal is to output, in an ε-differentially private manner, a distribution from H whose total variation distance to P is comparable to that of the best such distribution (which we denote by α). The sample complexity of our basic algorithm is O(log m/α^2 + log m/(αε)), representing a minimal cost for privacy when compared to the non-private algorithm. We can also handle infinite hypothesis classes H by relaxing to (ε,δ)-differential privacy. We apply our hypothesis selection algorithm to give learning algorithms for a number of natural distribution classes, including Gaussians, product distributions, sums of independent random variables, piecewise polynomials, and mixture classes. Our hypothesis selection procedure allows us to generically convert a cover for a class into a learning algorithm, complementing known learning lower bounds which are in terms of the size of the packing of the class. As the covering and packing numbers are often closely related, for constant α, our algorithms achieve the optimal sample complexity for many classes of interest. Finally, we describe an application to private distribution-free PAC learning.
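To illustrate the flavor of private hypothesis selection, here is a minimal sketch: score each hypothesis by its number of wins in a pairwise Scheffé tournament against the samples, then pick a hypothesis via the exponential mechanism. This is a simplified illustration for discrete hypotheses, and it assumes the caller supplies a sensitivity bound for the win-count score; it is not the paper's exact algorithm, which controls the score sensitivity more carefully to obtain the stated sample complexity.

```python
import numpy as np

rng = np.random.default_rng(0)

def scheffe_wins(samples, hypotheses):
    """Count pairwise Scheffé wins for each discrete hypothesis.

    hypotheses: list of length-k probability vectors over outcomes {0,...,k-1}.
    samples: integer array of outcomes drawn from the unknown distribution P.
    """
    m = len(hypotheses)
    wins = np.zeros(m)
    for i in range(m):
        for j in range(i + 1, m):
            # Scheffé set of the pair (i, j): outcomes where H_i assigns more mass
            W = hypotheses[i] > hypotheses[j]
            p_hat = np.mean(W[samples])  # empirical mass of W under the samples
            # The winner is the hypothesis whose mass on W is closer to p_hat
            if abs(hypotheses[i][W].sum() - p_hat) <= abs(hypotheses[j][W].sum() - p_hat):
                wins[i] += 1
            else:
                wins[j] += 1
    return wins

def private_select(samples, hypotheses, eps, sensitivity):
    """Exponential mechanism over win counts (sensitivity supplied by caller)."""
    wins = scheffe_wins(np.asarray(samples), hypotheses)
    logits = eps * wins / (2.0 * sensitivity)
    probs = np.exp(logits - logits.max())  # stabilized softmax weights
    probs /= probs.sum()
    return rng.choice(len(hypotheses), p=probs)
```

For example, with hypotheses [0.9, 0.1], [0.1, 0.9], [0.5, 0.5] over a binary domain and samples that are 90% zeros, the first hypothesis collects the most Scheffé wins and is selected with probability close to 1 for moderate ε.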

Related research:

- Differentially Private Assouad, Fano, and Le Cam (04/14/2020): Le Cam's method, Fano's inequality, and Assouad's lemma are three widely...
- Statistically Near-Optimal Hypothesis Selection (08/17/2021): Hypothesis Selection is a fundamental distribution learning problem wher...
- On the Sample Complexity of Privately Learning Unbounded High-Dimensional Gaussians (10/19/2020): We provide sample complexity upper bounds for agnostically learning mult...
- Locally Private Hypothesis Selection (02/21/2020): We initiate the study of hypothesis selection under local differential p...
- Privately Learning High-Dimensional Distributions (05/01/2018): We design nearly optimal differentially private algorithms for learning ...
- Graph-based Discriminators: Sample Complexity and Expressiveness (06/01/2019): A basic question in learning theory is to identify if two distributions ...
- Sampling from Log-Concave Distributions with Infinity-Distance Guarantees and Applications to Differentially Private Optimization (11/07/2021): For a d-dimensional log-concave distribution π(θ) ∝ e^(-f(θ)) on a polytope...
