Private Distribution Learning with Public Data: The View from Sample Compression

08/11/2023
by Shai Ben-David, et al.

We study the problem of private distribution learning with access to public data. In this setup, which we refer to as public-private learning, the learner is given public and private samples drawn from an unknown distribution p belonging to a class 𝒬, with the goal of outputting an estimate of p while adhering to privacy constraints (here, pure differential privacy) only with respect to the private samples. We show that the public-private learnability of a class 𝒬 is connected to the existence of a sample compression scheme for 𝒬, as well as to an intermediate notion we refer to as list learning. Leveraging this connection, we (1) approximately recover previous results on Gaussians over ℝ^d, and (2) obtain new ones, including sample complexity upper bounds for arbitrary k-mixtures of Gaussians over ℝ^d, results for agnostic and distribution-shift-resistant learners, and closure properties of public-private learnability under taking mixtures and products of distributions. Finally, via the connection to list learning, we show that for Gaussians in ℝ^d, at least d public samples are necessary for private learnability, which is close to the known upper bound of d+1 public samples.
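
To make the setup concrete, here is a minimal sketch (not the paper's algorithm) of a public-private learner for a one-dimensional Gaussian N(μ, 1): a few public samples localize the mean at no privacy cost, and the private samples are then used under pure ε-differential privacy via the Laplace mechanism. The function name `public_private_mean` and the `slack` parameter are illustrative assumptions, not from the paper.

```python
import numpy as np

def public_private_mean(public, private, eps, slack=10.0):
    """Estimate the mean of N(mu, 1), eps-DP w.r.t. `private` only."""
    # Step 1 (public, no privacy cost): coarse localization of mu.
    center = float(np.median(public))
    lo, hi = center - slack, center + slack

    # Step 2 (private): clip samples into [lo, hi]; the clipped mean
    # changes by at most (hi - lo) / n when one private sample changes,
    # so that is the sensitivity of the statistic.
    clipped = np.clip(private, lo, hi)
    sensitivity = (hi - lo) / len(clipped)

    # Step 3: Laplace mechanism with scale sensitivity/eps gives pure
    # eps-DP with respect to the private samples.
    noise = np.random.laplace(scale=sensitivity / eps)
    return float(clipped.mean() + noise)

rng = np.random.default_rng(0)
mu = 42.0
public = rng.normal(mu, 1.0, size=5)      # a handful of public samples
private = rng.normal(mu, 1.0, size=2000)
print(public_private_mean(public, private, eps=1.0))
```

The sketch mirrors the abstract's theme: without public data, pure-DP estimation needs an a priori bound on the parameter range, whereas a few public samples supply that localization for free.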
