Tight Lower Bounds for Locally Differentially Private Selection

02/07/2018
by Jonathan Ullman, et al.

We prove a tight lower bound (up to constant factors) on the sample complexity of any non-interactive locally differentially private protocol for optimizing a linear function over the simplex. This lower bound also implies a tight lower bound (again, up to constant factors) on the sample complexity of any non-interactive locally differentially private protocol implementing the exponential mechanism. These results reveal that any local protocol for these problems has exponentially worse dependence on the dimension than corresponding algorithms in the central model. Previously, Kasiviswanathan et al. (FOCS 2008) proved an exponential separation between local and central model algorithms for PAC learning the class of parity functions. In contrast, our lower bounds are quantitatively tight, apply to a simple and natural class of linear optimization problems, and use arguably simpler techniques.
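As an illustration of the selection problem, here is a minimal sketch of the standard exponential mechanism in the central model, the baseline against which the local-model lower bound is measured. This is not the paper's construction; the function names, parameters, and toy data below are our own, and the accounting assumes each score has sensitivity 1.

```python
import numpy as np

def exponential_mechanism(scores, epsilon, sensitivity=1.0, rng=None):
    """Select index j with probability proportional to
    exp(epsilon * scores[j] / (2 * sensitivity))."""
    rng = np.random.default_rng() if rng is None else rng
    scores = np.asarray(scores, dtype=float)
    # Subtract the max before exponentiating for numerical stability;
    # this rescaling does not change the sampling distribution.
    logits = epsilon * (scores - scores.max()) / (2.0 * sensitivity)
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(len(scores), p=probs)

# Toy instance: n users each hold a vector in {0,1}^d, and the goal is
# to select the coordinate with the largest column sum. (A linear
# function over the simplex is maximized at a vertex, so selection over
# the d coordinates captures the optimization problem in the abstract.)
rng = np.random.default_rng(0)
n, d = 1000, 16
data = rng.integers(0, 2, size=(n, d))
best = exponential_mechanism(data.sum(axis=0), epsilon=1.0, rng=rng)
print("selected coordinate:", best)
```

The exponential separation in the abstract concerns exactly this task: the number of samples a non-interactive local protocol needs to match the central-model accuracy above grows exponentially worse as a function of the dimension d.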
