Curse of Dimensionality in Unconstrained Private Convex ERM

05/28/2021
by Daogao Liu, et al.

In this paper we study lower bounds for differentially private empirical risk minimization (DP-ERM) over general convex functions. For convex generalized linear models (GLMs), the well-known tight bound for DP-ERM in the constrained case is Θ̃(√(p)/nϵ), while recently <cit.> showed that the tight bound for DP-ERM in the unconstrained case is Θ̃(√(rank)/nϵ), where p is the dimension, n is the sample size, and rank is the rank of the feature matrix of the GLM objective. Since rank ≤ min{n, p}, a natural and important question is whether we can evade the curse of dimensionality for over-parameterized models, where n ≪ p, for general convex functions beyond GLMs. We answer this question negatively by giving the first tight lower bound for unconstrained private ERM over general convex functions, matching the known upper bound Õ(√(p)/nϵ). We also give an Ω(p/nϵ) lower bound for unconstrained pure-DP ERM, which recovers the result from the constrained case.
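To make the setting concrete, the upper bound Õ(√(p)/nϵ) referenced above is achieved by standard noisy first-order methods. The sketch below is an illustrative (ϵ, δ)-DP noisy gradient descent for unconstrained ERM on the logistic loss; the function name, step size, clipping norm, and the naive per-step budget split are our assumptions for illustration, not the algorithm from the paper. The √(p)-dependence of the error enters through the norm of the p-dimensional Gaussian noise added at each step.

```python
import numpy as np

def dp_gradient_descent(X, y, epsilon, delta, T=50, eta=0.5, L=1.0, seed=0):
    """Illustrative noisy gradient descent for unconstrained DP-ERM
    (logistic loss). Assumptions for this sketch: each per-example
    gradient is clipped to norm L, so replacing one example changes the
    average gradient by at most 2L/n; Gaussian noise is calibrated to
    that sensitivity, and the privacy budget is split evenly across the
    T iterations via naive composition (tighter accountants exist).
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    theta = np.zeros(p)
    eps_step, delta_step = epsilon / T, delta / T
    # Gaussian-mechanism noise scale for sensitivity 2L/n per step
    sigma = (2 * L / n) * np.sqrt(2 * np.log(1.25 / delta_step)) / eps_step
    for _ in range(T):
        margins = y * (X @ theta)
        # per-example gradients of the logistic loss
        grads = (-y / (1 + np.exp(margins)))[:, None] * X
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / L)  # clip to norm L
        noisy_grad = grads.mean(axis=0) + rng.normal(0.0, sigma, size=p)
        theta -= eta * noisy_grad  # unconstrained: no projection step
    return theta
```

Note that no projection is performed: in the unconstrained setting studied here, the iterates range over all of ℝ^p, which is exactly the regime in which the rank-dependent bound of <cit.> applies for GLMs but, by the lower bound above, cannot be extended to general convex losses.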
