Differentially Private Generalized Linear Models Revisited

05/06/2022
by Raman Arora, et al.

We study the problem of (ϵ,δ)-differentially private learning of linear predictors with convex losses. We provide results for two subclasses of loss functions. The first case is when the loss is smooth and non-negative but not necessarily Lipschitz (such as the squared loss). For this case, we establish an upper bound on the excess population risk of Õ(‖w^*‖/√n + min{‖w^*‖^2/(nϵ)^{2/3}, √d‖w^*‖^2/(nϵ)}), where n is the number of samples, d is the dimension of the problem, and w^* is the minimizer of the population risk. Apart from the dependence on ‖w^*‖, our bound is essentially tight in all parameters. In particular, we show a lower bound of Ω̃(1/√n + min{‖w^*‖^{4/3}/(nϵ)^{2/3}, √d‖w^*‖/(nϵ)}). We also revisit the previously studied case of Lipschitz losses [SSTT20]. For this case, we close the gap in the existing work and show that the optimal rate is (up to log factors) Θ(‖w^*‖/√n + min{‖w^*‖/√(nϵ), √rank·‖w^*‖/(nϵ)}), where rank is the rank of the design matrix. This improves over existing work in the high-privacy regime. Finally, our algorithms involve a private model selection approach, which we develop to enable attaining the stated rates without a priori knowledge of ‖w^*‖.
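For readability, the three rates stated in the abstract can be set in display math, using the same symbols (n samples, dimension d, population-risk minimizer w^*, design-matrix rank):

```latex
% Smooth, non-negative (not necessarily Lipschitz) losses: upper bound
\tilde{O}\!\left(\frac{\|w^*\|}{\sqrt{n}}
  + \min\left\{\frac{\|w^*\|^{2}}{(n\epsilon)^{2/3}},\;
               \frac{\sqrt{d}\,\|w^*\|^{2}}{n\epsilon}\right\}\right)

% Matching lower bound (apart from the dependence on \|w^*\|)
\tilde{\Omega}\!\left(\frac{1}{\sqrt{n}}
  + \min\left\{\frac{\|w^*\|^{4/3}}{(n\epsilon)^{2/3}},\;
               \frac{\sqrt{d}\,\|w^*\|}{n\epsilon}\right\}\right)

% Lipschitz losses: optimal rate, up to log factors
\Theta\!\left(\frac{\|w^*\|}{\sqrt{n}}
  + \min\left\{\frac{\|w^*\|}{\sqrt{n\epsilon}},\;
               \frac{\sqrt{\mathrm{rank}}\,\|w^*\|}{n\epsilon}\right\}\right)
```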


Related research

07/12/2021 · Differentially Private Stochastic Optimization: New Results in Convex and Non-Convex Settings
We study differentially private stochastic optimization in convex and no...

07/20/2023 · From Adaptive Query Release to Machine Unlearning
We formalize the problem of machine unlearning as design of efficient un...

06/02/2022 · Faster Rates of Convergence to Stationary Points in Differentially Private Optimization
We study the problem of approximating stationary points of Lipschitz and...

03/04/2021 · Remember What You Want to Forget: Algorithms for Machine Unlearning
We study the problem of forgetting datapoints from a learnt model. In th...

05/28/2021 · Curse of Dimensionality in Unconstrained Private Convex ERM
We consider the lower bounds of differentially private empirical risk mi...

04/22/2022 · Sharper Utility Bounds for Differentially Private Models
In this paper, by introducing Generalized Bernstein condition, we propos...

03/02/2021 · Private Stochastic Convex Optimization: Optimal Rates in ℓ_1 Geometry
Stochastic convex optimization over an ℓ_1-bounded domain is ubiquitous ...
