Dimension Independence in Unconstrained Private ERM via Adaptive Preconditioning

08/14/2020
by Peter Kairouz et al.

In this paper we revisit the problem of private empirical risk minimization (ERM) under differential privacy. We show that for unconstrained convex ERM, if the observed gradients of the objective function along the path of private gradient descent lie in a low-dimensional subspace (of dimension smaller than the ambient dimension p), then using noisy adaptive preconditioning (a.k.a. noisy Adaptive Gradient Descent, AdaGrad) we obtain a regret composed of two terms: a constant multiplicative factor of the original AdaGrad regret, plus an additional regret due to the privacy noise. In particular, we show that if the gradients lie in a subspace of constant rank, then one can achieve an excess empirical risk of Õ(1/(ϵn)), compared to the worst-case achievable bound of Õ(√p/(ϵn)). While previous works establish dimension-independent excess empirical risk bounds only for the restrictive setting of convex generalized linear problems optimized over unconstrained subspaces, our results hold for general convex functions in unconstrained minimization. Along the way, we carry out a perturbation analysis of noisy AdaGrad, which may be of independent interest.
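As a rough illustration of the mechanism the abstract describes, here is a minimal sketch of noisy AdaGrad with a diagonal preconditioner, written in Python/NumPy. This is not the paper's exact algorithm: the paper's analysis covers adaptive preconditioning more generally, and calibrating the noise multiplier sigma to a target (ϵ, δ)-differential privacy guarantee (e.g., via the Gaussian mechanism with composition) is omitted here. All names (noisy_adagrad, grad_fn, clip, sigma) are illustrative.

    import numpy as np

    def noisy_adagrad(grad_fn, w0, steps=200, lr=1.0, clip=1.0, sigma=1.0, eps=1e-8, seed=0):
        """Sketch of noisy AdaGrad: clipped gradients are privatized with
        Gaussian noise, and the same noisy gradients feed both the update
        and the (diagonal) preconditioner statistics. Calibrating sigma to
        a formal (eps, delta)-DP guarantee is omitted in this sketch."""
        rng = np.random.default_rng(seed)
        w = np.array(w0, dtype=float)
        H = np.zeros_like(w)  # running sum of squared noisy gradients
        for _ in range(steps):
            g = grad_fn(w)
            g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))  # bound per-step sensitivity
            g_noisy = g + sigma * clip * rng.standard_normal(w.shape)  # Gaussian perturbation
            H += g_noisy ** 2
            w -= lr * g_noisy / (np.sqrt(H) + eps)  # per-coordinate adaptive step
        return w

A toy unconstrained ERM instance in the regime the abstract targets, where all gradients lie in a low-rank subspace (here the features, and hence every least-squares gradient, have rank 5 inside an ambient dimension p = 100):

    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 100))  # rank-5 data, p = 100
    y = X @ rng.standard_normal(100)
    grad = lambda w: X.T @ (X @ w - y) / len(y)  # gradient lies in the row space of X
    w_hat = noisy_adagrad(grad, np.zeros(100), sigma=0.5)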


