Private Empirical Risk Minimization Beyond the Worst Case: The Effect of the Constraint Set Geometry

by Kunal Talwar, et al.

Empirical Risk Minimization (ERM) is a standard technique in machine learning, where a model is selected by minimizing a loss function over a constraint set. When the training dataset consists of private information, it is natural to use a differentially private ERM algorithm, and this problem has been the subject of a long line of work starting with Chaudhuri and Monteleoni (2008). A private ERM algorithm outputs an approximate minimizer of the loss function, and its error can be measured as the difference from the optimal value of the loss function. When the constraint set is arbitrary, the required error bounds are fairly well understood (Bassily, Smith, and Thakurta, 2014). In this work, we show that the geometric properties of the constraint set can be used to derive significantly better results. Specifically, we show that a differentially private version of Mirror Descent leads to error bounds of the form Õ(G_C/n) for a Lipschitz loss function, improving on the Õ(√p/n) bounds of Bassily, Smith, and Thakurta (2014). Here p is the dimensionality of the problem, n is the number of data points in the training set, and G_C denotes the Gaussian width of the constraint set C that we optimize over. We show similar improvements for strongly convex functions and for smooth functions. In addition, we show that when the loss function is Lipschitz with respect to the ℓ_1 norm and C is ℓ_1-bounded, a differentially private version of the Frank-Wolfe algorithm gives error bounds of the form Õ(n^(-2/3)). This captures the important and common case of sparse linear regression (LASSO), where each data point x_i satisfies ‖x_i‖_∞ ≤ 1 and we optimize over the ℓ_1 ball. We also prove new lower bounds for this setting; together with known bounds, they imply that all our upper bounds are tight.
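To make the Frank-Wolfe approach concrete: over the ℓ_1 ball, each Frank-Wolfe step only needs to pick the best of the 2p vertices {±e_j}, so the update can be privatized by selecting that vertex with the exponential mechanism. The sketch below illustrates this for LASSO; the noise scale and per-round privacy split are illustrative placeholders, not the paper's calibrated parameters, and `private_frank_wolfe` is a hypothetical name.

```python
import numpy as np

def private_frank_wolfe(X, y, T=50, eps=1.0, L1=1.0, rng=None):
    """Illustrative sketch of differentially private Frank-Wolfe for LASSO.

    Minimizes (1/n) * sum_i (<x_i, w> - y_i)^2 over the l1 ball of radius L1.
    Each round picks a vertex of the l1 ball via the exponential mechanism,
    implemented as an argmax over utilities perturbed by Gumbel noise.
    The noise scale is a rough eps/T-per-round split, assuming |x_i|_inf <= 1;
    it is not a tuned privacy calibration.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    w = np.zeros(p)
    for t in range(T):
        grad = (2.0 / n) * X.T @ (X @ w - y)       # gradient of squared loss
        # linear scores <grad, v> for the 2p vertices v in {+-L1 * e_j}
        scores = np.concatenate([L1 * grad, -L1 * grad])
        # exponential mechanism via the Gumbel-max trick: argmax of
        # (-score + Gumbel noise) ~ sampling prop. to exp(-score / b)
        noisy = -scores + rng.gumbel(scale=2.0 * T * L1 / (eps * n), size=2 * p)
        k = int(np.argmax(noisy))
        v = np.zeros(p)
        v[k % p] = L1 if k < p else -L1
        mu = 2.0 / (t + 2.0)                        # standard Frank-Wolfe step
        w = (1.0 - mu) * w + mu * v                 # stays inside the l1 ball
    return w
```

Because the update is a convex combination of ℓ_1-ball vertices, the iterate never leaves the constraint set, so no projection step is needed.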




