Private Non-smooth Empirical Risk Minimization and Stochastic Convex Optimization in Subquadratic Steps

03/29/2021 ∙ by Janardhan Kulkarni, et al.

We study the differentially private Empirical Risk Minimization (ERM) and Stochastic Convex Optimization (SCO) problems for non-smooth convex functions. We obtain a (nearly) optimal bound on the excess empirical risk and the excess population loss with subquadratic gradient complexity. More precisely, our differentially private algorithm requires O(N^{3/2}/d^{1/8} + N^2/d) gradient queries for optimal excess empirical risk, which is achieved with the help of subsampling and of smoothing the function via convolution. This is the first subquadratic algorithm for the non-smooth case when d is super-constant. As a direct application, using the iterative localization approach of Feldman et al. <cit.>, we achieve the optimal excess population loss for the stochastic convex optimization problem with O(min{N^{5/4} d^{1/8}, N^{3/2}/d^{1/8}}) gradient queries. Our work makes progress towards resolving a question raised by Bassily et al. <cit.>, giving the first algorithms for private ERM and SCO with subquadratic steps. We note that, independently, Asi et al. <cit.> gave other algorithms for private ERM and SCO with subquadratic steps.
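The abstract's key technical ingredient, smoothing a non-smooth convex function via convolution, can be illustrated with a standard randomized-smoothing sketch. This is an assumption-laden toy (the helper `smoothed_grad`, the ball-uniform perturbation, and the choice of f(x) = ||x||_1 are mine, not the paper's), intended only to show the idea: the convolution f_r(x) = E_u[f(x + r·u)], with u uniform on the unit ball, is smooth, and an unbiased estimate of its gradient is obtained by averaging subgradients of f at randomly perturbed points.

```python
import numpy as np

def smoothed_grad(f_grad, x, radius, n_samples, rng):
    """Estimate the gradient of the ball-smoothed function
    f_r(x) = E_u[f(x + radius * u)], with u uniform on the unit ball,
    by averaging subgradients of f at perturbed points.
    (Illustrative sketch only; not the paper's algorithm.)"""
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(n_samples):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)           # uniform direction on the sphere
        u *= rng.random() ** (1.0 / d)   # uniform radius inside the unit ball
        g += f_grad(x + radius * u)      # subgradient of f at the perturbed point
    return g / n_samples

# Toy check: f(x) = ||x||_1 is non-smooth; its subgradient is sign(x).
# Far from the kinks, the smoothed gradient matches sign(x).
rng = np.random.default_rng(0)
x = np.array([2.0, -3.0])
g = smoothed_grad(np.sign, x, radius=0.1, n_samples=50, rng=rng)
```

Because the perturbation radius (0.1) is smaller than each coordinate's distance to zero, every sampled subgradient equals sign(x) and the estimate is exact here; near the kinks the estimate instead interpolates smoothly, which is the point of the convolution.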
