
Momentum Aggregation for Private Non-convex ERM

10/12/2022
by Hoang Tran, et al.

We introduce new algorithms and convergence guarantees for privacy-preserving non-convex Empirical Risk Minimization (ERM) on smooth d-dimensional objectives. We develop an improved sensitivity analysis of stochastic gradient descent on smooth objectives that exploits the recurrence of examples across epochs. Combining this approach with recent analyses of momentum with private aggregation, we obtain an (ϵ,δ)-differentially private algorithm that finds a point with gradient norm Õ(d^1/3/(ϵN)^2/3) in O(N^7/3ϵ^4/3/d^2/3) gradient evaluations, improving on the previous best gradient-norm bound of Õ(d^1/4/√(ϵN)).
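For intuition, the sketch below shows the standard Gaussian-mechanism recipe that underlies this family of private optimizers: per-sample gradient clipping to bound sensitivity, calibrated Gaussian noise on the aggregate, and a momentum buffer that averages the noisy gradients across steps. This is a minimal illustration, not the paper's algorithm or analysis; the clipping norm C, noise multiplier sigma, momentum parameter beta, and the toy loss are all illustrative placeholders.

```python
# Minimal sketch of differentially private SGD with momentum.
# NOT the paper's exact method: C, sigma, beta, and the toy
# smooth non-convex loss below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def per_sample_grad(w, x, y):
    """Gradient of a toy smooth non-convex loss: (sigmoid(x.w) - y)^2."""
    z = 1.0 / (1.0 + np.exp(-x @ w))
    return 2.0 * (z - y) * z * (1.0 - z) * x

def dp_momentum_sgd(X, Y, steps=200, lr=0.1, beta=0.9, C=1.0,
                    sigma=1.0, batch_size=32):
    N, d = X.shape
    w = np.zeros(d)
    m = np.zeros(d)  # momentum buffer
    for _ in range(steps):
        idx = rng.choice(N, size=batch_size, replace=False)
        # Clip each per-sample gradient to norm C so that one example
        # changes the batch sum by at most C (bounded sensitivity).
        grads = np.stack([per_sample_grad(w, X[i], Y[i]) for i in idx])
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / C)
        # Privatize the aggregate with Gaussian noise scaled to the
        # sensitivity C; sigma sets the privacy/accuracy trade-off.
        noisy = (grads.sum(axis=0)
                 + sigma * C * rng.standard_normal(d)) / batch_size
        # Momentum averages the noisy gradients across steps,
        # damping the injected noise.
        m = beta * m + (1.0 - beta) * noisy
        w = w - lr * m
    return w

# Toy data: N examples in d dimensions with binary targets.
X = rng.standard_normal((512, 10))
Y = (X[:, 0] > 0).astype(float)
w = dp_momentum_sgd(X, Y)
```

The momentum step is the point of contact with the abstract: averaging noisy gradients over many iterations reduces the variance of the injected privacy noise, which is what enables the improved gradient-norm guarantee relative to plain noisy SGD.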

