
Differentially Private Accelerated Optimization Algorithms

08/05/2020
by Nurdan Kuru, et al.
Erasmus University Rotterdam · Rutgers University · Sabancı University

We present two classes of differentially private optimization algorithms derived from well-known accelerated first-order methods. The first algorithm is inspired by Polyak's heavy ball method and employs a smoothing approach to decrease the accumulated noise on the gradient steps required for differential privacy. The second class of algorithms is based on Nesterov's accelerated gradient method and its recent multi-stage variant. We propose a noise dividing mechanism for the iterations of Nesterov's method in order to improve the error behavior of the algorithm. Convergence rate analyses are provided for both the heavy ball and Nesterov's accelerated gradient methods using dynamical systems analysis techniques. Finally, our numerical experiments show that the presented algorithms have advantages over well-known differentially private algorithms.
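To make the general template these methods share concrete (a momentum update driven by privatized gradients), the Python sketch below implements a differentially private heavy-ball iteration with Gaussian noise added to clipped gradients. All names and hyperparameter values (dp_heavy_ball, step, beta, clip, sigma) are illustrative assumptions, and the momentum term acting as a noise smoother is a generic stand-in for, not a reproduction of, the paper's smoothing approach.

```python
import numpy as np

def dp_heavy_ball(grad_fn, x0, step=0.1, beta=0.9, clip=1.0, sigma=0.5,
                  n_iters=100, rng=None):
    """Sketch of a heavy-ball method with Gaussian-mechanism gradient noise."""
    rng = np.random.default_rng() if rng is None else rng
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(n_iters):
        g = grad_fn(x)
        # Clip the gradient so its L2 sensitivity is bounded by `clip`.
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))
        # Gaussian mechanism: noise scale proportional to the sensitivity.
        noisy_g = g + rng.normal(0.0, sigma * clip, size=g.shape)
        # Heavy-ball update: gradient step plus momentum on the previous move.
        # The momentum term averages past noisy gradients, damping the noise.
        x, x_prev = x - step * noisy_g + beta * (x - x_prev), x
    return x

# Example: minimize the quadratic f(x) = 0.5 * ||x||^2, whose gradient is x.
if __name__ == "__main__":
    x_star = dp_heavy_ball(lambda x: x, np.ones(5), n_iters=200)
    print(x_star)
```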

