Decentralized Nonconvex Optimization with Guaranteed Privacy and Accuracy

12/14/2022
by   Yongqiang Wang, et al.
Privacy protection and nonconvexity are two challenging problems in decentralized optimization and learning involving sensitive data. Despite recent advances addressing each problem separately, no existing results provide theoretical guarantees on both privacy protection and saddle/maximum avoidance in decentralized nonconvex optimization. We propose a new decentralized nonconvex optimization algorithm that enables both rigorous differential privacy and saddle/maximum-avoiding performance. The algorithm incorporates persistent additive noise to ensure rigorous differential privacy for data samples, gradients, and intermediate optimization variables without losing provable convergence, thus circumventing the dilemma of trading accuracy for privacy in differential-privacy design. More importantly, the algorithm is theoretically proven to guarantee accuracy by avoiding convergence to local maxima and saddle points, a result not previously reported in the literature on decentralized nonconvex optimization. The algorithm is efficient in both communication (each agent shares only one variable per iteration) and computation (it is encryption-free), and is therefore promising for large-scale nonconvex optimization and learning with high-dimensional optimization parameters. Numerical experiments on a decentralized estimation problem and an Independent Component Analysis (ICA) problem confirm the effectiveness of the proposed approach.
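To make the setting concrete, the following is a minimal toy sketch of noise-perturbed decentralized gradient descent on a nonconvex problem. It is *not* the paper's algorithm: the objectives, mixing matrix, stepsize, and noise schedule are all illustrative assumptions, and this sketch uses diminishing noise for simplicity, whereas the paper's key contribution is retaining *persistent* noise (for rigorous differential privacy) without losing convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy example (assumed, not from the paper): four agents cooperatively
# minimize f(x) = sum_i f_i(x) with f_i(x) = (x^2 - t_i)^2 / 4, which has
# a local MAXIMUM at x = 0 and minimizers at x = +/- sqrt(mean(t_i)).
targets = np.array([1.0, 1.5, 2.0, 2.5])   # hypothetical local data t_i

# Doubly stochastic mixing matrix for a 4-agent ring graph
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = rng.normal(0.0, 0.1, size=4)           # start near the local maximum x = 0
for k in range(2000):
    # Each agent shares ONE noise-perturbed variable per iteration, which is
    # what limits communication and masks the true iterate from neighbors.
    noise = rng.normal(0.0, 1.0 / (k + 1), size=4)
    mixed = W @ (x + noise)                # consensus averaging with neighbors
    grad = (mixed**2 - targets) * mixed    # local gradient of f_i at the mixed value
    x = mixed - 0.05 / (k + 1) ** 0.6 * grad   # diminishing stepsize

print(x)  # agents agree on a value near +/- sqrt(mean(targets)) ~ +/- 1.32
```

The noise also plays the accuracy role highlighted in the abstract: x = 0 is an unstable stationary point (a local maximum) of every f_i, and the injected perturbations push the iterates off it, so the network converges to a minimizer rather than stalling at the bad critical point.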

