Efficient Hyperparameter Optimization for Differentially Private Deep Learning

08/09/2021
by Aman Priyanshu, et al.

Tuning the hyperparameters of differentially private stochastic gradient descent (DPSGD) is a fundamental challenge. Unlike typical SGD, a private dataset cannot be reused many times for hyperparameter search in DPSGD, e.g., via grid search, because each access consumes privacy budget. There is therefore an essential need for algorithms that can efficiently find near-optimal hyperparameters within a given search space, achieving the best attainable privacy-utility tradeoff. We formulate this problem as a general optimization framework for establishing a desirable privacy-utility tradeoff, and systematically study three cost-effective algorithms for use in the proposed framework: evolutionary, Bayesian, and reinforcement learning. Our experiments on hyperparameter tuning for DPSGD, conducted on the MNIST and CIFAR-10 datasets, show that all three algorithms significantly outperform the widely used grid search baseline. As this paper offers a first-of-its-kind framework for hyperparameter tuning in DPSGD, we discuss existing challenges and open directions for future studies. Because we believe our work can be integrated into the pipeline of private deep learning, we open-source our code at https://github.com/AmanPriyanshu/DP-HyperparamTuning.
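To make the framework concrete, the sketch below shows one of the three studied strategies, an evolutionary search, over two DPSGD hyperparameters (noise multiplier and learning rate). This is an illustrative stand-in, not the paper's implementation: the `objective` function is a smooth surrogate for what would, in practice, be a DPSGD training run returning validation utility and a privacy cost epsilon from a DP accountant.

```python
import random

def objective(noise_multiplier, learning_rate):
    """Surrogate privacy-utility score (assumption: a real run would
    train with DPSGD and query a privacy accountant instead)."""
    utility = 1.0 / (1.0 + (learning_rate - 0.1) ** 2)  # peaks near lr = 0.1
    epsilon = 1.0 / noise_multiplier                    # more noise -> lower epsilon
    return utility - 0.5 * epsilon                      # scalarized tradeoff

def evolutionary_search(generations=20, population=8, seed=0):
    rng = random.Random(seed)
    # Initial random population of (noise_multiplier, learning_rate) pairs.
    pop = [(rng.uniform(0.5, 4.0), rng.uniform(0.01, 0.5))
           for _ in range(population)]
    for _ in range(generations):
        # Keep the top half, then mutate survivors with Gaussian noise.
        pop.sort(key=lambda p: objective(*p), reverse=True)
        survivors = pop[: population // 2]
        children = [(max(0.5, nm + rng.gauss(0, 0.2)),
                     max(1e-3, lr + rng.gauss(0, 0.02)))
                    for nm, lr in survivors]
        pop = survivors + children
    return max(pop, key=lambda p: objective(*p))

best_nm, best_lr = evolutionary_search()
```

Because every candidate evaluation would consume privacy budget on a real dataset, the sample efficiency of such search strategies, rather than raw wall-clock cost, is what makes them preferable to grid search in the DP setting.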

Related research

10/07/2021  Hyperparameter Tuning with Renyi Differential Privacy
For many differentially private algorithms, such as the prominent noisy ...

06/24/2023  Differentially Private Decentralized Deep Learning with Consensus Algorithms
Cooperative decentralized deep learning relies on direct information exc...

11/09/2021  The Role of Adaptive Optimizers for Honest Private Hyperparameter Selection
Hyperparameter optimization is a ubiquitous challenge in machine learnin...

10/22/2021  Differentially Private Coordinate Descent for Composite Empirical Risk Minimization
Machine learning models can leak information about the data used to trai...

11/19/2018  Private Selection from Private Candidates
Differentially Private algorithms often need to select the best amongst ...

06/09/2023  DP-HyPO: An Adaptive Private Hyperparameter Optimization Framework
Hyperparameter optimization, also known as hyperparameter tuning, is a w...

11/03/2022  Revisiting Hyperparameter Tuning with Differential Privacy
Hyperparameter tuning is a common practice in the application of machine...
