Scaling Private Deep Learning with Low-Rank and Sparse Gradients

07/06/2022
by Ryuichi Ito, et al.

Applying Differentially Private Stochastic Gradient Descent (DPSGD) to training modern, large-scale neural networks such as transformer-based models is a challenging task, as the magnitude of noise added to the gradients at each iteration scales with model dimension, hindering the learning capability significantly. We propose a unified framework that fully exploits the low-rank and sparse structure of neural networks to reduce the dimension of gradient updates, and hence alleviate the negative impacts of DPSGD. The gradient updates are first approximated with a pair of low-rank matrices. Then, a novel strategy is utilized to sparsify the gradients, resulting in low-dimensional, less noisy updates that are still capable of retaining the performance of neural networks. Empirical evaluation on natural language processing and computer vision tasks shows that our method outperforms other state-of-the-art baselines.
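To make the dimensionality argument concrete, the sketch below performs one private update step in which a gradient matrix is replaced by a rank-r factorization, the factors are sparsified by magnitude, and clipping plus Gaussian noise are applied in the reduced space before reconstruction. This is a minimal NumPy illustration under assumed parameter names (rank_r, keep_frac, clip_norm, noise_mult); it is not the authors' implementation nor their exact sparsification strategy.

```python
# Illustrative sketch only: low-rank + sparse DP gradient update on a single
# weight matrix. All hyperparameter names here are hypothetical.
import numpy as np

def dp_lowrank_sparse_step(G, rank_r=4, keep_frac=0.1,
                           clip_norm=1.0, noise_mult=1.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng

    # 1. Low-rank approximation: G is approximated by A @ B with rank r.
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    A = U[:, :rank_r] * s[:rank_r]   # left factor, shape (m, r)
    B = Vt[:rank_r, :]               # right factor, shape (r, n)

    # 2. Sparsify the factors, keeping only the largest-magnitude entries.
    def sparsify(M, frac):
        k = max(1, int(frac * M.size))
        thresh = np.partition(np.abs(M).ravel(), -k)[-k]
        return np.where(np.abs(M) >= thresh, M, 0.0)
    A, B = sparsify(A, keep_frac), sparsify(B, keep_frac)

    # 3. Clip and add Gaussian noise in the reduced space, so the noise
    #    scales with the factor dimensions rather than with m * n.
    for M in (A, B):
        norm = np.linalg.norm(M)
        M *= min(1.0, clip_norm / (norm + 1e-12))
    A = A + rng.normal(0.0, noise_mult * clip_norm, size=A.shape)
    B = B + rng.normal(0.0, noise_mult * clip_norm, size=B.shape)

    # 4. Reconstruct the noisy, full-dimensional gradient update.
    return A @ B

# Usage on a toy 64x32 gradient matrix:
G = np.random.default_rng(0).normal(size=(64, 32))
noisy_update = dp_lowrank_sparse_step(G)
print(noisy_update.shape)  # (64, 32)
```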
