Choosing Public Datasets for Private Machine Learning via Gradient Subspace Distance

03/02/2023
by   Xin Gu, et al.

Differentially private stochastic gradient descent (DP-SGD) privatizes model training by injecting noise into each iteration, where the noise magnitude increases with the number of model parameters. Recent works suggest that this noise can be reduced by leveraging public data: gradients are projected onto a low-dimensional subspace prescribed by the public data. However, given a choice of public datasets, it is not a priori clear which one is most appropriate for the private task. We give an algorithm that selects a public dataset by measuring a low-dimensional subspace distance between the gradients of public and private examples. We provide a theoretical analysis showing that the excess risk scales with this subspace distance. The distance is easy to compute and robust to modifications of the setting. Empirical evaluation shows that trained model accuracy is monotone in this distance.
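To make the idea concrete, here is a minimal sketch of one standard way to measure the distance between two gradient subspaces: take the top-k right singular subspaces of the stacked per-example gradient matrices and compare them via principal angles (the projection metric on the Grassmannian). This is an illustrative assumption, not necessarily the paper's exact metric; the function names and the choice of metric are ours.

```python
import numpy as np

def top_k_subspace(grads, k):
    # grads: (n_examples, n_params) matrix of per-example gradients.
    # The top-k right singular vectors span the dominant gradient subspace.
    _, _, vt = np.linalg.svd(grads, full_matrices=False)
    return vt[:k].T  # (n_params, k), orthonormal columns

def subspace_distance(grads_public, grads_private, k):
    # Projection-metric distance between the two k-dimensional subspaces.
    # The singular values of V_pub^T V_priv are the cosines of the
    # principal angles; the distance ranges from 0 (identical subspaces)
    # to sqrt(k) (orthogonal subspaces).
    v_pub = top_k_subspace(grads_public, k)
    v_priv = top_k_subspace(grads_private, k)
    cosines = np.linalg.svd(v_pub.T @ v_priv, compute_uv=False)
    cosines = np.clip(cosines, 0.0, 1.0)
    return np.sqrt(max(k - np.sum(cosines ** 2), 0.0))
```

Given several candidate public datasets, one would compute this distance for each against the private gradients and pick the dataset with the smallest value, consistent with the paper's finding that model accuracy is monotone in the distance.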

Related research

Bypassing the Ambient Dimension: Private SGD with Gradient Subspace Identification (07/07/2020)
Differentially private SGD (DP-SGD) is one of the most popular methods f...

Leveraging Public Data for Practical Private Query Release (02/17/2021)
In many statistical problems, incorporating priors can significantly imp...

Scaling Private Deep Learning with Low-Rank and Sparse Gradients (07/06/2022)
Applying Differentially Private Stochastic Gradient Descent (DPSGD) to t...

Learning from aggregated data with a maximum entropy model (10/05/2022)
Aggregating a dataset, then injecting some noise, is a simple and common...

Stochastic and Private Nonconvex Outlier-Robust PCA (03/17/2022)
We develop theoretically guaranteed stochastic methods for outlier-robus...

Dimension Independence in Unconstrained Private ERM via Adaptive Preconditioning (08/14/2020)
In this paper we revisit the problem of private empirical risk minimizat...

A Novel Stochastic Gradient Descent Algorithm for Learning Principal Subspaces (12/08/2022)
Many machine learning problems encode their data as a matrix with a poss...
