Pre-Pruning and Gradient-Dropping Improve Differentially Private Image Classification

06/19/2023
by Kamil Adamczewski, et al.

Scalability is a significant challenge in applying differential privacy to the training of deep neural networks. The commonly used DP-SGD algorithm struggles to maintain a high level of privacy protection while achieving high accuracy, even on moderately sized models. To tackle this challenge, we exploit the fact that neural networks are overparameterized. Specifically, we introduce a new training paradigm that uses pre-pruning and gradient-dropping to reduce the parameter space and improve scalability. The process starts by pre-pruning the parameters of the original network to obtain a smaller model, which is then trained with DP-SGD. During training, less important gradients are dropped, and only the selected gradients are updated. Our training paradigm introduces a tension between the pre-pruning and gradient-dropping rates, the privacy loss, and the classification accuracy: too much pre-pruning and gradient-dropping reduces the model's capacity and worsens accuracy, while training a smaller model requires less privacy budget to achieve good accuracy. We evaluate the interplay between these factors and demonstrate the effectiveness of our training paradigm both for training from scratch and for fine-tuning pre-trained networks on several benchmark image classification datasets. The tools can also be readily incorporated into existing training pipelines.
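To make the two ingredients concrete, here is a minimal, illustrative PyTorch sketch: magnitude-based pre-pruning fixes a sparse parameter set before training, and a DP-SGD step (per-sample clipping plus Gaussian noise) is further sparsified by gradient-dropping. The pruning criterion (weight magnitude), the dropping rule (top-k of the noisy gradient, which is safe because it is post-processing of a private quantity), and all hyperparameter names are assumptions for illustration, not the paper's exact algorithm.

```python
import torch

def magnitude_prune_masks(model, keep_frac=0.5):
    """Pre-pruning (assumed magnitude-based): keep the largest weights per layer."""
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() > 1:  # prune weight tensors; leave biases dense
            k = max(1, int(keep_frac * p.numel()))
            thresh = p.abs().flatten().kthvalue(p.numel() - k + 1).values
            masks[name] = (p.abs() >= thresh).float()
        else:
            masks[name] = torch.ones_like(p)
    return masks

def dp_sgd_step(model, loss_fn, xb, yb, masks, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.0, drop_frac=0.5):
    """One DP-SGD step with gradient-dropping (illustrative, not optimized):
    per-sample clipping, Gaussian noise, then top-k sparsification of the
    noisy update. Selecting coordinates after noising is post-processing,
    so it consumes no extra privacy budget."""
    params = list(model.parameters())
    grad_sum = [torch.zeros_like(p) for p in params]

    for i in range(xb.size(0)):  # microbatches of size 1 -> per-sample grads
        model.zero_grad()
        loss_fn(model(xb[i:i + 1]), yb[i:i + 1]).backward()
        g_norm = torch.sqrt(sum(p.grad.pow(2).sum() for p in params))
        scale = torch.clamp(clip_norm / (g_norm + 1e-12), max=1.0)
        for gs, p in zip(grad_sum, params):
            gs.add_(p.grad.detach() * scale)  # accumulate clipped gradient

    with torch.no_grad():
        for (name, p), gs in zip(model.named_parameters(), grad_sum):
            noisy = (gs + noise_multiplier * clip_norm
                     * torch.randn_like(gs)) / xb.size(0)
            noisy *= masks[name]              # respect the pre-pruned pattern
            k = max(1, int((1.0 - drop_frac) * noisy.numel()))
            t = noisy.abs().flatten().kthvalue(noisy.numel() - k + 1).values
            noisy[noisy.abs() < t] = 0.0      # gradient-dropping
            p -= lr * noisy
            p *= masks[name]                  # keep pruned weights at zero
```

Under the same assumptions, a typical use would compute masks = magnitude_prune_masks(model, keep_frac=0.3) once, zero out the pruned weights, and then call dp_sgd_step on each batch; keep_frac and drop_frac are the knobs that trade model capacity against the privacy budget discussed above.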

Related research

03/08/2023
Differential Privacy Meets Neural Network Pruning
A major challenge in applying differential privacy to training deep neur...

10/18/2022
DPIS: An Enhanced Mechanism for Differentially Private SGD with Importance Sampling
Nowadays, differential privacy (DP) has become a well-accepted standard ...

06/27/2023
Differentially Private Video Activity Recognition
In recent years, differential privacy has seen significant advancements ...

03/22/2022
Mixed Differential Privacy in Computer Vision
We introduce AdaMix, an adaptive differentially private algorithm for tr...

09/12/2023
Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers
For machine learning with tabular data, Table Transformer (TabTransforme...

03/01/2021
Wide Network Learning with Differential Privacy
Despite intense interest and considerable effort, the current generation...

05/19/2023
Differentially Private Adapters for Parameter Efficient Acoustic Modeling
In this work, we devise a parameter-efficient solution to bring differen...
