Super-convergence and Differential Privacy: Training faster with better privacy guarantees

03/18/2021
by Osvald Frisk, et al.

The combination of deep neural networks and Differential Privacy has attracted increasing interest in recent years, as it offers important data protection guarantees to the individuals in the training datasets used. However, training neural networks with Differential Privacy comes with a set of shortcomings, such as a decrease in validation accuracy and a significant increase in the time and resources required for training. In this paper, we examine super-convergence as a way of greatly increasing the training speed of differentially private neural networks, addressing the shortcomings of high training time and resource use. Super-convergence accelerates network training by using very high learning rates, and has been shown to produce models with high utility in orders of magnitude fewer training iterations than conventional approaches. Our experiments show that this order-of-magnitude speedup carries over when super-convergence is combined with Differential Privacy, yielding higher validation accuracies in far fewer training iterations than non-private, non-super-convergent baseline models. Furthermore, super-convergence is shown to improve the privacy guarantees of private models.
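As a rough illustration of the setup the abstract describes, the sketch below combines DP-SGD training (via the Opacus library, an assumption on our part; the paper does not prescribe a framework) with PyTorch's OneCycleLR schedule, the cyclical learning-rate policy underlying super-convergence. The model, data, and hyperparameter values (noise_multiplier, max_grad_norm, max_lr) are illustrative placeholders, not values from the paper.

# Minimal sketch: DP-SGD (Opacus) trained under a one-cycle learning-rate
# policy. All hyperparameters below are illustrative, not from the paper.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Random stand-in data shaped like MNIST, purely for a runnable example.
data = TensorDataset(torch.randn(512, 1, 28, 28), torch.randint(0, 10, (512,)))
loader = DataLoader(data, batch_size=64)

# Attach DP-SGD: per-example gradient clipping plus Gaussian noise.
privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.1,  # illustrative noise level
    max_grad_norm=1.0,     # per-example clipping bound
)

epochs = 3
# One-cycle policy: ramp the learning rate up to a large max_lr, then
# anneal it back down -- the mechanism that enables super-convergence.
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1.0, total_steps=epochs * len(loader)
)

criterion = nn.CrossEntropyLoss()
for _ in range(epochs):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
        scheduler.step()

# Report the privacy budget spent so far.
epsilon = privacy_engine.get_epsilon(delta=1e-5)
print(f"(epsilon = {epsilon:.2f}, delta = 1e-5)")

Because fewer iterations are needed to reach a given accuracy, fewer noisy gradient steps are taken, which is how faster training translates into a smaller privacy budget in the paper's framing.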


