Noise-Augmented Privacy-Preserving Empirical Risk Minimization with Dual-purpose Regularizer and Privacy Budget Retrieval and Recycling

10/16/2021
by Yinan Li, et al.

We propose Noise-Augmented Privacy-Preserving Empirical Risk Minimization (NAPP-ERM), which solves ERM with differential privacy (DP) guarantees. Existing privacy-preserving ERM approaches may be subject to over-regularization because they employ an additional l2 term, on top of the target regularization, to achieve strong convexity. NAPP-ERM improves over current approaches and mitigates over-regularization by iteratively realizing the target regularization through appropriately designed augmented data and delivering strong convexity via a single adaptively weighted dual-purpose l2 regularizer. When the target regularization is for variable selection, we propose a new regularizer that achieves privacy and sparsity guarantees simultaneously. Finally, we propose a strategy to retrieve privacy budget when the strong convexity requirement is met; the retrieved budget can either be returned to users, so that the DP of ERM is guaranteed at a lower privacy cost than originally planned, or be recycled into the ERM optimization procedure to reduce the injected DP noise and improve the utility of DP-ERM. From an implementation perspective, NAPP-ERM can be achieved by optimizing a non-perturbed objective function given noise-augmented data and can thus leverage existing tools for non-private ERM optimization. We illustrate through extensive experiments the mitigation of over-regularization and the privacy budget retrieval by NAPP-ERM, and their effects on variable selection and prediction.
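The implementation point highlighted in the abstract, that NAPP-ERM optimizes a non-perturbed objective on noise-augmented data, can be illustrated with the classical data-augmentation construction of an l2 penalty. The sketch below is a minimal illustration in that spirit and is not the paper's algorithm: the function name, the dp_noise_scale parameter, and the way placeholder noise enters the pseudo-responses are assumptions made for illustration only.

```python
import numpy as np


def napp_style_ridge_sketch(X, y, lam, dp_noise_scale=0.0, rng=None):
    """Minimal sketch: solve an l2-regularized least-squares problem by
    optimizing a NON-perturbed objective on augmented data.

    Appending sqrt(lam) * I pseudo-rows with zero pseudo-responses makes the
    ordinary least-squares objective on the augmented data equal to the ridge
    objective on the original data. The dp_noise_scale perturbation of the
    pseudo-responses is a hypothetical stand-in for injected DP noise, not
    the calibration used in the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])      # encodes the l2 term
    y_aug = np.concatenate([y, np.zeros(p)])
    y_aug[n:] += dp_noise_scale * rng.standard_normal(p)  # placeholder DP noise
    # Un-regularized least squares on the augmented data = ridge on (X, y)
    beta, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)
    return beta


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))
    beta_true = np.array([1.0, -2.0, 0.0, 0.5, 0.0])
    y = X @ beta_true + 0.1 * rng.standard_normal(200)
    print(napp_style_ridge_sketch(X, y, lam=1.0, dp_noise_scale=0.1, rng=rng))
```

In this toy setting, the l2 (strong-convexity) term is realized entirely through the augmented rows, so only a standard, non-private least-squares solver is needed; NAPP-ERM extends this idea with adaptively weighted augmentation and a DP noise mechanism described in the paper.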


Related research

10/30/2021
Dynamic Differential-Privacy Preserving SGD
Differentially-Private Stochastic Gradient Descent (DP-SGD) prevents tra...

07/24/2023
A Differentially Private Weighted Empirical Risk Minimization Procedure and its Application to Outcome Weighted Learning
It is commonplace to use data containing personal information to build p...

11/27/2021
Towards Understanding the Impact of Model Size on Differential Private Classification
Differential privacy (DP) is an essential technique for privacy-preservi...

03/02/2020
Differential Privacy at Risk: Bridging Randomness and Privacy Budget
The calibration of noise for a privacy-preserving mechanism depends on t...

03/12/2021
Privacy Regularization: Joint Privacy-Utility Optimization in Language Models
Neural language models are known to have a high capacity for memorizatio...

09/12/2023
Chained-DP: Can We Recycle Privacy Budget?
Privacy-preserving vector mean estimation is a crucial primitive in fede...

01/05/2021
Community Preserved Social Graph Publishing with Node Differential Privacy
The goal of privacy-preserving social graph publishing is to protect ind...
