On the Importance of Architecture and Feature Selection in Differentially Private Machine Learning

05/13/2022
by Wenxuan Bao, et al.

We study a pitfall in the typical workflow for differentially private machine learning. Using differentially private learning algorithms in a "drop-in" fashion – without accounting for the impact of differential privacy (DP) noise when choosing feature engineering operations, selecting features, or designing a neural network architecture – yields overly complex and poorly performing models. In other words, had the impact of DP noise been anticipated, a simpler and more accurate model could have been trained under the same privacy guarantee. We systematically study this phenomenon through theory and experiments. On the theoretical front, we provide an explanatory framework and prove that the phenomenon arises naturally from the noise added to satisfy differential privacy. On the experimental front, we demonstrate how the phenomenon manifests in practice across various datasets, model types, tasks, and neural network architectures. We also analyze the factors that contribute to the problem and distill our experimental insights into concrete takeaways that practitioners can follow when training models with differential privacy. Finally, we propose privacy-aware algorithms for feature selection and neural network architecture search, analyze their differential privacy properties, and evaluate them empirically.
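The core intuition – that DP noise penalizes model complexity – can be illustrated with a minimal sketch (not taken from the paper). It simulates the Gaussian mechanism for privately averaging n clipped per-example vectors (e.g. gradients in DP-SGD) and measures the l2 norm of the added noise as the dimension d grows; the parameter names and values below are illustrative assumptions:

```python
import math
import random

def gaussian_mechanism_noise_norm(d, n=1000, clip=1.0, sigma=1.0, seed=0):
    """l2 norm of the noise the Gaussian mechanism adds when privately
    averaging n per-example vectors of dimension d, each clipped to
    l2 norm `clip` (so the mean has l2 sensitivity clip / n)."""
    rng = random.Random(seed)
    scale = sigma * clip / n  # per-coordinate noise standard deviation
    noise = [rng.gauss(0.0, scale) for _ in range(d)]
    return math.sqrt(sum(z * z for z in noise))

# At a fixed privacy level (fixed sigma), the noise's l2 norm grows
# roughly like sqrt(d): higher-dimensional models (more features or
# parameters) pay a larger accuracy cost for the same guarantee.
errors = {d: gaussian_mechanism_noise_norm(d) for d in (10, 1_000, 100_000)}
```

This is the effect behind the paper's observation: a simpler model or a smaller feature set leaves less room for the noise to accumulate, which is why privacy-aware feature selection and architecture choice matter.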

Related research

Architecture Matters: Investigating the Influence of Differential Privacy on Neural Network Design (11/29/2021)
One barrier to more widespread adoption of differentially private neural...

Towards Quantifying the Carbon Emissions of Differentially Private Machine Learning (07/14/2021)
In recent years, machine learning techniques utilizing large-scale datas...

The Importance of Feature Preprocessing for Differentially Private Linear Optimization (07/19/2023)
Training machine learning models with differential privacy (DP) has rece...

Improved Accounting for Differentially Private Learning (01/28/2019)
We consider the problem of differential privacy accounting, i.e. estimat...

Differentially-private Federated Neural Architecture Search (06/16/2020)
Neural architecture search, which aims to automatically search for archi...

DPSNN: A Differentially Private Spiking Neural Network (05/24/2022)
Privacy preservation is a key problem for machine learning algorithms. ...

Differentially Private Algorithms for 2020 Census Detailed DHC Race & Ethnicity (07/22/2021)
This article describes proposed differentially private (DP) algorithms...
