
Differentially Private Kernel Inducing Points (DP-KIP) for Privacy-preserving Data Distillation

by Margarita Vinaroz et al.

While it is tempting to believe that data distillation preserves privacy, the empirical robustness of distilled data against known attacks does not imply a provable privacy guarantee. Here, we develop a provably privacy-preserving data distillation algorithm, called differentially private kernel inducing points (DP-KIP). DP-KIP is an instantiation of DP-SGD on kernel ridge regression (KRR). Following recent work, we use neural tangent kernels (NTKs) and minimize the KRR loss to estimate the distilled datapoints (i.e., kernel inducing points). We provide a computationally efficient JAX implementation of DP-KIP, which we test on several popular image and tabular datasets to demonstrate its efficacy in data distillation with differential privacy guarantees.
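To make the recipe concrete, below is a minimal JAX sketch of one DP-SGD step on a KRR loss over learnable inducing points. This is not the paper's implementation: an RBF kernel stands in for the neural tangent kernel, and all function names and hyperparameters (`clip`, `sigma`, `lr`, `lam`) are illustrative assumptions. The key ingredients match the abstract's description: per-example gradients with respect to the distilled points, per-example clipping, and calibrated Gaussian noise.

```python
import jax
import jax.numpy as jnp

def rbf_kernel(x, z, gamma=1.0):
    # Stand-in for the NTK used in DP-KIP (assumption: RBF for simplicity).
    d = jnp.sum((x[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-gamma * d)

def per_example_loss(params, x_i, y_i, lam=1e-3):
    # KRR loss of a single target example, predicted from the
    # distilled (inducing) points Z with labels y_z.
    Z, y_z = params
    K_zz = rbf_kernel(Z, Z)
    alpha = jnp.linalg.solve(K_zz + lam * jnp.eye(Z.shape[0]), y_z)
    pred = rbf_kernel(x_i[None, :], Z) @ alpha
    return jnp.sum((y_i - pred[0]) ** 2)

def dp_sgd_step(params, X, Y, key, clip=1.0, sigma=1.0, lr=0.1):
    # Per-example gradients w.r.t. the distilled points, via vmap.
    grads = jax.vmap(jax.grad(per_example_loss), in_axes=(None, 0, 0))(params, X, Y)
    leaves, treedef = jax.tree_util.tree_flatten(grads)
    # Clip each example's full gradient to L2 norm <= clip.
    norms = jnp.sqrt(sum(jnp.sum(g.reshape(g.shape[0], -1) ** 2, axis=1)
                         for g in leaves))
    scale = jnp.minimum(1.0, clip / (norms + 1e-12))
    clipped = [g * scale.reshape((-1,) + (1,) * (g.ndim - 1)) for g in leaves]
    # Sum, add Gaussian noise calibrated to the clipping norm, average.
    keys = jax.random.split(key, len(clipped))
    noisy = [(jnp.sum(g, axis=0) + sigma * clip * jax.random.normal(k, g.shape[1:]))
             / X.shape[0]
             for g, k in zip(clipped, keys)]
    noisy = jax.tree_util.tree_unflatten(treedef, noisy)
    # Gradient descent update on (Z, y_z).
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, noisy)
```

Each call to `dp_sgd_step` consumes a fresh PRNG key and returns updated inducing points; the privacy guarantee would then follow from standard DP-SGD accounting over the noise scale `sigma` and the sampling rate, which this sketch does not include.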
