-
Local Differential Privacy for Bayesian Optimization
-
Differentially Private Gaussian Processes
-
Differentially Private Language Models Benefit from Public Pre-training
-
(Locally) Differentially Private Combinatorial Semi-Bandits
-
Differentially Private Online Submodular Optimization
-
Mitigating Bias in Adaptive Data Gathering via Differential Privacy
-
On The Differential Privacy of Thompson Sampling With Gaussian Prior
No-Regret Algorithms for Private Gaussian Process Bandit Optimization
The widespread proliferation of data-driven decision-making has ushered in a recent interest in the design of privacy-preserving algorithms. In this paper, we consider the ubiquitous problem of Gaussian process (GP) bandit optimization through the lens of privacy-preserving statistics. We propose a solution for differentially private GP bandit optimization that combines a uniform kernel approximator with random perturbations, providing a generic framework for creating differentially private (DP) Gaussian process bandit algorithms. For two specific DP settings, joint and local differential privacy, we provide algorithms based on quadrature Fourier feature approximators that are computationally efficient and provably no-regret for popular stationary kernel functions. Our algorithms maintain differential privacy throughout the optimization procedure and, critically, do not rely explicitly on the sample path for prediction, making the parameters straightforward to release as well.