Exploring Differential Obliviousness

05/03/2019
by Amos Beimel et al.

In a recent paper, Chan et al. [SODA '19] proposed a relaxation of the notion of (full) memory obliviousness, which was introduced by Goldreich and Ostrovsky [J. ACM '96] and has been extensively researched by cryptographers. The new notion, differential obliviousness, requires that any two neighboring inputs exhibit similar memory access patterns, where the similarity requirement is that of differential privacy. Chan et al. demonstrated that differential obliviousness allows achieving improved efficiency for several algorithmic tasks, including sorting, merging of sorted lists, and range-query data structures.

In this work, we continue the exploration and mapping of differential obliviousness, focusing on algorithms that do not necessarily examine all their input. This choice is motivated by the fact that the existence of logarithmic-overhead ORAM protocols implies that differential obliviousness can yield at most a logarithmic improvement in efficiency for computations that need to examine all their input. In particular, we explore property testing, where we show that differential obliviousness yields an almost linear improvement in overhead in the dense graph model, and at most a quadratic improvement in the bounded-degree model. We also explore tasks where a non-oblivious algorithm would need to explore different portions of the input, where the portions examined depend on the input itself; we show that such behavior can be maintained under differential obliviousness, but not under full obliviousness. Our examples suggest that there would be benefit in further exploring which classes of computational tasks are amenable to differential obliviousness.
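The access-pattern leakage that these obliviousness notions address can be made concrete with a toy comparison. The sketch below is illustrative only and is not taken from the paper: the function names are invented, and the noisy-prefix scan is a minimal caricature of the differential-obliviousness idea (pad a data-dependent stopping point with one-sided geometric noise so that the observed access-pattern length, which has sensitivity 1 under the stated assumption, is released via a geometric mechanism). A binary search's probes depend on the input, a full linear scan is fully oblivious, and the noisy prefix scan sits in between.

```python
import math
import random

def binary_search_accesses(arr, target):
    """Return the sequence of indices probed by a standard binary search.
    The probe sequence depends on where `target` sits, so it leaks."""
    accesses = []
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        accesses.append(mid)
        if arr[mid] == target:
            break
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return accesses

def linear_scan_accesses(arr, target):
    """Fully oblivious: every input yields the identical probe sequence."""
    return list(range(len(arr)))

def noisy_prefix_scan_accesses(arr, pred, eps=1.0, rng=None):
    """Illustrative differential-obliviousness-style scan (assumption:
    neighboring inputs shift the stopping point by at most 1).  Scan a
    prefix whose length is the true stopping point plus one-sided
    geometric noise; the observer sees only the (noisy) prefix length."""
    rng = rng or random.Random(0)
    stop = next((i for i, x in enumerate(arr) if pred(x)), len(arr) - 1)
    p = 1 - math.exp(-eps)   # success probability of the geometric noise
    noise = 0
    while rng.random() > p:  # sample one-sided geometric noise
        noise += 1
    length = min(stop + 1 + noise, len(arr))
    return list(range(length))
```

For example, `binary_search_accesses(list(range(16)), 3)` probes `[7, 3]` while target `12` probes `[7, 11, 13, 12]`, so the two access patterns are distinguishable; the linear scan always probes `0..15`; and the noisy prefix scan always probes a prefix whose length reveals the stopping point only through geometric noise.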


