Differentially Private Histograms under Continual Observation: Streaming Selection into the Unknown

03/31/2021
by Adrian Rivera Cardoso, et al.

We generalize the continual observation privacy setting from Dwork et al. '10 and Chan et al. '11 by allowing each event in a stream to be a subset of some (possibly unknown) universe of items. We design differentially private (DP) algorithms for histograms in several settings, including top-k selection, with privacy loss that scales with polylog(T), where T is the maximum length of the input stream. We present a meta-algorithm that can use existing one-shot top-k DP algorithms as a subroutine to continuously release private histograms from a stream. Further, we present more practical DP algorithms for two settings: 1) continuously releasing the top-k counts from a histogram over a known domain when an event can consist of an arbitrary number of items, and 2) continuously releasing histograms over an unknown domain when an event has a limited number of items.
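For context (this is the primitive from Dwork et al. '10 and Chan et al. '11 that the abstract generalizes, not the paper's own construction), below is a minimal Python sketch of the binary-tree mechanism for privately releasing a running count of a single 0/1 stream under continual observation. The function name `continual_count` and the per-node Laplace scale are illustrative assumptions; the paper's contribution is extending this kind of guarantee from a single counter to histograms and top-k over (possibly unknown) item universes.

```python
import math

import numpy as np


def continual_count(stream, epsilon, T, rng=None):
    """Release a running count of a 0/1 stream under continual observation.

    Binary-tree ("binary") mechanism of Dwork et al. '10 / Chan et al. '11:
    epsilon-DP overall, additive error roughly O(log(T)^1.5 / epsilon).
    """
    rng = rng or np.random.default_rng()
    levels = math.ceil(math.log2(T)) + 1   # depth of the implicit binary tree
    exact = [0.0] * levels                 # exact partial sums, one per level
    noisy = [0.0] * levels                 # their noisy counterparts
    scale = levels / epsilon               # each event touches <= `levels` sums

    estimates = []
    for t, x in enumerate(stream, start=1):
        # Level i = index of the lowest set bit of t; this node absorbs
        # all completed lower-level partial sums plus the new event.
        i = (t & -t).bit_length() - 1
        exact[i] = sum(exact[:i]) + x
        for j in range(i):
            exact[j] = noisy[j] = 0.0
        noisy[i] = exact[i] + rng.laplace(0.0, scale)

        # The count at time t is the sum of the noisy nodes on the set bits of t.
        estimates.append(sum(noisy[j] for j in range(levels) if (t >> j) & 1))
    return estimates


# Example: private running count of a Bernoulli stream of length 1024.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = rng.integers(0, 2, size=1024)
    private_counts = continual_count(stream, epsilon=1.0, T=1024)
```

Each event contributes to at most `levels` partial sums over the whole stream, so adding Laplace noise of scale `levels / epsilon` to every node gives epsilon-DP by basic composition, while each released count sums at most `levels` noisy nodes, yielding the polylog(T) error that the paper's histogram and top-k algorithms aim to preserve.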
