On the Sample Complexity of Privately Learning Axis-Aligned Rectangles

07/24/2021
by   Menachem Sadigurschi, et al.

We revisit the fundamental problem of learning Axis-Aligned Rectangles over a finite grid X^d ⊆ ℝ^d with differential privacy. Existing results show that the sample complexity of this problem is at most min{ d·log|X| , d^1.5·(log^*|X|)^1.5 }. That is, existing constructions either require sample complexity that grows linearly with log|X|, or else it grows super-linearly with the dimension d. We present a novel algorithm that reduces the sample complexity to only Õ( d·(log^*|X|)^1.5 ), attaining a dimensionality-optimal dependency without requiring the sample complexity to grow with log|X|. The technique used to attain this improvement involves the deletion of "exposed" data points on the go, in a fashion designed to avoid the cost of the adaptive composition theorems. The core of this technique may be of individual interest, introducing a new method for constructing statistically efficient private algorithms.
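To get a rough feel for how the new bound compares with the prior ones, the following small Python sketch evaluates the three expressions for an illustrative grid size and a range of dimensions. It is purely illustrative: the grid size |X| = 2^64, the choice of dimensions, and the base-2 iterated logarithm are my own choices, and the constants and logarithmic factors hidden by the Õ notation are ignored.

```python
import math

def iterated_log(x, base=2):
    """log* x: how many times log must be applied before the value drops to <= 1."""
    count = 0
    while x > 1:
        x = math.log(x, base)
        count += 1
    return count

def prior_bound(d, grid_size):
    """min{ d * log|X| , d^1.5 * (log*|X|)^1.5 }: best of the existing constructions."""
    return min(d * math.log2(grid_size),
               d ** 1.5 * iterated_log(grid_size) ** 1.5)

def new_bound(d, grid_size):
    """d * (log*|X|)^1.5: the paper's bound, up to hidden logarithmic factors."""
    return d * iterated_log(grid_size) ** 1.5

# Illustrative comparison over a large grid (|X| = 2^64) as the dimension grows.
grid = 2 ** 64
for d in (2, 10, 100, 1000):
    print(f"d={d:5d}  prior≈{prior_bound(d, grid):10.0f}  new≈{new_bound(d, grid):10.0f}")
```

For this grid, log^*|X| is a small constant (5 in the sketch above), so the prior bound is dominated by d·log|X| for large d, while the new bound stays linear in d with only the small iterated-logarithm factor.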
