Canonical Noise Distributions and Private Hypothesis Tests
f-DP has recently been proposed as a generalization of classical definitions of differential privacy that allows a lossless analysis of composition, post-processing, and privacy amplification via subsampling. In the setting of f-DP, we propose the concept of a canonical noise distribution (CND), which captures whether an additive privacy mechanism is appropriately tailored for a given f, and we give a construction that produces a CND for an arbitrary tradeoff function f. We show that private hypothesis tests are intimately related to CNDs, allowing for the release of private p-values at no additional privacy cost as well as the construction of uniformly most powerful (UMP) tests for binary data. We apply our techniques to the problem of difference-of-proportions testing and construct a UMP unbiased (UMPU) "semi-private" test that upper bounds the performance of any DP test. Using this as a benchmark, we propose a private test, based on the inversion of characteristic functions, which allows for optimal inference on the two population parameters and is nearly as powerful as the semi-private UMPU test. When specialized to the case of (ϵ,0)-DP, we show empirically that our proposed test is more powerful than any (ϵ/√2)-DP test and has more accurate type I errors than the classic normal approximation test.
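For readers unfamiliar with f-DP, the following background definitions are a brief sketch recalled from the standard f-DP framework of Dong, Roth, and Su (they are not taken from this abstract, and the notation T, f, M is theirs). For distributions P and Q, the tradeoff function records the optimal type II error of any test distinguishing P from Q at a given type I error level:
\[
T(P,Q)(\alpha) \;=\; \inf\{\beta_\phi : \alpha_\phi \le \alpha\},
\qquad \alpha_\phi = \mathbb{E}_P[\phi],\quad \beta_\phi = 1 - \mathbb{E}_Q[\phi],
\]
where the infimum is over all rejection rules \(\phi\). A mechanism \(M\) is \(f\)-DP if, for all neighboring datasets \(S \simeq S'\),
\[
T\bigl(M(S),\, M(S')\bigr)(\alpha) \;\ge\; f(\alpha) \quad \text{for all } \alpha \in [0,1].
\]
In particular, \((\epsilon,\delta)\)-DP corresponds to \(f_{\epsilon,\delta}\)-DP with
\[
f_{\epsilon,\delta}(\alpha) \;=\; \max\bigl\{0,\; 1-\delta-e^{\epsilon}\alpha,\; e^{-\epsilon}(1-\delta-\alpha)\bigr\}.
\]
A CND for a given f is, informally, an additive noise distribution whose induced tradeoff curve matches f exactly; the precise conditions are given in the full paper.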