Fuzzi: A Three-Level Logic for Differential Privacy

05/29/2019
by Hengchu Zhang, et al.

Curators of sensitive datasets sometimes need to know whether queries against the data are differentially private [Dwork et al. 2006]. Two sorts of logics have been proposed for checking this property: (1) type systems and other static analyses, which fully automate straightforward reasoning with concepts like "program sensitivity" and "privacy loss," and (2) full-blown program logics such as apRHL (an approximate, probabilistic, relational Hoare logic) [Barthe et al. 2016], which support more flexible reasoning about subtle privacy-preserving algorithmic techniques but offer only minimal automation. We propose a three-level logic for differential privacy in an imperative setting and present a prototype implementation called Fuzzi. Fuzzi's lowest level is a general-purpose logic; its middle level is apRHL; and its top level is a novel sensitivity logic adapted from the linear-logic-inspired type system of Fuzz, a differentially private functional language [Reed and Pierce 2010]. The key novelty is a high degree of integration between the sensitivity logic and the two lower-level logics: the judgments and proofs of the sensitivity logic can be easily translated into apRHL; conversely, privacy properties of key algorithmic building blocks can be proved manually in apRHL and the base logic, then packaged up as typing rules that can be applied by a checker for the sensitivity logic to automatically construct privacy proofs for composite programs of arbitrary size. We demonstrate Fuzzi's utility by implementing four different private machine-learning algorithms and showing that Fuzzi's checker is able to derive tight sensitivity bounds.
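The abstract's core concepts, "program sensitivity" and "privacy loss," can be illustrated with the classic Laplace mechanism [Dwork et al. 2006]: if a query has sensitivity s (adding or removing one individual changes its output by at most s), then releasing it with Laplace noise of scale s/ε satisfies ε-differential privacy. The sketch below is plain Python for illustration only; it is not Fuzzi's syntax or API, and all names are hypothetical.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    This is the standard building block whose privacy proof a system like
    Fuzzi would package as a typing rule for its sensitivity logic.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverting the CDF of a uniform draw.
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

def count_query(db, predicate):
    # A counting query is 1-sensitive: adding or removing one row
    # changes the count by at most 1.
    return sum(1 for row in db if predicate(row))

# Release a 1-sensitive count with privacy loss epsilon = 0.5.
db = [3, 7, 12, 18, 25]
noisy_count = laplace_mechanism(count_query(db, lambda x: x > 10),
                                sensitivity=1.0, epsilon=0.5)
```

Sensitivity-based reasoning of the kind automated by Fuzz-style type systems then composes such releases mechanically: for example, k sequential releases at ε each incur a total privacy loss of at most k·ε.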


Related research:

- Duet: An Expressive Higher-order Language and Linear Type System for Statically Enforcing Differential Privacy (09/05/2019)
- Pain-Free Random Differential Privacy with Sensitivity Sampling (06/08/2017)
- When differential privacy meets NLP: The devil is in the detail (09/07/2021)
- Graded Hoare Logic and its Categorical Semantics (07/22/2020)
- Bunched Fuzz: Sensitivity for Vector Metrics (02/03/2022)
- Gradual Sensitivity Typing (08/03/2023)
- The Complexity of Verifying Circuits as Differentially Private (11/08/2019)
