DuetSGX: Differential Privacy with Secure Hardware

10/20/2020
by Phillip Nguyen, et al.

Differential privacy offers a formal privacy guarantee for individuals, but many deployments of differentially private systems require a trusted third party (the data curator). We propose DuetSGX, a system that uses secure hardware (Intel's SGX) to eliminate the need for a trusted data curator. Data owners submit encrypted data that can be decrypted only within a secure enclave running the DuetSGX system, ensuring that sensitive data is never available to the data curator. Analysts submit queries written in the Duet language, which is specifically designed for verifying that programs satisfy differential privacy; DuetSGX uses the Duet typechecker to verify that each query satisfies differential privacy before running it. DuetSGX therefore provides the benefits of local and central differential privacy simultaneously: noise is added only to final results, and there is no trusted third party. We have built a proof-of-concept implementation of DuetSGX and released it as open source.
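To make the workflow concrete, the following is a minimal Python sketch of the flow the abstract describes, under simplifying assumptions: the enclave, the encryption, and the Duet typechecker are all modeled by toy stand-ins. The names below (EnclaveStandIn, DuetQuery, typecheck_query, xor_bytes, and so on) are hypothetical and not part of DuetSGX or Duet; the real system uses an Intel SGX enclave and the actual Duet typechecker rather than these substitutes, and the analyst's query would be a Duet program instead of the fixed counting query shown here.

# Hypothetical sketch only -- none of these names come from the DuetSGX codebase.
import json
import random
import secrets
from dataclasses import dataclass


def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy symmetric "encryption" (XOR with a repeating key). A real deployment
    # would use authenticated encryption with a key provisioned to the enclave
    # via remote attestation.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


def laplace_noise(scale: float) -> float:
    # Laplace(0, scale), sampled as the difference of two exponentials.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


@dataclass
class DuetQuery:
    """An analyst's query together with the privacy bound it claims to satisfy."""
    name: str           # in the real system this would be a Duet program
    epsilon: float      # claimed privacy cost
    sensitivity: float  # global sensitivity of the query's numeric result


def typecheck_query(query: DuetQuery) -> bool:
    # Stand-in for the Duet typechecker: accept only queries with a
    # well-formed, positive privacy bound.
    return query.epsilon > 0 and query.sensitivity > 0


class EnclaveStandIn:
    """Models the enclave: it alone holds the decryption key, and only
    differentially private (noised) results are ever released."""

    def __init__(self, epsilon_budget: float):
        self._key = secrets.token_bytes(32)  # never leaves the enclave
        self._records = []
        self._budget = epsilon_budget

    def encrypt_for_enclave(self, record: dict) -> bytes:
        # For simplicity the same key encrypts and decrypts in this sketch; a
        # real data owner would encrypt against an attested enclave key.
        return xor_bytes(json.dumps(record).encode(), self._key)

    def submit(self, ciphertext: bytes) -> None:
        # Data owners upload ciphertexts; the plaintext exists only in here.
        record = json.loads(xor_bytes(ciphertext, self._key).decode())
        self._records.append(record)

    def run(self, query: DuetQuery) -> float:
        # Check the query, charge the privacy budget, then release a single
        # Laplace-noised answer (a hardcoded count, for illustration).
        if not typecheck_query(query):
            raise ValueError("query rejected by the (stand-in) typechecker")
        if query.epsilon > self._budget:
            raise ValueError("privacy budget exhausted")
        self._budget -= query.epsilon
        true_count = sum(1 for r in self._records if r["age"] >= 40)
        return true_count + laplace_noise(query.sensitivity / query.epsilon)


# Example round trip: five data owners, one analyst query.
enclave = EnclaveStandIn(epsilon_budget=1.0)
for age in (23, 41, 57, 38, 62):
    enclave.submit(enclave.encrypt_for_enclave({"age": age}))

query = DuetQuery(name="count_age_40_plus", epsilon=0.5, sensitivity=1.0)
print(round(enclave.run(query), 2))  # the analyst sees only this noisy count

The point of this structure is that the decryption key and the plaintext records live only inside the enclave object, while the analyst interacts with it solely through checked queries and receives only noised results.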


research
02/20/2019

Outis: Crypto-Assisted Differential Privacy on Untrusted Servers

Differential privacy has steadily become the de-facto standard for achie...
research
09/10/2019

A Programming Framework for Differential Privacy with Accuracy Concentration Bounds

Differential privacy offers a formal framework for reasoning about priva...
research
08/18/2022

Verifiable Differential Privacy For When The Curious Become Dishonest

Many applications seek to produce differentially private statistics on s...
research
05/03/2019

Locally Differentially Private Naive Bayes Classification

In machine learning, classification models need to be trained in order t...
research
09/06/2020

Randomness Concerns When Deploying Differential Privacy

The U.S. Census Bureau is using differential privacy (DP) to protect con...
research
03/01/2017

Preserving Differential Privacy Between Features in Distributed Estimation

Privacy is crucial in many applications of machine learning. Legal, ethi...
research
09/13/2023

SHIELD: Secure Haplotype Imputation Employing Local Differential Privacy

We introduce Secure Haplotype Imputation Employing Local Differential pr...
