Robust Optimization for Local Differential Privacy

05/10/2022
by Jasper Goseling, et al.

We consider the setting of publishing data without leaking sensitive information, working in the framework of Robust Local Differential Privacy (RLDP), which guarantees privacy for every distribution of the data in an uncertainty set. We formulate the problem of finding the optimal data release protocol as a robust optimization problem. By deriving closed-form expressions for the duals of the constraints involved, we obtain a convex optimization problem. We compare the performance of four possible optimization problems, depending on whether or not robustness is required in i) utility and ii) privacy.
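To make the setting more concrete, the following is a minimal sketch of the kind of robust formulation the abstract alludes to. It borrows the notation X=(S,U), with sensitive part S, from the related RLDP paper listed below, and it introduces, as assumptions, a release mechanism Q(y|x) producing an output Y, a privacy budget ε, an uncertainty set 𝒫 of data distributions, and an abstract utility function; the paper's exact objective, uncertainty set, and constraints may differ.

\[
\begin{aligned}
\max_{Q}\ \min_{P \in \mathcal{P}} \quad & \mathrm{Utility}(Q, P) \\
\text{s.t.} \quad & \sum_{u} P(u \mid s)\, Q(y \mid s, u) \;\le\; e^{\varepsilon} \sum_{u} P(u \mid s')\, Q(y \mid s', u)
\qquad \forall\, y,\ s,\ s',\ \forall\, P \in \mathcal{P}, \\
& Q(\cdot \mid x) \ \text{a probability distribution for every } x .
\end{aligned}
\]

In this reading, dualizing the "for all P in 𝒫" constraints is what turns the semi-infinite robust program into a finite convex one, as the abstract describes, and dropping either the min over 𝒫 in the objective or the "for all P" quantifier in the privacy constraint yields the four variants with and without robustness in utility and privacy.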

Related Research

01/22/2021: The Privacy-Utility Tradeoff of Robust Local Differential Privacy
We consider data release protocols for data X=(S,U), where S is sensitiv...

02/13/2016: Convex Optimization for Linear Query Processing under Approximate Differential Privacy
Differential privacy enables organizations to collect accurate aggregate...

06/15/2018: Customized Local Differential Privacy for Multi-Agent Distributed Optimization
Real-time data-driven optimization and control problems over networks ma...

04/25/2023: Differential Privacy via Distributionally Robust Optimization
In recent years, differential privacy has emerged as the de facto standa...

09/18/2020: Quickest Change Detection with Privacy Constraint
This paper considers Lorden's minimax quickest change detection (QCD) pr...

02/10/2021: Heuristic Strategies for Solving Complex Interacting Stockpile Blending Problem with Chance Constraints
Heuristic algorithms have shown a good ability to solve a variety of opt...

10/02/2017: Constrained Differential Privacy for Count Data
Concern about how to aggregate sensitive user data without compromising ...
