Bayesian Differential Privacy for Linear Dynamical Systems
Differential privacy is a privacy measure based on how difficult it is to discriminate between similar input data. In differential privacy analysis, "similar" usually means that the distance between two data sets does not exceed a predetermined threshold. Consequently, the analysis does not account for the difficulty of distinguishing data sets that are far apart, which often contain highly private information. This problem has been pointed out in research on differential privacy for static data, and Bayesian differential privacy has been proposed, which provides a privacy protection level even for outlier data by utilizing the prior distribution of the data. In this study, we introduce Bayesian differential privacy to dynamical systems, provide privacy guarantees for input data pairs that are far apart, and reveal its fundamental properties. For example, we design a mechanism that satisfies a desired level of privacy protection, and this design characterizes the trade-off between privacy and information utility.
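To make the setting concrete, the following Python sketch shows a standard output-perturbation (Gaussian) mechanism applied to a linear dynamical system. It is only an illustration under stated assumptions: the system matrices, the adjacency notion (a unit change in one input entry), and the sensitivity-calibrated Gaussian mechanism are hypothetical choices for exposition, not the paper's construction; the Bayesian variant studied in the paper would instead calibrate the noise using a prior distribution over the inputs, so that the guarantee extends to input pairs far apart.

```python
import numpy as np

# Hypothetical system matrices (not from the paper): a stable
# 2-state linear system x_{k+1} = A x_k + B u_k,  y_k = C x_k.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[1.0],
              [0.5]])
C = np.array([[1.0, 0.0]])

def simulate(u, x0=None):
    """Return the noise-free output sequence y_0, ..., y_{T-1} for inputs u."""
    x = np.zeros(A.shape[0]) if x0 is None else x0
    ys = []
    for uk in u:
        ys.append(C @ x)
        x = A @ x + B.flatten() * uk
    return np.concatenate(ys)

def gaussian_mechanism(y, sensitivity, eps, delta, rng):
    """Standard Gaussian mechanism: perturb the output with noise whose
    scale is calibrated to the l2-sensitivity of the output map.
    (A Bayesian mechanism would set sigma from a prior over inputs instead.)
    """
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return y + rng.normal(0.0, sigma, size=y.shape)

rng = np.random.default_rng(0)
u  = np.array([1.0, 0.5, 0.0, -0.5])
u2 = u.copy()
u2[0] += 1.0  # adjacent input: one entry perturbed by a unit amount
# l2-sensitivity of the output map with respect to this adjacency notion
sens = np.linalg.norm(simulate(u2) - simulate(u))
y_private = gaussian_mechanism(simulate(u), sens, eps=1.0, delta=1e-5, rng=rng)
```

With threshold-based adjacency, the noise scale above is driven only by the worst case within the threshold; input pairs farther apart than the threshold receive no stated guarantee, which is exactly the gap the Bayesian formulation addresses.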