I Introduction
Differential privacy (DP) [1]
is one of the most popular privacy notions that have been studied in various areas, including databases, machine learning, geolocations, and social networks. The protection of
DP can be achieved by adding probabilistic noise to the data we want to obfuscate. In particular, many studies have proposed local obfuscation mechanisms [2, 3, 4] that perturb each single “point” datum (e.g., a geolocation point) by adding controlled probabilistic noise before sending it out to a data collector. Recent studies [5, 6, 7] show that local obfuscation mechanisms can be used to hide the probability distributions that lie behind such point data and implicitly represent sensitive attributes (e.g., age, gender, social status). In particular, [6] proposes the notion of distribution privacy (DistP) as the local DP of probability distributions. Roughly, DistP of a local obfuscation mechanism represents that the adversary cannot significantly gain information on the distribution of the mechanism’s input by observing the mechanism’s output. However, since DistP assumes the worst-case risk in the sense of DP, it imposes a strong requirement and might unnecessarily lose the utility of obfuscated data.
In this paper, we relax the notion of DistP by generalizing it to an arbitrary divergence. The basic idea is similar to point privacy notions that relax DP and improve utility by relying on some divergence (e.g., total variation privacy [8], Kullback-Leibler divergence privacy [8, 9], and Rényi differential privacy [10]). We define the notion of divergence distribution privacy by replacing the DP-style indistinguishability with an arbitrary divergence. This relaxation allows us to formalize “on-average” DistP, and to explore privacy notions against an adversary performing the statistical hypothesis test corresponding to the divergence [8]. Furthermore, we propose and investigate local obfuscation mechanisms that provide divergence DistP. Specifically, we consider the following two scenarios:

(i) when we have no knowledge of the input distributions; and (ii) when we have exact or approximate knowledge of the input distributions as auxiliary information.
For the scenario (i), we clarify how much perturbation noise should be added to provide divergence DistP when we use an existing mechanism for obfuscating point data. For the scenario (ii), we introduce a local obfuscation mechanism that provides divergence DistP while optimizing the utility of obfuscated data by using the auxiliary information. Here it should be noted that probability coupling techniques are crucial in constructing divergence DistP mechanisms in both scenarios.
Our contributions. The main contributions are as follows:

We introduce notions of divergence DistP and investigate theoretical properties of distribution obfuscation, especially the relationships between local distribution obfuscation and probability coupling.

We investigate the relationships among various notions of DistP based on divergences, such as the Kullback-Leibler divergence, which models “on-average” risk.

In the scenario (i), we present how much divergence DistP can be achieved by local obfuscation. In particular, by using probability coupling techniques, we prove that the perturbation noise should be added proportionally to the Earth mover’s distance between the input distributions that we want to make indistinguishable.

In the scenario (ii), we propose a local obfuscation mechanism, called a (utility-optimal) coupling mechanism, that provides divergence DistP while minimizing the utility loss. The construction of the mechanism relies on solving an optimal transportation problem using probability coupling.

We theoretically evaluate the divergence DistP and utility loss of coupling mechanisms that can use exact/approximate knowledge on the input distributions.
Paper organization. The rest of this paper is organized as follows. Section II presents notations and background knowledge. Section III introduces the notions of divergence DistP. Section IV investigates important properties of divergence DistP and relationships among privacy notions. Section V shows that, in the scenario (i), a point obfuscation mechanism can provide divergence DistP. Section VI generalizes DistP to use exact/approximate information on the input distribution in the scenario (ii), and proposes a local mechanism that provides DistP while optimizing utility. Section VII discusses related work and Section VIII concludes.
II Preliminaries
In this section we recall some notions of privacy, divergence, and metrics used in this paper.
II-A Notations
Let ℝ≥0 be the set of non-negative real numbers, and let e be the base of the natural logarithm.
We denote by |S| the number of elements in a finite set S, and by 𝔻(S) the set of all probability distributions over a set S. Given a probability distribution λ over a finite set S, the probability of drawing a value x from λ is denoted by λ[x]. For a finite subset T ⊆ S, we define λ[T] = Σ_{x∈T} λ[x]. For a distribution λ over a finite set S, its support is supp(λ) = { x ∈ S : λ[x] > 0 }.
For a randomized algorithm A and a set R of outputs, we denote by A(x)[R] the probability that, given an input x, A outputs one of the elements of R. For a randomized algorithm A and a distribution λ over its inputs, we define A#(λ) as the probability distribution of the output of A when its input is drawn from λ. Formally, the lifting of A is the function A# such that for any λ and any set R of outputs, A#(λ)[R] = Σ_x λ[x] · A(x)[R].
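To make the lifting concrete, here is a minimal sketch on finite sets, with the mechanism given as a dictionary of conditional output distributions (the names and numbers are illustrative, not from the paper):

```python
def lift(mechanism, prior):
    """Lifting A# of a finite mechanism: mechanism[x] = {y: Pr[A(x)=y]}.
    Returns the output distribution A#(prior) as a dict."""
    out = {}
    for x, px in prior.items():
        for y, pxy in mechanism[x].items():
            out[y] = out.get(y, 0.0) + px * pxy
    return out

# illustrative 2x2 mechanism and a uniform prior
A = {"x0": {"y0": 0.9, "y1": 0.1},
     "x1": {"y0": 0.2, "y1": 0.8}}
prior = {"x0": 0.5, "x1": 0.5}
# lift(A, prior) == {"y0": 0.55, "y1": 0.45}
```

The output distribution is exactly the mixture Σ_x λ[x]·A(x)[·], matching the definition above.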
II-B Differential Privacy
Differential privacy [1] is a notion of privacy guaranteeing that we cannot learn which of two “adjacent” inputs is used to generate an output of a randomized algorithm. This notion is parameterized by a degree ε of indistinguishability, a ratio δ of exception, and some adjacency relation Φ over a set of data. The formal definition is given as follows.
Definition 1 (Differential privacy)
A randomized algorithm A provides (ε, δ)-differential privacy (DP) w.r.t. an adjacency relation Φ if for any (x, x′) ∈ Φ and any set R of outputs, A(x)[R] ≤ e^ε · A(x′)[R] + δ,
where the probability is taken over the random choices in .
The protection of DP is stronger for smaller ε and δ.
DP can be achieved by a local obfuscation mechanism or privacy mechanism (illustrated in Fig. 1), namely a randomized algorithm that adds controlled noise probabilistically to given inputs that we want to protect.
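As a concrete illustration of such a local mechanism (a standard example, shown here as a sketch rather than the paper’s own construction), the k-ary randomized response keeps the true value with a boosted probability and otherwise reports a uniformly random other value:

```python
import math
import random

def randomized_response(x, domain, eps):
    """k-ary randomized response: keeps x with probability
    e^eps / (e^eps + k - 1), otherwise reports a uniformly random
    other value; this provides eps-DP in the local model."""
    k = len(domain)
    p_keep = math.exp(eps) / (math.exp(eps) + k - 1)
    if random.random() < p_keep:
        return x
    return random.choice([v for v in domain if v != x])
```

The ratio between the largest and smallest report probability is exactly e^ε, which is what bounds the adversary’s advantage.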
II-C Extended Differential Privacy (XDP)
The notion of DP can be relaxed by incorporating a metric over the set of input data. In [13] Chatzikokolakis et al. propose the notion of d-privacy, an extension of DP to a metric d on input data. Intuitively, this notion guarantees that when two inputs x and x′ are closer in terms of d, the output distributions are less distinguishable. (Compared to DP, XDP provides weaker privacy and higher utility, as it obfuscates closer points less; e.g., [14] shows that the planar Laplace mechanism [3] (with XDP) adds less noise than the randomized response (with DP).) Here we show the definition of this extended DP equipped with d.
Definition 2 (Extended differential privacy)
Let d be a metric over the input data. We say that a randomized algorithm A provides (ε·d)-extended differential privacy (XDP) if for all inputs x, x′ and any set R of outputs, A(x)[R] ≤ e^{ε·d(x,x′)} · A(x′)[R],
where the probability is taken over the random choices in .
To achieve XDP, obfuscation mechanisms should add noise proportionally to the distance d(x, x′) between the two inputs x and x′ that we want to make indistinguishable; hence more noise is required for a larger d(x, x′).
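For instance, the one-dimensional Laplace mechanism (a standard XDP mechanism for the Euclidean metric, sketched here with illustrative parameters) has output density proportional to exp(−ε·|y − x|), so the density ratio for two inputs x, x′ is bounded by exp(ε·|x − x′|):

```python
import math
import random

def laplace_density(y, x, eps):
    """Density of the 1-D Laplace mechanism centered at the input x."""
    return (eps / 2.0) * math.exp(-eps * abs(y - x))

def laplace_mechanism(x, eps):
    """Sample x + Laplace(0, 1/eps) noise, generated as the difference
    of two independent exponential variables."""
    return x + random.expovariate(eps) - random.expovariate(eps)
```

The density-ratio bound is exactly the (ε·d)-XDP condition for d(x, x′) = |x − x′|.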
II-D Distribution Privacy and Extended Distribution Privacy
Distribution privacy (DistP) [6] is a privacy notion that measures how much information on the input distribution is leaked by an output of a randomized algorithm. For example, let λ₀ (resp. λ₁) be a (prior) probability distribution of the locations of the male (resp. female) users. When we observe an output of an obfuscation mechanism A and cannot learn whether the input to A is drawn from λ₀ or λ₁, then we say that A provides DistP w.r.t. (λ₀, λ₁). Formally, DistP is defined as follows.
Definition 3 (Distribution privacy)
Let ε ≥ 0 and δ ∈ [0, 1]. We say that a randomized algorithm A provides (ε, δ)-distribution privacy (DistP) w.r.t. an adjacency relation Ψ over distributions if its lifting A# provides (ε, δ)-DP w.r.t. Ψ, i.e., for all pairs (λ₀, λ₁) ∈ Ψ and any set R of outputs, we have A#(λ₀)[R] ≤ e^ε · A#(λ₁)[R] + δ.
Next we recall an extension [6] of DistP with a metric as follows. Intuitively, this extended notion guarantees that when two input distributions are closer, then the output distributions must be less distinguishable.
Definition 4 (Extended distribution privacy)
Let W be a metric over distributions, ε ≥ 0, and δ ∈ [0, 1]. We say that a mechanism A provides (ε·W, δ)-extended distribution privacy (XDistP) w.r.t. W if the lifting A# provides (ε·W, δ)-XDP w.r.t. W, i.e., for all λ₀, λ₁ and any set R of outputs, we have A#(λ₀)[R] ≤ e^{ε·W(λ₀,λ₁)} · A#(λ₁)[R] + δ.
Analogously to XDP, noise should be added proportionally to the distance W(λ₀, λ₁) between the input distributions.
Instances of f-divergences and their generators:
Divergence                        f(t)
Kullback-Leibler (KL) divergence  t ln t
Reverse KL-divergence             − ln t
Total variation                   |t − 1| / 2
χ²-divergence                     (t − 1)²
Hellinger distance                (√t − 1)² / 2
II-E Divergences
A divergence D over a non-empty set S is a function on pairs of distributions over S such that for all λ₀, λ₁ ∈ 𝔻(S), (i) D(λ₀, λ₁) ≥ 0 and (ii) D(λ₀, λ₁) = 0 iff λ₀ = λ₁. Note that a divergence may be neither symmetric nor subadditive. We denote by Div(S) the set of all divergences over S.
Next we recall the notion of (approximate) max divergence, which can be used to define DP.
Definition 5 (Max divergence)
Let δ ∈ [0, 1] and λ₀, λ₁ ∈ 𝔻(S). Then the δ-approximate max divergence between λ₀ and λ₁ is: D∞^δ(λ₀ ∥ λ₁) = max_{R ⊆ S, λ₀[R] > δ} ln( (λ₀[R] − δ) / λ₁[R] ).
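On a small finite support, the δ-approximate max divergence can be computed by brute force over all events (exponential in the support size, so this is a demo sketch only, assuming the definition D∞^δ(λ₀ ∥ λ₁) = max over events R with λ₀[R] > δ of ln((λ₀[R] − δ)/λ₁[R])):

```python
import math
from itertools import combinations

def approx_max_divergence(p, q, delta=0.0):
    """Brute-force D_inf^delta(p || q) for distributions given as dicts."""
    xs = list(p)
    best = 0.0  # the divergence is non-negative
    for r in range(1, len(xs) + 1):
        for S in combinations(xs, r):
            pS = sum(p[x] for x in S)
            qS = sum(q.get(x, 0.0) for x in S)
            if pS > delta and qS > 0.0:
                best = max(best, math.log((pS - delta) / qS))
    return best
```

For δ = 0 the maximum is attained on a single point, by the mediant inequality, so the result equals the largest pointwise log-ratio.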
We next recall the notion of f-divergences [15]. As shown in the table above, many divergence notions (e.g., the Kullback-Leibler divergence [16]) are instances of f-divergence.
Definition 6 (f-divergence)
Let F be the set of convex functions f : ℝ≥0 → ℝ with f(1) = 0. Let S be a finite set, and λ₀, λ₁ ∈ 𝔻(S) such that for every x ∈ S, λ₁[x] = 0 implies λ₀[x] = 0. Then for an f ∈ F, the f-divergence D_f of λ₀ from λ₁ is defined as: D_f(λ₀ ∥ λ₁) = Σ_{x: λ₁[x]>0} λ₁[x] · f( λ₀[x] / λ₁[x] ).

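This definition can be computed directly on finite supports; the sketch below uses the standard KL and total-variation generators from the table:

```python
import math

def f_divergence(f, p, q):
    """D_f(p || q) = sum_x q[x] * f(p[x]/q[x]); assumes q[x] = 0
    implies p[x] = 0 (absolute continuity), as in the definition."""
    return sum(q[x] * f(p.get(x, 0.0) / q[x]) for x in q if q[x] > 0.0)

f_kl = lambda t: t * math.log(t) if t > 0 else 0.0   # KL-divergence
f_tv = lambda t: abs(t - 1.0) / 2.0                  # total variation
```

With the total-variation generator, the result coincides with half the L1 distance between the two distributions.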
II-F Probability Coupling
We recall the notion of probability coupling as follows.
Example 1 (Coupling as transformation of distributions)
Let us consider two distributions λ₀ and λ₁ shown in Fig. 2. A coupling γ of λ₀ and λ₁ shows a way of transforming λ₀ into λ₁; e.g., the mass γ[x, y] moves from a point x in λ₀ to a point y in λ₁.
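The defining property of a coupling, formalized in Definition 7 below, is that its two marginals are the given distributions; this is easy to check mechanically (a sketch with illustrative numbers):

```python
def is_coupling(gamma, p, q, tol=1e-9):
    """True iff the joint distribution gamma (dict over pairs) has
    p and q as its two marginal distributions."""
    row = {x: 0.0 for x in p}
    col = {y: 0.0 for y in q}
    for (x, y), m in gamma.items():
        row[x] += m
        col[y] += m
    return (all(abs(row[x] - p[x]) < tol for x in p) and
            all(abs(col[y] - q[y]) < tol for y in q))
```

Any joint distribution failing either marginal condition is not a valid transformation of one distribution into the other.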
Formally, a coupling is defined as follows.
Definition 7 (Coupling)
Given λ₀ ∈ 𝔻(X) and λ₁ ∈ 𝔻(Y), a coupling γ of λ₀ and λ₁ is a joint distribution γ ∈ 𝔻(X × Y) such that λ₀ and λ₁ are γ’s marginal distributions, i.e., for each x ∈ X, λ₀[x] = Σ_{y∈Y} γ[x, y], and for each y ∈ Y, λ₁[y] = Σ_{x∈X} γ[x, y]. We denote by cp(λ₀, λ₁) the set of all couplings of λ₀ and λ₁.
II-G Wasserstein Metric
Then we recall the Wasserstein metric [17] between two distributions, which is defined using a coupling as follows.
Definition 8 (Wasserstein metric)
Let d be a metric over a set S, and λ₀, λ₁ ∈ 𝔻(S). The ∞-Wasserstein metric W∞,d and the 1-Wasserstein metric W₁,d w.r.t. d are defined by: W∞,d(λ₀, λ₁) = min_{γ ∈ cp(λ₀,λ₁)} max_{(x,y) ∈ supp(γ)} d(x, y), and W₁,d(λ₀, λ₁) = min_{γ ∈ cp(λ₀,λ₁)} Σ_{x,y} γ[x, y] · d(x, y).
The 1-Wasserstein metric W₁,d is also called the Earth mover’s distance.
The intuitive meaning of W₁,d(λ₀, λ₁) is the minimum cost of transportation from λ₀ to λ₁ in transportation theory. As illustrated in Fig. 2, we regard the distribution λ₀ (resp. λ₁) as a set of points where each point x has weight λ₀[x] (resp. λ₁[x]), and we move some weight in λ₀ from a point to another to construct λ₁. We represent by γ[x, y] the amount of weight moved from x to y. (The total amount of weight moved from a point x in λ₀ is λ₀[x], while the total amount moved into a point y in λ₁ is λ₁[y]; hence γ is a coupling of λ₀ and λ₁.) We denote by d(x, y) the cost (i.e., distance) of a move from x to y. Then the minimum cost of the whole transportation is: W₁,d(λ₀, λ₁) = min_{γ ∈ cp(λ₀,λ₁)} Σ_{x,y} γ[x, y] · d(x, y).
E.g., in Fig. 2, when the cost function d is the Euclidean distance over the plane, the transportation shown there achieves the minimum cost.
We write cp_min(λ₀, λ₁) for the set of all couplings achieving the minimum cost W₁,d(λ₀, λ₁). A minimum-cost coupling can be efficiently computed by the North-West corner rule [18] when d is submodular. (A cost d is submodular if d(x, y) + d(x′, y′) ≤ d(x, y′) + d(x′, y) whenever x ≤ x′ and y ≤ y′.)
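A sketch of the North-West corner rule on numeric supports: with sorted supports and the absolute-value cost |x − y| (which is submodular), the greedy coupling attains the Earth mover’s distance (illustrative toy distributions):

```python
def north_west_corner(p, q):
    """Greedy North-West corner coupling of p and q (dicts over numbers):
    walk both sorted supports, always moving as much remaining mass as
    possible from the current source point to the current target point."""
    xs, ys = sorted(p), sorted(q)
    gamma = {}
    i = j = 0
    rp, rq = p[xs[0]], q[ys[0]]   # remaining mass at current points
    while i < len(xs) and j < len(ys):
        m = min(rp, rq)
        gamma[(xs[i], ys[j])] = m
        rp -= m
        rq -= m
        if rp <= 1e-12:
            i += 1
            if i < len(xs):
                rp = p[xs[i]]
        if rq <= 1e-12:
            j += 1
            if j < len(ys):
                rq = q[ys[j]]
    return gamma

def earth_movers_distance(p, q):
    """Expected transport cost of the greedy coupling, cost |x - y|."""
    return sum(m * abs(x - y) for (x, y), m in north_west_corner(p, q).items())
```

For example, shifting a two-point uniform distribution one step to the right costs exactly 1 under this rule.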
III Divergence Distribution Privacy
In this section we introduce new definitions of distribution privacy generalized to an arbitrary divergence D. The main motivation is to discuss distribution privacy based on f-divergences, especially the Kullback-Leibler divergence, which models “on-average” risk.
III-A Divergence DP and Divergence XDP
To generalize distribution privacy notions, we first present a generalized formulation of point privacy parameterized with a divergence D. Intuitively, we say that a randomized algorithm A provides (D, ε)-DP if, in terms of the divergence D, an adversary cannot distinguish the input to A by observing an output of A.
Definition 9 (Divergence DP w.r.t. an adjacency relation)
For an adjacency relation Φ and a divergence D, we say that a randomized algorithm A provides (D, ε)-DP w.r.t. Φ if for all (x, x′) ∈ Φ, we have D(A(x), A(x′)) ≤ ε.
Note that some instances of divergence DP are known in the literature. In [8], f-divergence DP is called divergence privacy, KL-divergence DP (KLP) is called KL-privacy, and total-variation DP is called total variation privacy. Furthermore, max-divergence DP is equivalent to DP, since it is known that DP can be defined using the approximate max divergence as follows:
Proposition 1
A randomized algorithm A provides (ε, δ)-DP w.r.t. Φ iff for any (x, x′) ∈ Φ, D∞^δ(A(x) ∥ A(x′)) ≤ ε and D∞^δ(A(x′) ∥ A(x)) ≤ ε.
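Proposition 1 suggests a direct check on finite mechanisms: for δ = 0, the smallest ε is the largest absolute log-ratio of output probabilities over adjacent inputs. A sketch (the mechanism is given as a dict of conditional distributions; the example values are illustrative):

```python
import math

def dp_epsilon(mechanism, adjacent_pairs):
    """Smallest eps such that the finite mechanism satisfies (eps, 0)-DP
    w.r.t. the given adjacency relation; math.inf if a ratio is unbounded."""
    eps = 0.0
    for x1, x2 in adjacent_pairs:
        for y in set(mechanism[x1]) | set(mechanism[x2]):
            a = mechanism[x1].get(y, 0.0)
            b = mechanism[x2].get(y, 0.0)
            if (a > 0.0) != (b > 0.0):
                return math.inf   # one side has zero probability: no finite eps
            if a > 0.0 and b > 0.0:
                eps = max(eps, abs(math.log(a / b)))
    return eps
```

For a binary randomized response with keep-probability 0.75, the check recovers ε = ln 3.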
Next we generalize the notion of extended differential privacy (XDP) to an arbitrary divergence as follows.
Definition 10 (Divergence XDP)
Let d be a metric over the input data, D a divergence, and ε ≥ 0. We say that a randomized algorithm A provides (D, ε·d)-XDP w.r.t. d if for all inputs x, x′, D(A(x), A(x′)) ≤ ε · d(x, x′).
These notions will be used to define (extended) divergence distribution privacy in the next section.
III-B Divergence DistP and Divergence XDistP
In this section we generalize the notion of (extended) distribution privacy to an arbitrary divergence D. The main aim of the generalization is to present theoretical properties of distribution privacy in a more general form, and also to discuss distribution privacy based on f-divergences.
Intuitively, we say that a randomized algorithm A provides distribution privacy w.r.t. a set Ψ of pairs of distributions if for each pair (λ₀, λ₁) ∈ Ψ, the divergence cannot distinguish which distribution (of λ₀ and λ₁) is used to generate A’s input value.
Definition 11 (Divergence DistP)
Let Ψ be a set of pairs of distributions, D a divergence, and ε ≥ 0. We say that a randomized algorithm A provides (D, ε)-distribution privacy (DistP) w.r.t. Ψ if the lifting A# provides (D, ε)-DP w.r.t. Ψ, i.e., for all (λ₀, λ₁) ∈ Ψ, D(A#(λ₀), A#(λ₁)) ≤ ε.
As with the generalization of DP to divergence DP [8], (D, ε)-DistP expresses privacy against an adversary performing the hypothesis test corresponding to the divergence D. When D involves averaging (e.g., the KL-divergence), DistP formalizes “on-average” privacy, which relaxes the original DistP.
Next we introduce XDistP parameterized with a divergence D. Intuitively, XDistP with a divergence D guarantees that when two input distributions λ₀ and λ₁ are closer (in terms of a metric over distributions), then the output distributions A#(λ₀) and A#(λ₁) must be less distinguishable (in terms of D).
Definition 12 (Divergence XDistP)
Let W be a metric over distributions, D a divergence, and ε ≥ 0. We say that a randomized algorithm A provides (D, ε·W)-extended distribution privacy (XDistP) w.r.t. W if the lifting A# provides (D, ε·W)-XDP w.r.t. W, i.e., for all λ₀, λ₁, D(A#(λ₀), A#(λ₁)) ≤ ε · W(λ₀, λ₁).
IV Properties of Divergence Distribution Privacy
In this section we show useful properties of divergence distribution privacy, such as compositionality and relationships among distribution privacy notions.
IV-A Basic Properties of Divergence Distribution Privacy
In Tables II and III we summarize the results on two kinds of sequential composition (Fig. 2(a) and Fig. 2(b)), post-processing, and pre-processing, for divergence DistP and for divergence XDistP, respectively. We present the details and proofs of these results in Appendices D, E, and F.
The two kinds of composition have been studied in previous work (e.g., [19, 6]). For two mechanisms A₀ and A₁, the first composition means that an identical input value is given to the two DistP mechanisms A₀ and A₁, whereas the second means that independent inputs are provided to the mechanisms. Note that this kind of composition is adaptive in the sense that the output of A₁ can depend on that of A₀. Hence the compositionality does not hold in general for an arbitrary divergence, whereas we show the compositionality for the KL-divergence in Tables II and III. For non-adaptive sequential composition, the compositionality of divergence DistP/XDistP follows straightforwardly from [20], which shows the compositionality of popular divergences, including total variation and Hellinger distance.
As for preprocessing, we use the following definition of stability [6], which is analogous to the stability for DP.
Definition 13 (Stability)
Let Ψ be an adjacency relation over distributions, c ∈ ℕ, and W a metric over distributions. A transformation T over distributions is (c, Ψ)-stable if for any (λ₀, λ₁) ∈ Ψ, T(λ₁) can be reached from T(λ₀) in at most c steps over Ψ. Analogously, T is (c, W)-stable if for any λ₀, λ₁, W(T(λ₀), T(λ₁)) ≤ c · W(λ₀, λ₁).
[Tables II and III: preservation of divergence DistP and divergence XDistP under the two kinds of sequential composition, post-processing, and pre-processing by stable transformations; the privacy parameters degrade accordingly in each case.]
IV-B Relationships among Distribution Privacy Notions
V Local Mechanisms for Divergence Distribution Privacy
In this section we present to what degree divergence DistP/XDistP can be achieved by local obfuscation. Specifically, we show how divergence DP mechanisms contribute to the obfuscation of probability distributions. To prove these results, we use the notion of probability coupling.
V-A Divergence DistP by Local Obfuscation
We first show that divergence DP mechanisms provide divergence DistP. To present this formally, we recall the notion of the lifting of relations as follows.
Definition 14 (Lifting of relations)
Given a relation Φ over inputs, the lifting of Φ is the maximum relation Φ# over distributions such that for any (λ₀, λ₁) ∈ Φ#, there exists a coupling γ ∈ cp(λ₀, λ₁) satisfying supp(γ) ⊆ Φ.
Intuitively, when λ₀ and λ₁ are adjacent w.r.t. the lifted relation Φ#, then we can construct λ₁ from λ₀ according to the coupling γ, that is, only by moving mass from x to x′ where (x, x′) ∈ Φ (i.e., x is adjacent to x′). Note that by Definition 7, the coupling γ is a probability distribution over pairs whose marginal distributions are λ₀ and λ₁. If Φ relates all pairs of inputs, then Φ# relates all pairs of distributions.
Now we show that every divergence DP mechanism provides divergence DistP as follows. (See Appendix A for the proof.)
Theorem 1 (DP ⇒ DistP)
Let D be a divergence. If a randomized algorithm A provides (D, ε)-DP w.r.t. Φ, then it provides (D, ε)-DistP w.r.t. Φ#.
Intuitively, a (D, ε)-DP mechanism makes any pair of input distributions in Φ# indistinguishable in terms of D up to the threshold ε.
V-B Divergence XDistP by Local Obfuscation
Next we investigate how much noise should be added for local obfuscation mechanisms to provide divergence XDistP.
We first consider two point distributions λ₀ and λ₁ concentrated at x₀ and x₁, respectively. Then an (ε·d)-XDP mechanism A satisfies: D∞(A#(λ₀) ∥ A#(λ₁)) = D∞(A(x₀) ∥ A(x₁)) ≤ ε · d(x₀, x₁).
Hence the noise added by A should be proportional to the distance d(x₀, x₁) between x₀ and x₁.
To generalize this observation on point distributions to arbitrary distributions, we need to employ some metric between distributions. As the metric, we could use the diameter over the supports, defined by diam(λ₀, λ₁) = max{ d(x₀, x₁) : x₀ ∈ supp(λ₀), x₁ ∈ supp(λ₁) }, or the ∞-Wasserstein metric W∞,d, which is used for XDistP in [6]. However, when there is an outlier in λ₀ or λ₁, both the diameter and W∞,d tend to be large. Since the mechanism needs to add noise proportionally to this distance to achieve XDistP, it needs to add a large amount of noise and thus loses utility significantly. To obtain better utility, we instead employ the Earth mover’s distance (the 1-Wasserstein metric W₁,d) as the metric for XDistP mechanisms. Given two distributions λ₀ and λ₁, we consider a transportation from λ₀ to λ₁ that minimizes the expected cost of the transportation. Then the minimum expected cost is given by the Earth mover’s distance W₁,d(λ₀, λ₁).
Now we show that, to achieve XDistP, we only have to add noise proportionally to the Earth mover’s distance between the input distributions. To formalize this, we define a lifted relation as the maximum relation over distributions such that for any related pair (λ₀, λ₁), there is a coupling γ ∈ cp(λ₀, λ₁) witnessing both the adjacency of the supports and the transportation cost.
Theorem 2 (XDP ⇒ XDistP)
Let d be a metric. If a randomized algorithm A provides (ε·d)-XDP, then it provides (ε·W₁,d)-XDistP w.r.t. the lifted relation.
See Appendix A for the proof. Since the Earth mover’s distance W₁,d is not greater than the diameter or the ∞-Wasserstein distance, W₁,d-XDistP may require less noise than W∞,d-XDistP.
VI Local Distribution Obfuscation with Auxiliary Inputs
In this section we introduce a local obfuscation mechanism, which we call a coupling mechanism, in order to provide distribution privacy while optimizing utility. Specifically, a coupling mechanism uses (full or approximate) knowledge of the input probability distributions to perturb each single input value so that the output distribution becomes indistinguishable from some target probability distribution. To define the mechanism, we calculate a probability coupling of each input distribution and the target distribution.
VI-A Privacy Definitions with Auxiliary Inputs
We first extend the definition of divergence DistP so that a local obfuscation mechanism can receive some auxiliary input (e.g., context information) ranging over a set of auxiliary values, which the mechanism might use to apply different randomized algorithms in different situations or to different input distributions.
Definition 15 (Divergence DistP with auxiliary inputs)
Let Ψ be a set of pairs of distributions tagged with auxiliary inputs, D a divergence, and ε ≥ 0. We say that a randomized algorithm A with auxiliary inputs provides (D, ε)-distribution privacy w.r.t. Ψ if for all pairs in Ψ, the divergence D between the corresponding lifted output distributions is at most ε.
In this definition, the auxiliary input typically represents contextual information about where the obfuscation mechanism is used or what distribution an input is sampled from. Such information may be useful to customize the mechanism to improve utility while providing distribution privacy in specific situations. For example, assume that each auxiliary input a represents the fact that an input is sampled from a distribution λ_a. If a local mechanism uses this auxiliary information to always produce the same distribution of outputs (if the mechanism could use no auxiliary information but still wanted to produce a fixed output distribution, the output would need to be independent of the input, hence very poor utility), it can prevent the leakage of information on the input distribution λ_a. We elaborate on this in the next sections.
VI-B Coupling Mechanisms
In this section we introduce a new local obfuscation mechanism, which we call a coupling mechanism. The aim of the new mechanism is to improve utility while protecting distribution privacy when we know the input distribution fully or approximately. Intuitively, a coupling mechanism uses (full or partial) information on the input distribution and produces output values following some fixed distribution μ, which we call a target distribution. More specifically, given some auxiliary information about the input distribution, a coupling mechanism probabilistically maps each input value to an output value so that the outputs are distributed according to the target distribution μ.
The simplest construction of a coupling mechanism would be to randomly sample a value from μ independently of the input. However, this mechanism provides very poor utility, since the output loses all information on the input.
Instead, we construct a mechanism by calculating a coupling that transforms the input distribution into μ with the minimum loss. We explain this using a simple example below.
Example 2 (Coupling mechanism)
A coupling γ of two distributions λ and μ (Fig. 1(b)) shows a way of transforming λ into μ by probabilistically adding noise to each single input value drawn from λ. More specifically, γ[x, y] is the portion of the mass λ[x] that moves from the input x to the output y. Based on this coupling γ, we construct the coupling mechanism that maps an input x to an output y with probability γ[x, y] / λ[x]. By applying this mechanism to the input distribution λ, the resulting output distribution is identical to μ.
Formally, we assume that for each auxiliary input a, we learn that the input distribution is approximately λ̃_a, while the actual distribution is λ_a. Then we define the coupling mechanism as follows.
Definition 16 (Coupling mechanism)
Let μ be a target distribution. For each auxiliary input a, let λ̃_a be an approximate input distribution, and γ_a be a coupling of λ̃_a and μ. Then a coupling mechanism w.r.t. μ is defined as a randomized algorithm A such that given an input x and an auxiliary input a, A outputs y with the probability: γ_a[x, y] / λ̃_a[x].
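A sketch of this mechanism as a sampler (illustrative numbers): given an input x assumed drawn from the approximate distribution, it outputs y with conditional probability γ[x, y]/λ̃[x], so that the output marginal is the target distribution.

```python
import random

def coupling_mechanism(x, gamma, p_approx):
    """Output y with probability gamma[(x, y)] / p_approx[x]."""
    ys, ws = [], []
    for (xx, y), m in gamma.items():
        if xx == x:
            ys.append(y)
            ws.append(m / p_approx[x])
    return random.choices(ys, weights=ws)[0]
```

Summing γ[x, y] over x recovers the target marginal, which is why the output distribution equals the target when the approximate knowledge is exact.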
When A can access the exact information on the input distribution (i.e., λ̃_a is identical to the actual distribution λ_a from which inputs are sampled), then A provides (D, 0)-DistP for any divergence D, i.e., no information on the input distribution is leaked by the output of A. However, we often obtain only approximate information on the input distribution. In this case, A still provides strong privacy, as shown in the next section.
VI-C Distribution Privacy of Coupling Mechanisms
In this section we evaluate the DistP and utility of coupling mechanisms. (See Appendix C for the proof.)
Theorem 3 (DistP of the coupling mechanism)
Let Ψ be such that each element is a pair of input distributions tagged with their auxiliary inputs. Let A be a coupling mechanism w.r.t. a target distribution μ. Assume that for each a, the approximate knowledge λ̃_a is close to the actual distribution λ_a, in the sense that their max divergence and total variation are bounded. Then A provides divergence DistP w.r.t. Ψ for each of the divergences considered, with a privacy parameter determined by the corresponding bound on the gap between λ̃_a and λ_a.
This theorem implies that when the mechanism learns the exact distribution, i.e., λ̃_a = λ_a, then it provides (D, 0)-DistP; hence no information on the input distributions is leaked. When the knowledge is only approximate, the privacy parameters degrade gracefully with the gap between λ̃_a and λ_a; hence A provides approximate DistP.
VI-D Utility-Optimal Coupling Mechanisms
In this section we introduce a utility-optimal coupling mechanism. Here we assume that there is some metric d over the data. Then the notion of utility loss of a local obfuscation mechanism is defined as follows.
Definition 17 (Expected utility loss)
Given an input distribution λ and a metric d over the data, the expected utility loss of a randomized algorithm A is: L(A) = Σ_x λ[x] Σ_y A(x)[y] · d(x, y).
The utility loss of a coupling mechanism depends on the choice of the coupling used in the mechanism. Given a metric d (e.g., the Euclidean distance) and an input distribution λ̃_a, the expected utility loss of a coupling mechanism w.r.t. a target distribution μ using a coupling γ_a is Σ_{x,y} γ_a[x, y] · d(x, y).
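The expected loss can be computed directly for a finite mechanism; for a coupling mechanism the double sum collapses to the cost of the coupling itself. A sketch with an absolute-value metric and illustrative numbers:

```python
def expected_loss(p, mechanism, d):
    """E over x ~ p and y ~ mechanism(x) of d(x, y), with the mechanism
    given as conditional distributions mechanism[x] = {y: probability}."""
    return sum(p[x] * sum(pr * d(x, y) for y, pr in mechanism[x].items())
               for x in p)
```

For example, a mechanism that flips input 0 to output 1 with probability 0.1 under a uniform prior incurs an expected loss of 0.05 with d(x, y) = |x − y|.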
Now we define the coupling mechanism that minimizes the expected utility loss as follows.
Definition 18 (Utility-optimal coupling mechanism)
Let μ be a target distribution. A utility-optimal coupling mechanism w.r.t. μ is a coupling mechanism w.r.t. μ that uses a minimum-cost coupling γ_a ∈ cp_min(λ̃_a, μ) for each auxiliary input a.
Proposition 2 (Loss of the coupling mechanism)
For each auxiliary input a, the expected utility loss of a utility-optimal coupling mechanism w.r.t. a target distribution μ is given by the Earth mover’s distance W₁,d(λ̃_a, μ).
The proof is straightforward from the definition of the Earth mover’s distance. Note that, as mentioned in Section II-G, a minimum-cost coupling can be efficiently calculated by the North-West corner rule when d is submodular.
Analogously, we could define a coupling mechanism that minimizes the maximum loss by using, for each a, a coupling minimizing the worst-case cost of a single move. Then the worst-case utility loss is given by the ∞-Wasserstein metric W∞,d(λ̃_a, μ).
VII Related Work
Since the seminal work of Dwork [1] on differential privacy (DP), many variants have been studied to provide different types of privacy guarantees; e.g., d-privacy [13], f-divergence privacy [20, 8], mutual-information DP [9], concentrated DP [21], Rényi DP [10], binary/ternary DP [22], Pufferfish privacy [23], Bayesian DP [24], local DP [2], personalized DP [25], and utility-optimized local DP [26]. All of these are intended to protect single input values instead of input distributions.
A few studies have explored the privacy of distributions. Jelasity et al. [5] propose distributional DP to protect the privacy of distribution parameters in a Bayesian style (unlike DP and DistP). Kawamoto et al. [6] propose the DistP notion in a DP style. Geumlek et al. [7] propose profile-based privacy, a variant of DistP that allows the mechanisms to depend on perfect knowledge of the input distributions. However, these studies deal only with the worst-case risk; they neither relax it to the average-case risk (with f-divergences) nor allow the use of arbitrary auxiliary information (even though available information on input distributions is often only approximate).
There have been many studies (e.g., [27]) on the DP of histogram publishing, which differs from DistP as follows. Histogram publishing is a central mechanism that hides a single record and outputs an obfuscated histogram, whereas a DistP mechanism is a local mechanism that aims at hiding an input distribution and outputs a single perturbed value. As explained in [6], neither of these implies the other.
VIII Conclusion
We introduced the notions of divergence DistP and presented their useful theoretical properties in a general form. By using probability coupling techniques, we showed how much divergence DistP can be achieved by local obfuscation. In particular, we proved that the perturbation noise should be added proportionally to the Earth mover’s distance between the input distributions. We also proposed a local mechanism called a (utility-optimal) coupling mechanism and theoretically evaluated its DistP and utility loss in the presence of (exact or approximate) knowledge of the input distributions.
As for future work, we are planning to develop various kinds of coupling mechanisms for specific applications, such as location privacy.
References
 [1] C. Dwork, “Differential privacy,” in Proc. ICALP, 2006, pp. 1–12.
 [2] J. C. Duchi, M. I. Jordan, and M. J. Wainwright, “Local privacy and statistical minimax rates,” in Proc. FOCS, 2013, pp. 429–438.
 [3] M. E. Andrés, N. E. Bordenabe, K. Chatzikokolakis, and C. Palamidessi, “Geo-indistinguishability: differential privacy for location-based systems,” in Proc. CCS. ACM, 2013, pp. 901–914.
 [4] Ú. Erlingsson, V. Pihur, and A. Korolova, “RAPPOR: Randomized aggregatable privacy-preserving ordinal response,” in Proc. CCS, 2014, pp. 1054–1067.
 [5] M. Jelasity and K. P. Birman, “Distributional differential privacy for large-scale smart metering,” in Proc. IH&MMSec, 2014, pp. 141–146.
 [6] Y. Kawamoto and T. Murakami, “Local obfuscation mechanisms for hiding probability distributions,” in Proc. ESORICS, 2019, to appear.
 [7] J. Geumlek and K. Chaudhuri, “Profile-based privacy for locally private computations,” CoRR, vol. abs/1903.09084, 2019.
 [8] R. F. Barber and J. C. Duchi, “Privacy and statistical risk: Formalisms and minimax bounds,” CoRR, vol. abs/1412.4451, 2014.
 [9] P. Cuff and L. Yu, “Differential privacy as a mutual information constraint,” in Proc. CCS, 2016, pp. 43–54.
 [10] I. Mironov, “Rényi differential privacy,” in Proc. CSF, 2017, pp. 263–275.
 [11] D. Yang, D. Zhang, and B. Qu, “Participatory cultural mapping based on collective behavior data in location based social networks,” ACM Transactions on Intelligent Systems and Technology, vol. 7, no. 3, pp. 30:1–30:23, 2015.
 [12] D. Yang, B. Qu, and P. Cudré-Mauroux, “Privacy-preserving social media data publishing for personalized ranking-based recommendation,” IEEE Trans. Knowl. Data Eng., vol. 31, no. 3, pp. 507–520, 2019.
 [13] K. Chatzikokolakis, M. E. Andrés, N. E. Bordenabe, and C. Palamidessi, “Broadening the scope of Differential Privacy using metrics,” in Proc. PETS, 2013, pp. 82–102.
 [14] M. S. Alvim, K. Chatzikokolakis, C. Palamidessi, and A. Pazii, “Invited paper: Local differential privacy on metric spaces: Optimizing the tradeoff with utility,” in Proc. CSF, 2018, pp. 262–267.
 [15] I. Csiszar, “Information measures of difference of probability distributions and indirect observations,” Studia Sci. Math. Hungar., vol. 2, pp. 299–318, 1967.
 [16] S. Kullback and R. A. Leibler, “On information and sufficiency,” Ann. Math. Statist., vol. 22, no. 1, pp. 79–86, 03 1951.
 [17] L. Vaserstein, “Markovian processes on countable space product describing large systems of automata.” Probl. Peredachi Inf., vol. 5, no. 3, pp. 64–72, 1969.
 [18] A. J. Hoffman, “On simple linear programming problems,” in Convexity: Proceedings of the Seventh Symposium in Pure Mathematics of the American Mathematical Society, 1963, vol. 7, p. 317.
 [19] Y. Kawamoto, K. Chatzikokolakis, and C. Palamidessi, “On the compositionality of quantitative information flow,” Log. Methods Comput. Sci., vol. 13, no. 3, 2017.
 [20] G. Barthe and F. Olmedo, “Beyond differential privacy: Composition theorems and relational logic for f-divergences between probabilistic programs,” in Proc. ICALP, ser. LNCS, vol. 7966, 2013, pp. 49–60.
 [21] C. Dwork and G. N. Rothblum, “Concentrated differential privacy,” CoRR, vol. abs/1603.01887, 2016.
 [22] Y. Wang, B. Balle, and S. P. Kasiviswanathan, “Subsampled Rényi differential privacy and analytical moments accountant,” in Proc. AISTATS, 2019, pp. 1226–1235.
 [23] D. Kifer and A. Machanavajjhala, “A rigorous and customizable framework for privacy,” in Proc. PODS, 2012, pp. 77–88.
 [24] B. Yang, I. Sato, and H. Nakagawa, “Bayesian differential privacy on correlated data,” in Proc. SIGMOD, 2015, pp. 747–762.
 [25] Z. Jorgensen, T. Yu, and G. Cormode, “Conservative or liberal? Personalized differential privacy,” in Proc. ICDE’15, 2015, pp. 1023–1034.
 [26] T. Murakami and Y. Kawamoto, “Utility-optimized local differential privacy mechanisms for distribution estimation,” in Proc. USENIX Security, 2019, to appear.
 [27] J. Xu, Z. Zhang, X. Xiao, Y. Yang, G. Yu, and M. Winslett, “Differentially private histogram publication,” VLDB J., vol. 22, no. 6, pp. 797–822, 2013.