A Privacy Preserving Randomized Gossip Algorithm via Controlled Noise Insertion

January 27, 2019 · by Filip Hanzely, et al.

In this work we present a randomized gossip algorithm for solving the average consensus problem while at the same time protecting the information about the initial private values stored at the nodes. We give iteration complexity bounds for the method and perform extensive numerical experiments.


1 Introduction

In this paper we consider the average consensus (AC) problem. Let $\mathcal{G} = (\mathcal{V}, \mathcal{E})$ be an undirected connected network with node set $\mathcal{V} = \{1, 2, \dots, n\}$ and edge set $\mathcal{E}$ such that $|\mathcal{E}| = m$. Each node $i \in \mathcal{V}$ "knows" a private value $c_i \in \mathbb{R}$. The goal of AC is for every node of the network to compute the average of these values, $\bar{c} := \frac{1}{n} \sum_{i} c_i$, in a distributed fashion. That is, the exchange of information can only occur between connected nodes (neighbours).

The literature on distributed protocols for solving the average consensus problem is vast and has a long history tsitsiklis1984problems ; tsitsiklis1986distributed ; bertsekas1989parallel ; kempe2003gossip . In this work we focus on one of the most popular classes of methods for solving average consensus, randomized gossip algorithms, and propose a gossip algorithm that protects the information about the initial values $c_i$ in the case when these may be sensitive. In particular, we develop and analyze a privacy preserving variant of the randomized pairwise gossip algorithm ("randomly pick an edge $(i, j) \in \mathcal{E}$ and then replace the values stored at vertices $i$ and $j$ by their average"), first proposed in boyd2006randomized for solving the average consensus problem; a sketch of this baseline update is given below. While we shall not formalize the notion of privacy preservation in this work, it will be intuitively clear that our methods indeed make it harder for nodes to infer information about the private values of other nodes, which might be useful in practice.
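To make the baseline concrete, the following minimal Python sketch (our own illustration, not code from the paper; the function name and the edge-list graph representation are our assumptions) implements the pairwise update just described.

import random

def standard_pairwise_gossip(c, edges, num_iters, seed=0):
    """Baseline randomized pairwise gossip of boyd2006randomized (sketch).

    c      : list of private initial values, one per node
    edges  : list of undirected edges (i, j) with 0-based node indices
    returns: the vector of node values after num_iters iterations
    """
    rng = random.Random(seed)
    x = list(c)  # nodes only ever exchange values with neighbours
    for _ in range(num_iters):
        i, j = rng.choice(edges)        # pick an edge uniformly at random
        avg = (x[i] + x[j]) / 2.0       # the two endpoints average their values
        x[i], x[j] = avg, avg
    return x

With enough iterations, every coordinate of the returned vector approaches the average of the initial values.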

1.1 Related Work on Privacy Preserving Average Consensus

The introduction of notions of privacy within the AC problem is relatively recent in the literature, and the existing works consider two different ideas. In huang2012differentially , the concept of differential privacy dwork2014algorithmic is used to protect the output value computed by all nodes. In that work, an exponentially decaying Laplacian noise is added to the consensus computation. This notion of privacy refers to protection of the final average, and formal guarantees are provided. A different line of work, with a stricter goal, is the design of privacy-preserving average consensus protocols that guarantee protection of the initial values of the nodes nozari2017differentially ; manitara2013privacy ; mo2017privacy . In this setting each node should be unable to infer much about the initial values of any other node. In the existing works, this is mainly achieved by a clever addition of noise throughout the iterative procedure that preserves privacy while still converging to the exact average. We should mention, however, that none of these works address any specific notion of privacy (no clear measure of privacy is presented), and it is still not clear how the formal concept of differential privacy dwork2014algorithmic can be applied in this setting.

1.2 Main Contributions

In this work, we present the first randomized gossip algorithm for solving the Average Consensus problem while at the same time protecting the information about the initial values. To the best of our knowledge, this work is the first to combine the asynchronous gossip framework with the privacy concept of protection of the initial values. Note that all the previously mentioned privacy preserving average consensus papers propose protocols which work in the synchronous setting (all nodes update their values simultaneously).

The convergence analysis of the proposed gossip protocol (Algorithm 1) is dual in nature. The dual approach is explained in detail in Section 2. It was first proposed for solving linear systems in SDA ; loizou2017momentum and then extended to average consensus problems in LoizouRichtarik ; loizou2018accelerated . The dual updates immediately correspond to updates of the primal variables via an affine mapping.

Algorithm 1 is inspired by the works of manitara2013privacy ; mo2017privacy , and protects the initial values by inserting noise into the process. Broadly speaking, in each iteration, each of the two sampled nodes first adds noise to its current value, and an average is computed afterwards. Convergence is guaranteed due to the correlation of the noise across iterations. Each node remembers the noise it added the last time it was sampled; in the following iteration, the previously added noise is first subtracted, and fresh noise of smaller magnitude is added. Empirically, the protection of initial values is provided by first injecting noise into the system, which propagates across the network, but is gradually withdrawn to ensure convergence to the true average.

2 Technical Preliminaries

Primal and Dual Problems

Consider solving the (primal) problem of projecting a given vector $c \in \mathbb{R}^n$ onto the solution space of a linear system:

\min_{x \in \mathbb{R}^n} \tfrac{1}{2} \|x - c\|^2 \quad \text{subject to} \quad Ax = b, \qquad (1)

where $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $c \in \mathbb{R}^n$. We assume the problem is feasible, i.e., that the system $Ax = b$ is consistent. With the above optimization problem we associate the dual problem

\max_{y \in \mathbb{R}^m} \; (b - Ac)^\top y - \tfrac{1}{2} \|A^\top y\|^2. \qquad (2)

The dual is an unconstrained concave (but not necessarily strongly concave) quadratic maximization problem. It can be seen that as soon as the system $Ax = b$ is feasible, the dual problem is bounded. Moreover, all bounded concave quadratics in $y \in \mathbb{R}^m$ can be written in the form (2) for some matrix $A$ and vectors $b$ and $c$ (up to an additive constant).

With any dual vector $y$ we associate the primal vector via the affine transformation $x(y) := c + A^\top y$. It can be shown that if $y^*$ is dual optimal, then $x(y^*)$ is primal optimal. Hence, any dual algorithm producing a sequence of dual variables $y^t$ gives rise to a corresponding primal algorithm producing the sequence $x^t := x(y^t)$. See SDA ; loizou2017momentum for the correspondence between primal and dual methods.
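As a quick sanity check of this correspondence, the affine mapping can be obtained from the Lagrangian of (1); the following derivation is our own sketch, under the reconstructed forms of (1) and (2) above.

% Our sketch: the affine primal-dual mapping from the Lagrangian of (1).
\begin{align*}
L(x, y) &= \tfrac{1}{2}\|x - c\|^2 + y^\top (b - Ax) && \text{(Lagrangian of (1))} \\
\nabla_x L(x, y) &= x - c - A^\top y, \qquad \nabla_x L = 0 \;\Longrightarrow\; x(y) = c + A^\top y && \text{(stationarity in } x\text{)} \\
L(x(y), y) &= (b - Ac)^\top y - \tfrac{1}{2}\|A^\top y\|^2 && \text{(the dual objective (2))}
\end{align*}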

Randomized Gossip Setup: Choosing $A$ and $b$.

In the gossip framework we wish the system $Ax = b$ to be an average consensus (AC) system.

Definition 1.

(LoizouRichtarik ) Let $\mathcal{G} = (\mathcal{V}, \mathcal{E})$ be an undirected graph with $|\mathcal{V}| = n$ and $|\mathcal{E}| = m$. Let $A$ be a real matrix with $n$ columns. The linear system $Ax = b$ is an "average consensus (AC) system" for graph $\mathcal{G}$ if $Ax = b$ holds iff $x_i = x_j$ for all $(i, j) \in \mathcal{E}$.

In the rest of this paper we focus on a specific AC system: one in which the matrix $A$ is the incidence matrix of the graph $\mathcal{G}$ (see Model 1 in SDA ). In particular, we let $A \in \mathbb{R}^{m \times n}$ be the matrix defined as follows. Row $e = (i, j) \in \mathcal{E}$ of $A$ is given by $A_{ei} = 1$, $A_{ej} = -1$, and $A_{el} = 0$ if $l \notin \{i, j\}$. Notice that the system $Ax = 0$ encodes the constraints $x_i = x_j$ for all $(i, j) \in \mathcal{E}$, as desired. It is also known that the randomized Kaczmarz method RK ; gower2015randomized ; loizou2017linearly applied to Problem (1) is equivalent to the randomized pairwise gossip algorithm (see LoizouRichtarik ; loizou2018accelerated ; loizou2018provably for more details). A small sketch of this construction is given below.
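The following short Python sketch (our own illustration; the helper name and edge-list representation are assumptions, not code from the paper) builds this incidence matrix and illustrates that $Ax = 0$ holds exactly for consensus vectors.

import numpy as np

def incidence_matrix(n, edges):
    """Incidence matrix A of an undirected graph with n nodes.

    Row e corresponding to edge (i, j) has A[e, i] = 1, A[e, j] = -1,
    and zeros elsewhere, so Ax = 0 iff x_i = x_j for every edge (i, j).
    """
    A = np.zeros((len(edges), n))
    for e, (i, j) in enumerate(edges):
        A[e, i] = 1.0
        A[e, j] = -1.0
    return A

# Tiny check on a path graph 0-1-2: constant vectors satisfy Ax = 0.
A = incidence_matrix(3, [(0, 1), (1, 2)])
assert np.allclose(A @ np.ones(3), 0.0)                       # consensus vector
assert not np.allclose(A @ np.array([1.0, 2.0, 3.0]), 0.0)    # non-consensus vector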

3 Private Gossip via Controlled Noise Insertion

In this section, we present the Gossip algorithm with Controlled Noise Insertion. As mentioned in the introduction, the approach is similar to the technique proposed in manitara2013privacy ; mo2017privacy . Those works, however, address only algorithms in the synchronous setting, while our work is the first to use this idea in the asynchronous setting. Unlike the above, we provide finite time convergence guarantees and allow each node to add the noise differently, which yields a stronger result.

In our approach, each node adds noise to the computation independently of all other nodes. However, for each node, the noise added is correlated across iterations. We assume that every node $i$ owns two parameters: the initial magnitude of the generated noise, $\sigma_i \geq 0$, and the rate of decay of the noise, $\gamma_i \in [0, 1)$. Node $i$ inserts noise into the system every time an edge incident to it is chosen, and the variable $t_i$ counts how many times node $i$ has added noise to the system in the past. Thus, if we denote by $t$ the current number of iterations, we have $\sum_{i} t_i = 2t$, since every iteration involves exactly two nodes.

In order to ensure convergence to the optimal solution, we need to choose a specific structure of the noise so that the mean of the values converges to the initial mean. In particular, in each iteration in which a node $i$ is selected, we subtract the noise that was added last time and add fresh noise of smaller magnitude: the inserted perturbation is $w_i^t = \gamma_i^{t_i} v_i^{t_i} - \gamma_i^{t_i - 1} v_i^{t_i - 1}$, where $v_i^{t_i} \sim N(0, \sigma_i^2)$ and, for every value of the counter $t_i$, the variable $v_i^{t_i}$ is independent of all other randomness in the algorithm (with the convention $v_i^{-1} = 0$). This ensures that all noise added initially is gradually withdrawn from the whole network.
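To see why the inserted noise is withdrawn over time, note that, writing $w_i^{(s)}$ for the perturbation inserted at node $i$'s $s$-th activation (under our reconstruction of the mechanism above), the perturbations contributed by node $i$ telescope:

% Our sketch: the total noise inserted by node i over its first k activations
% telescopes (using the convention v_i^{-1} = 0):
\sum_{s=0}^{k-1} w_i^{(s)}
  = \sum_{s=0}^{k-1} \left( \gamma_i^{s} v_i^{s} - \gamma_i^{s-1} v_i^{s-1} \right)
  = \gamma_i^{k-1} v_i^{k-1}
  \;\longrightarrow\; 0 \quad \text{as } k \to \infty, \text{ since } 0 \le \gamma_i < 1 .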

After the addition of noise, a standard gossip update is made, which sets the values of the two sampled nodes to the average of their noisy values. Hence, we have $x_i^{t+1} = x_j^{t+1} = \frac{\left(x_i^t + w_i^t\right) + \left(x_j^t + w_j^t\right)}{2}$, as desired.

It is not the purpose of this paper to formally define any quantifiable notion of protection of the initial values. However, we note that it is likely the case that the protection of the private value $c_i$ will be stronger for bigger $\sigma_i$ and for $\gamma_i$ closer to $1$.

Input: vector of private values $c \in \mathbb{R}^n$; initial variances $\sigma_i^2 \geq 0$ and variance decrease rates $\gamma_i$ such that $0 \leq \gamma_i < 1$ for all nodes $i$.
Initialize: Set $x^0 = c$; $t_i = 0$ and $v_i^{-1} = 0$ for all nodes $i$.
for $t = 0, 1, 2, \dots$ do
  1. Choose edge $e = (i, j) \in \mathcal{E}$ uniformly at random
  2. Generate $v_i^{t_i} \sim N(0, \sigma_i^2)$ and $v_j^{t_j} \sim N(0, \sigma_j^2)$
  3. Set $w_i^t = \gamma_i^{t_i} v_i^{t_i} - \gamma_i^{t_i - 1} v_i^{t_i - 1}$ and $w_j^t = \gamma_j^{t_j} v_j^{t_j} - \gamma_j^{t_j - 1} v_j^{t_j - 1}$
  4. Update the primal variable: $x_i^{t+1} = x_j^{t+1} = \frac{\left(x_i^t + w_i^t\right) + \left(x_j^t + w_j^t\right)}{2}$, and $x_l^{t+1} = x_l^t$ for $l \notin \{i, j\}$
  5. Set $t_i = t_i + 1$ and $t_j = t_j + 1$
end for
return $x^t$
Algorithm 1 Privacy Preserving Gossip Algorithm via Controlled Noise Insertion
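The following Python sketch (our own illustration, not the authors' code; the function and variable names are our assumptions) mirrors the steps of Algorithm 1 and can be compared directly against the baseline sketch from the introduction.

import random

def private_gossip_noise_insertion(c, edges, num_iters, sigma, gamma, seed=0):
    """Sketch of Algorithm 1: pairwise gossip with controlled noise insertion.

    c      : list of private initial values, one per node
    edges  : list of undirected edges (i, j) with 0-based node indices
    sigma  : per-node initial noise magnitudes (standard deviations) sigma_i >= 0
    gamma  : per-node decay rates, 0 <= gamma_i < 1
    """
    rng = random.Random(seed)
    n = len(c)
    x = list(c)
    t_i = [0] * n          # how many times each node has inserted noise
    last_v = [0.0] * n     # the noise v drawn at the node's previous activation

    for _ in range(num_iters):
        i, j = rng.choice(edges)                      # step 1: sample an edge
        w = {}
        for node in (i, j):
            v = rng.gauss(0.0, sigma[node])           # step 2: fresh noise v ~ N(0, sigma^2)
            k = t_i[node]
            # step 3: add decayed fresh noise, subtract the noise added last time
            prev = gamma[node] ** (k - 1) * last_v[node] if k > 0 else 0.0
            w[node] = gamma[node] ** k * v - prev
            last_v[node] = v
            t_i[node] = k + 1                         # step 5: update the counter
        # step 4: average the noisy values of the two sampled nodes
        avg = (x[i] + w[i] + x[j] + w[j]) / 2.0
        x[i], x[j] = avg, avg
    return x

Setting all entries of sigma to zero recovers the baseline pairwise gossip update.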

We now present the results of the dual analysis of Algorithm 1.

Theorem 2.

Let $\alpha(\mathcal{G})$ denote the algebraic connectivity of $\mathcal{G}$, let $d_i$ denote the degree of node $i$, and define $\rho := 1 - \frac{\alpha(\mathcal{G})}{2m}$. Then for all $t \geq 0$ we have a bound in which the error is controlled by $\rho^t$ plus a weighted sum of $t$-th powers of per-node noise decay factors, each smaller than one.

Note that the second term in the bound is a weighted sum of $t$-th powers of real numbers smaller than one. For large enough $t$, this quantity is dominated by the largest of these numbers. This brings us to define the set of indices $i$ for which this per-node quantity is maximized. For any node $i$ outside of this set, increasing $\gamma_i$ will not substantially influence the convergence rate.

Note that as soon as the per-node decay factor of every node is at most $\rho$, i.e.,

(3)

holds for all $i$, the rate from Theorem 2 will be driven by $\rho^t$ (as $t \to \infty$), and the method converges at the rate of standard gossip. One can think of the above as a threshold: if there is a node $i$ for which $\gamma_i$ is large enough so that inequality (3) does not hold, the convergence rate is driven by the noise decay of that node. Otherwise, the rate is not influenced by the insertion of noise. Thus, in theory, we do not pay anything in terms of performance as long as we do not hit the threshold. One might be interested in choosing $\gamma_i$ so that the threshold is attained for all $i$. This motivates the following result:

Corollary 3.

Let us choose, for each node $i$, the largest decay rate $\gamma_i$ for which inequality (3) still holds. Then the guaranteed rate of Theorem 2 is driven by $\rho^t$.

As a consequence, this choice gives the largest noise decrease rate for node $i$ such that the guaranteed convergence rate of the algorithm is not violated.

4 Experiments

In this section we present a preliminary experiment (for more experiments see Section LABEL:moreExp in the Appendix) to evaluate the performance of Algorithm 1 for solving the Average Consensus problem. The algorithm has two parameters for each node $i$: the initial variance $\sigma_i^2$ and the rate of decay of the noise, $\gamma_i$.

In this experiment we use two popular graph topologies: the cycle graph (ring network) with $n$ nodes, and the random geometric graph with $n$ nodes and radius $r$.

In particular, we run Algorithm 1 with the same initial noise variance $\sigma_i^2$ for all $i$, and set $\gamma_i = \gamma$ for all $i$ and some $\gamma \in [0, 1)$. We study the effect of varying the value of $\gamma$ on the convergence of the algorithm; a sketch of this setup, reusing the code sketches above, is given below.
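The following driver (our own illustration; the graph size, gamma values, and iteration count are placeholder choices, not the ones behind the paper's plots) shows how such a comparison against the baseline can be set up, reusing the two sketches defined earlier.

import math, random

n = 20
edges = [(i, (i + 1) % n) for i in range(n)]          # cycle graph (ring network)
c = [random.gauss(0.0, 1.0) for _ in range(n)]
target = sum(c) / n                                   # the exact average

def relative_error(x):
    # distance to the consensus vector, normalized by the initial distance
    num = sum((xi - target) ** 2 for xi in x)
    den = sum((ci - target) ** 2 for ci in c)
    return math.sqrt(num / den)

baseline = standard_pairwise_gossip(c, edges, num_iters=20000)
print("baseline", relative_error(baseline))
for gamma in (0.5, 0.9, 0.99, 0.999):
    x = private_gossip_noise_insertion(c, edges, num_iters=20000,
                                       sigma=[1.0] * n, gamma=[gamma] * n)
    print("gamma", gamma, relative_error(x))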

In Figure 1 we see that for small values of $\gamma$ we eventually recover the same rate of linear convergence as the standard pairwise gossip algorithm (Baseline) of boyd2006randomized . If the value of $\gamma$ is sufficiently close to $1$, however, the rate is driven by the noise and not by the convergence of the standard gossip algorithm. The threshold value of $\gamma$ differs between the cycle graph and the random geometric graph in the plots we present.

Figure 1: Convergence of Algorithm 1 on the cycle graph (left) and the random geometric graph (right) for different values of $\gamma$. The vertical axis shows the "Relative Error" of the iterates.

References