I. Introduction
This paper is concerned with the development of a distributed algorithm for enabling a group of autonomous agents to solve certain types of nonlinear equations over a time-varying network. The equations to which we are referring are described by the system
(1)  $f_i(x) = x, \quad i \in \mathbf{m} \triangleq \{1, 2, \ldots, m\}$
where $x \in \mathbb{R}^n$ and $f_i : \mathbb{R}^n \to \mathbb{R}^n$, $i \in \mathbf{m}$. It is assumed that at least one solution to (1) exists, i.e., the $f_i$ have at least one common fixed point, and that for each $i \in \mathbf{m}$, agent $i$ knows $f_i$. Each agent $i$ has a time-dependent state vector $x_i(t)$ taking values in $\mathbb{R}^n$, which is its estimate of a common fixed point. It is assumed that each agent can receive information from its neighbors. Specifically, agent $i$ receives the vector $x_j(t)$ at time $t$ if agent $j$ is a neighbor of agent $i$ at time $t$. We write $\mathcal{N}_i(t)$ for the set of the labels of agent $i$'s neighbors at time $t$, and we always take agent $i$ to be a neighbor of itself. Neighbor relations at time $t$ can be conveniently characterized by a directed neighbor graph $\mathbb{N}(t)$ with vertices $1, 2, \ldots, m$ and a set of arcs defined so that there is an arc in $\mathbb{N}(t)$ from vertex $j$ to vertex $i$ just in case agent $j$ is a neighbor of agent $i$ at time $t$. As each agent is a neighbor of itself, the neighbor graph has self-arcs at each vertex. In general terms, the problem of interest is to develop algorithms, one for each agent, which will enable all $m$ agents to iteratively compute a common fixed point of all of the $f_i$.

This paper focuses on nonlinear maps which are "paracontractions." A continuous nonlinear map $f : \mathbb{R}^n \to \mathbb{R}^n$ is a paracontraction with respect to a given norm $\|\cdot\|$ on $\mathbb{R}^n$ if $\|f(x) - y\| < \|x - y\|$ for all $x \in \mathbb{R}^n$ satisfying $f(x) \neq x$ and all $y \in \mathbb{R}^n$ satisfying $f(y) = y$ [1]. In most applications, paracontractions have multiple fixed points. The concept of a paracontraction has been used in a system-theoretic framework to study variants of the classical consensus problem [2, 3, 4].

Motivation for this problem stems, in part, from [5], which deals with the problem of devising a distributed algorithm for finding a solution to the linear equation $Ax = b$, assuming the equation has at least one solution and agent $i$ knows a pair of matrices $(A_i, b_i)$, where $A_i \in \mathbb{R}^{n_i \times n}$ and $b_i \in \mathbb{R}^{n_i}$. Assuming each $A_i$ has linearly independent rows, one local update rule for solving this problem in discrete time is of the form
(2)  $x_i(t+1) = M_i\Big(\frac{1}{m_i(t)} \sum_{j \in \mathcal{N}_i(t)} x_j(t)\Big)$

where $M_i$ is the affine linear map $M_i : \mathbb{R}^n \to \mathbb{R}^n$,

(3)  $M_i(x) = x - A_i'(A_iA_i')^{-1}(A_ix - b_i)$

and $m_i(t)$ is the number of labels in $\mathcal{N}_i(t)$ [6]. The map $M_i$ is an example of a paracontraction with respect to the 2-norm on $\mathbb{R}^n$. To understand why this is so, note that $M_i(x) = x$ if and only if $A_ix = b_i$, and for any such $x$, $M_i(z) - x = P_i(z - x)$ for all $z \in \mathbb{R}^n$, where $P_i$ is the orthogonal projection matrix $P_i = I - A_i'(A_iA_i')^{-1}A_i$. For any $w \in \mathbb{R}^n$, the inequality $\|P_iw\|_2 < \|w\|_2$ is equivalent to $P_iw \neq w$, and so $\|P_i(z - x)\|_2 < \|z - x\|_2$ whenever $M_i(z) \neq z$ and $M_i(x) = x$. But for such $z$ and $x$, $\|M_i(z) - x\|_2 = \|P_i(z - x)\|_2 < \|z - x\|_2$. Since $M_i$ is continuous, $M_i$ is a paracontraction as claimed.
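The projection argument above can be checked numerically. The following is a minimal sketch (the data and the names `A_i`, `b_i`, `M`, and `x_star` are ours, chosen for illustration, not from the paper): it builds one agent's affine map from (3) and verifies both the fixed-point characterization and the paracontraction inequality in the 2-norm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical agent data: a pair (A_i, b_i) with independent rows,
# consistent with a known solution x_star of A_i x = b_i.
x_star = rng.standard_normal(4)
A_i = rng.standard_normal((2, 4))   # 2 x 4, full row rank (generically)
b_i = A_i @ x_star

def M(z):
    # Affine map M_i(z) = z - A_i'(A_i A_i')^{-1}(A_i z - b_i):
    # the orthogonal projection of z onto {x : A_i x = b_i}.
    return z - A_i.T @ np.linalg.solve(A_i @ A_i.T, A_i @ z - b_i)

# Fixed points of M are exactly the solutions of A_i x = b_i.
assert np.allclose(M(x_star), x_star)

# Paracontraction check: ||M(z) - y||_2 < ||z - y||_2 whenever
# M(z) != z and y is a fixed point of M.
z = rng.standard_normal(4)
if not np.allclose(M(z), z):
    assert np.linalg.norm(M(z) - x_star) < np.linalg.norm(z - x_star)
```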
There are many other examples of paracontractions discussed in the literature [1, 7]. Each of the following examples is a paracontraction with respect to the 2-norm on $\mathbb{R}^n$.

The orthogonal projector $P_{\mathcal{K}} : \mathbb{R}^n \to \mathbb{R}^n$ associated with a nonempty closed convex set $\mathcal{K} \subset \mathbb{R}^n$. This has been used for a number of applications, including the constrained consensus problem in [8]. The fixed points of this map are the vectors in $\mathcal{K}$.

The gradient descent map $g(x) = x - \alpha \nabla\varphi(x)$, where $\varphi : \mathbb{R}^n \to \mathbb{R}$ is convex and differentiable, $\nabla\varphi$ is Lipschitz continuous with parameter $L$, and $\alpha$ is a constant satisfying $0 < \alpha < 2/L$. The fixed points of this map are the vectors in $\mathbb{R}^n$ which minimize $\varphi$ [9].
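The gradient-descent example can be made concrete with a short sketch (a toy instance of our own: a convex quadratic $\varphi(x) = \frac{1}{2}x'Qx$, whose gradient is Lipschitz with parameter equal to the largest eigenvalue of $Q$):

```python
import numpy as np

# phi(x) = 0.5 x'Qx with Q positive semidefinite; grad phi(x) = Q x is
# Lipschitz with parameter L = largest eigenvalue of Q.
Q = np.array([[2.0, 0.0],
              [0.0, 0.5]])
L = max(np.linalg.eigvalsh(Q))
alpha = 1.0 / L                      # any 0 < alpha < 2/L works

def g(x):
    # Gradient descent map g(x) = x - alpha * grad phi(x).
    return x - alpha * (Q @ x)

# The fixed points of g are the minimizers of phi (here, only the origin).
x = np.array([3.0, -4.0])
for _ in range(200):
    x = g(x)
assert np.linalg.norm(x) < 1e-6      # iterates converge to the minimizer
```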
II. Paracontractions
In this section, we review several basic properties of paracontractions. Perhaps the most important is the following well-known theorem published in [1].
Theorem 1
Suppose $\mathcal{P}$ is a finite set of paracontractions with respect to some given norm on $\mathbb{R}^n$. Suppose that all of the paracontractions share at least one common fixed point. Suppose that $p_1, p_2, p_3, \ldots$ is a sequence of paracontractions from $\mathcal{P}$. Then the state $x(t)$ of the iteration

$x(t+1) = p_t(x(t)), \quad t \geq 1$

converges to a point which is a common fixed point of those paracontractions which occur in the sequence infinitely often.
A number of classical results may be easily established by straightforward application of this theorem, such as the convergence proof for the method of alternating (or cyclic) projections [1].
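The alternating-projections special case is easy to exercise numerically. A minimal sketch (the two convex sets below are our own toy choices): each orthogonal projector is a paracontraction in the 2-norm, so by Theorem 1 the alternating iteration converges to a common fixed point, i.e., a point in the intersection of the sets.

```python
import numpy as np

def proj_line(x):
    # Orthogonal projector onto the line {(t, t) : t real}.
    t = (x[0] + x[1]) / 2.0
    return np.array([t, t])

def proj_halfplane(x):
    # Orthogonal projector onto the half-plane {x : x2 <= 1}.
    return np.array([x[0], min(x[1], 1.0)])

# Alternate the two paracontractions; Theorem 1 guarantees convergence
# to a point fixed by both, since the intersection is nonempty.
x = np.array([5.0, -3.0])
for _ in range(100):
    x = proj_halfplane(proj_line(x))

assert abs(x[0] - x[1]) < 1e-9   # limit lies on the line ...
assert x[1] <= 1.0 + 1e-9        # ... and in the half-plane
```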
Below, certain useful propositions associated with paracontractions are described. The proofs of these propositions may be found in the appendix. Similar propositions can also be found in Chapter 4 of [12]. In the following, the set of fixed points of a map $f$ is denoted by $\mathcal{F}(f)$. Additionally, the composition of two maps $f$ and $g$ is denoted by $f \circ g$.
Proposition 1
Suppose $f : \mathbb{R}^n \to \mathbb{R}^n$ and $g : \mathbb{R}^n \to \mathbb{R}^n$ are each paracontractions with respect to the same norm $\|\cdot\|$. Suppose $f$ and $g$ share at least one common fixed point, or in other words, $\mathcal{F}(f) \cap \mathcal{F}(g) \neq \emptyset$. Then the composition $f \circ g$ is a paracontraction with respect to $\|\cdot\|$. Moreover, $\mathcal{F}(f \circ g) = \mathcal{F}(f) \cap \mathcal{F}(g)$.
It turns out that the set of fixed points of a paracontraction must be both closed and convex:
Proposition 2
Suppose $f : \mathbb{R}^n \to \mathbb{R}^n$ is a paracontraction. Then $\mathcal{F}(f)$ is closed and convex.
Recall that for a paracontraction $f$, it must be the case that $\|f(x) - y\| < \|x - y\|$ for all $x \notin \mathcal{F}(f)$ and all $y \in \mathcal{F}(f)$. This property is referred to by a number of different names throughout the literature, such as 'strictly quasi-nonexpansive.' Our previous definition of a paracontraction also requires that the map be continuous. So, a paracontraction is a continuous, strictly quasi-nonexpansive map. One obvious consequence of the property above is that $\|f(x) - y\| \leq \|x - y\|$ for all $x \in \mathbb{R}^n$ and all $y \in \mathcal{F}(f)$. Maps which satisfy this condition are called quasi-nonexpansive. So, any map which is a paracontraction must also be quasi-nonexpansive. This fact will prove useful in the analysis to follow.
Proposition 3
Suppose $M$ is a linear map on $\mathbb{R}^n$ and $\|\cdot\|$ is some norm on $\mathbb{R}^n$. Then $M$ is quasi-nonexpansive with respect to $\|\cdot\|$ if and only if it is nonexpansive with respect to $\|\cdot\|$, i.e., $\|Mx\| \leq \|x\|$ for all $x \in \mathbb{R}^n$. Moreover, $M$ is a paracontraction with respect to $\|\cdot\|$ if and only if $\|Mx\| < \|x\|$ for any $x$ satisfying $Mx \neq x$.
III. The Problem and Main Result
The specific problem to which this paper is addressed is this. Let $f_1, f_2, \ldots, f_m$ be a set of paracontractions $\mathbb{R}^n \to \mathbb{R}^n$ with respect to the same norm $\|\cdot\|$. Suppose that all of the paracontractions share at least one common fixed point. Find conditions so that the states $x_i(t)$ of the iterations

(4)  $x_i(t+1) = f_i\Big(\frac{1}{m_i(t)} \sum_{j \in \mathcal{N}_i(t)} x_j(t)\Big), \quad i \in \mathbf{m}$

all converge to the same point as $t \to \infty$, and that point is a common fixed point of the $f_i$, where $\mathcal{N}_i(t)$ and $m_i(t)$ are as defined earlier.
To state the main result of this paper, it is necessary to define certain concepts for sequences of directed graphs. To begin, we write $\mathcal{G}$ for the set of all directed graphs with $m$ vertices. By the composition of two directed graphs $\mathbb{G}_1$ and $\mathbb{G}_2$ with the same vertex set, written $\mathbb{G}_2 \circ \mathbb{G}_1$, is meant that directed graph with the same vertex set and arc set defined so that $(i, j)$ is an arc in the composition whenever there is a vertex $k$ such that $(i, k)$ is an arc in $\mathbb{G}_1$ and $(k, j)$ is an arc in $\mathbb{G}_2$. The definition of graph composition extends unambiguously to any finite sequence of directed graphs with the same vertex set. We say that an infinite sequence of graphs $\mathbb{G}(1), \mathbb{G}(2), \ldots$ in $\mathcal{G}$ is repeatedly jointly strongly connected if, for some finite positive integers $l$ and $\tau$ and each integer $k \geq 0$, the composed graph $\mathbb{G}(\tau + (k+1)l - 1) \circ \cdots \circ \mathbb{G}(\tau + kl + 1) \circ \mathbb{G}(\tau + kl)$ is strongly connected. The main result of this paper is as follows:
Theorem 2
Let $f_1, f_2, \ldots, f_m$ be a set of paracontractions with respect to the $p$-norm $\|\cdot\|_p$ on $\mathbb{R}^n$ (for some $p$ satisfying $1 < p < \infty$). Suppose the maps share at least one common fixed point. Suppose also that the sequence of neighbor graphs $\mathbb{N}(1), \mathbb{N}(2), \ldots$ is repeatedly jointly strongly connected. Then the states $x_i(t)$ of the iterations defined by (4) all converge to the same point as $t \to \infty$, and this point is a common fixed point of the $f_i$.
A result similar to Theorem 2 was previously described in [13], but required each $\mathbb{N}(t)$, $t \geq 1$, to be strongly connected. This paper extends that result to sequences of neighbor graphs which are repeatedly jointly strongly connected, and presents a special case for which the convergence analysis is simple and instructive.
From the analysis which follows, it will be obvious that this result also applies to iterations of the more general form
(5)  $x_i(t+1) = f_i\Big(\sum_{j=1}^{m} w_{ij}(t)\, x_j(t)\Big), \quad i \in \mathbf{m}$

where the $w_{ij}(t)$ are nonnegative real-valued weights from a finite set, $\sum_{j=1}^{m} w_{ij}(t) = 1$ for each $i \in \mathbf{m}$ and $t \geq 1$, and $w_{ij}(t) > 0$ if $j \in \mathcal{N}_i(t)$ and $w_{ij}(t) = 0$ if $j \notin \mathcal{N}_i(t)$. As will be seen in the sequel, the analysis which follows depends critically on there being only finitely many such weights.
It is interesting to note that the standard graphical condition for convergence of a consensus process [14], namely 'repeatedly jointly rooted,' is not sufficient to ensure convergence for the problem considered in this paper. Consider a simple counterexample in which $m = 2$, $f_1$ and $f_2$ are orthogonal projectors onto two closed convex sets $\mathcal{K}_1$ and $\mathcal{K}_2$ which share a common point, and the neighbor graphs $\mathbb{N}(t)$ are all equal and have self-arcs for agents 1 and 2 as well as an arc from agent 1 to agent 2. Each neighbor graph in this sequence is rooted (as defined in [14]), and so the sequence is repeatedly jointly rooted. Suppose the weights are constant: $w_{11}(t) = 1$, $w_{12}(t) = 0$, and $w_{21}(t) = w_{22}(t) = 1/2$ for $t \geq 1$. Suppose further that $x_1(1) \in \mathcal{K}_1$ but $x_1(1) \notin \mathcal{K}_2$. In this case, it is clear from (4) that $x_1(t+1) = f_1(x_1(t))$, $t \geq 1$, but since $x_1(1) \in \mathcal{K}_1$, it follows that $x_1(t) = x_1(1)$, $t \geq 1$. Therefore, $x_1(t)$ is constant and cannot converge to a vector in $\mathcal{K}_1 \cap \mathcal{K}_2$. In fact, if the $f_i$ are the affine linear maps discussed in (2), then the repeatedly jointly strongly connected condition of Theorem 2 is actually necessary for convergence, provided the convergence is required to be exponential [5].
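The counterexample can be simulated directly. In the sketch below (the intervals are our own illustrative choice of two convex sets sharing the common point 0), agent 1 never receives information from agent 2, so its state never moves off its own fixed-point set, and the two states never agree.

```python
# Two closed intervals on the real line sharing only the point 0.
C1 = (-2.0, 0.0)   # f1 = orthogonal projection onto C1
C2 = (0.0, 3.0)    # f2 = orthogonal projection onto C2

def proj(x, interval):
    lo, hi = interval
    return min(max(x, lo), hi)

x1, x2 = -1.0, 2.0                  # x1 starts in C1 but not in C2
for _ in range(100):
    x1 = proj(x1, C1)               # agent 1 averages only over itself
    x2 = proj((x1 + x2) / 2.0, C2)  # agent 2 averages over both agents

assert x1 == -1.0      # agent 1's state is constant for all time
assert abs(x2) < 1e-9  # agent 2 converges to 0, so x1 != x2: no consensus
```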
The remainder of this paper is devoted to a proof of Theorem 2.
IV. Analysis
This section is organized as follows. First, in Section IV-A, the iterations (4) are written as a single iteration using stacked vectors in $\mathbb{R}^{mn}$. In Section IV-B, Theorem 2 is proved in a special case for which the analysis is straightforward. Finally, in Section IV-C, Theorem 2 is proved in the general case.
In Theorem 2, the sequence of neighbor graphs is assumed to be repeatedly jointly strongly connected. This condition is used in the proofs below to show that products of stochastic matrices meet certain conditions. For an $m \times m$ matrix $A$ with nonnegative entries, we associate the $m$-vertex directed graph $\gamma(A)$ defined so that $(i, j)$ is an arc from $i$ to $j$ in the graph just in case the $ji$-th entry of $A$ is nonzero. Graph composition and matrix multiplication are closely related. Indeed, composition is defined so that for any pair of nonnegative $m \times m$ matrices $A_1$, $A_2$ with graphs $\gamma(A_1)$, $\gamma(A_2)$, the graph of the product of the two matrices is equal to the composition of their graphs. In other words, $\gamma(A_2A_1) = \gamma(A_2) \circ \gamma(A_1)$.
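This correspondence between matrix products and graph composition is easy to check mechanically. A short sketch (the helper names `graph` and `compose` are ours): since the matrices are nonnegative, no cancellation can occur, so a product entry is nonzero exactly when some intermediate vertex links the two arcs.

```python
import numpy as np

def graph(A):
    # Arc (i, j) just in case the ji-th entry of A is nonzero.
    m = A.shape[0]
    return {(i, j) for i in range(m) for j in range(m) if A[j, i] != 0}

def compose(G2, G1):
    # (i, j) is an arc whenever (i, k) is in G1 and (k, j) is in G2.
    return {(i, j) for (i, k) in G1 for (kk, j) in G2 if kk == k}

A1 = np.array([[1, 0, 0],
               [1, 1, 0],
               [0, 0, 1]])
A2 = np.array([[1, 0, 1],
               [0, 1, 0],
               [0, 1, 1]])

# gamma(A2 A1) = gamma(A2) o gamma(A1) for nonnegative matrices.
assert graph(A2 @ A1) == compose(graph(A2), graph(A1))
```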
IV-A Combined Iteration
To proceed, let us note that the family of iterations given by (4) may be written as a single iteration of the form
(6)  $x(t+1) = F\big((S(t) \otimes I_n)\, x(t)\big), \quad t \geq 1$
where for any set of vectors $x_1, x_2, \ldots, x_m \in \mathbb{R}^n$, $x$ is the stacked vector

(7)  $x = \big[\, x_1' \;\; x_2' \;\; \cdots \;\; x_m' \,\big]'$

$F : \mathbb{R}^{mn} \to \mathbb{R}^{mn}$ is the map

(8)  $F(x) = \big[\, f_1(x_1)' \;\; f_2(x_2)' \;\; \cdots \;\; f_m(x_m)' \,\big]'$

$S(t)$ is an $m \times m$ stochastic matrix whose $ij$-th entry is $1/m_i(t)$ if $j \in \mathcal{N}_i(t)$ and $0$ if $j \notin \mathcal{N}_i(t)$, $I_n$ is the $n \times n$ identity matrix, and $S(t) \otimes I_n$ is the Kronecker product of $S(t)$ with $I_n$.
It is clear from the definition of $F$ that the set of fixed points of $F$ is $\mathcal{F}(F) = \mathcal{F}(f_1) \times \mathcal{F}(f_2) \times \cdots \times \mathcal{F}(f_m)$. In the sequel, we use $\mathcal{C}$ to denote the consensus set, $\mathcal{C} = \{x \in \mathbb{R}^{mn} : x_1 = x_2 = \cdots = x_m\}$. Note the intersection of these sets, $\mathcal{F}(F) \cap \mathcal{C}$. In words, if $x$ is a vector in $\mathcal{F}(F) \cap \mathcal{C}$, then all of its subvectors $x_i$ are equal and each subvector is a common fixed point of the maps $f_i$. The set $\mathcal{F}(F) \cap \mathcal{C}$ is nonempty if the maps $f_i$ share at least one common fixed point. To prove convergence of the states $x_i(t)$ to the same common fixed point of the paracontractions, it suffices to show that $x(t)$ converges to a vector in $\mathcal{F}(F) \cap \mathcal{C}$.
The analysis in the sequel will involve the use of Theorem 1 with maps which are shown to be paracontractions with respect to a 'mixed vector norm' $\|\cdot\|_{p,q}$, which we define for stacked vectors in $\mathbb{R}^{mn}$ of the form (7). For a norm $\|\cdot\|_p$ on $\mathbb{R}^n$ and a norm $\|\cdot\|_q$ on $\mathbb{R}^m$, the mixed vector norm on $\mathbb{R}^{mn}$ is defined as follows:

$\|x\|_{p,q} = \Big\| \big[\, \|x_1\|_p \;\; \|x_2\|_p \;\; \cdots \;\; \|x_m\|_p \,\big]' \Big\|_q$

This is a 'norm of norms': first take the $p$-norm of each subvector $x_i$, $i \in \mathbf{m}$, and then take the $q$-norm of the vector consisting of those norms.
IV-B Special Case
With certain additional but somewhat restrictive assumptions, the proof of Theorem 2 turns out to be a simple application of Theorem 1. Toward this end, suppose each $f_i$ is a paracontraction with respect to $\|\cdot\|_2$ and each matrix $S(t)$ is doubly stochastic. What makes this special case much simpler than the analysis for the general case is the fact that, with these assumptions, both $F$ and $S(t) \otimes I_n$ are paracontractions with respect to $\|\cdot\|_{2,2}$, as shown below.
Proposition 4
Suppose each map $f_i$, $i \in \mathbf{m}$, is a paracontraction with respect to $\|\cdot\|_2$. Then the map $F$ as defined by (8) is a paracontraction with respect to $\|\cdot\|_{2,2}$.
Proof: First, note that $F$ is continuous since each $f_i$ is continuous. Next, suppose $y \in \mathcal{F}(F)$ and $x \notin \mathcal{F}(F)$. So $f_i(y_i) = y_i$ for $i \in \mathbf{m}$, and there is some $j$ so that $f_j(x_j) \neq x_j$. Since each $f_i$, $i \in \mathbf{m}$, is a paracontraction with respect to $\|\cdot\|_2$, it follows that $\|f_i(x_i) - y_i\|_2 \leq \|x_i - y_i\|_2$ for each $i \in \mathbf{m}$. Additionally, since $f_j(x_j) \neq x_j$ and $f_j$ is a paracontraction, $\|f_j(x_j) - y_j\|_2 < \|x_j - y_j\|_2$. As a result, $\sum_{i=1}^{m} \|f_i(x_i) - y_i\|_2^2 < \sum_{i=1}^{m} \|x_i - y_i\|_2^2$. Consequently,

(9)  $\|F(x) - y\|_{2,2} < \|x - y\|_{2,2}$

Thus, $F$ is a paracontraction with respect to $\|\cdot\|_{2,2}$.
Proposition 5
Suppose $S$ is an $m \times m$ doubly stochastic matrix with positive diagonal entries. Then $S \otimes I_n$ is a paracontraction with respect to $\|\cdot\|_{2,2}$.
Proposition 5 is a simple consequence of Lemma 1, which applies to all doubly stochastic matrices with positive diagonal entries.
Lemma 1
Suppose $S$ is an $m \times m$ doubly stochastic matrix with positive diagonal entries. Then $S$ is a paracontraction with respect to $\|\cdot\|_2$.
Proof: To begin, suppose the graph of $S$, $\gamma(S)$, contains $k$ disjoint weakly connected components. Let $P$ be a permutation matrix such that $PSP' = \mathrm{diag}(S_1, S_2, \ldots, S_k)$, where each $S_i$ is an $m_i \times m_i$ doubly stochastic matrix with positive diagonal entries and a weakly connected graph, and the $m_i$ are positive integers such that $m_1 + m_2 + \cdots + m_k = m$.

Since $S$ is a doubly stochastic matrix with positive diagonal entries, each $S_i$ is also a doubly stochastic matrix with positive diagonal entries. Additionally, since each $\gamma(S_i)$ is a weakly connected graph with self-arcs at each vertex and each $S_i$ is doubly stochastic, it follows that each $\gamma(S_i)$ is a strongly connected graph with self-arcs at each vertex. Thus, each $S_i$ is primitive, and so $\mathrm{diag}(S_1, \ldots, S_k)$ must have an eigenvalue at $1$ of multiplicity $k$, and all other eigenvalues must have magnitude less than $1$. As $S$ is similar to $\mathrm{diag}(S_1, \ldots, S_k)$, it must also have an eigenvalue at $1$ of multiplicity $k$, and all other eigenvalues must have magnitude less than $1$.

Next, we claim that if $Sx = x$ then $S'x = x$. Suppose $Sx = x$, or in other words, $PSP'(Px) = Px$. It follows that each $S_i$ fixes the corresponding subvector of $Px$. Using the Perron-Frobenius theorem, $Px = [\, c_1\mathbf{1}_{m_1}' \;\; c_2\mathbf{1}_{m_2}' \;\; \cdots \;\; c_k\mathbf{1}_{m_k}' \,]'$, where each $c_i$ is some real value and $\mathbf{1}_{m_i}$ is the vector of all ones in $\mathbb{R}^{m_i}$. As each $S_i'$ is a stochastic matrix, it follows that $PS'P'(Px) = Px$, and therefore $S'x = x$.

Now, suppose $Sx \neq x$. The same block argument applies to $S'S$, which is also an $m \times m$ doubly stochastic matrix with positive diagonal entries and the same weakly connected components, so the eigenspace of $S'S$ associated with the eigenvalue $1$ also has dimension $k$. By the claim, every fixed point of $S$ is a fixed point of $S'S$, and since both eigenspaces have dimension $k$, they coincide. Therefore $S'Sx \neq x$, and so $x$ is not an eigenvector of $S'S$ associated with the eigenvalue $1$. But recall that all other eigenvalues of $S'S$ have magnitude less than $1$. Consequently, $x'S'Sx < x'x$, as $S'S$ is symmetric. Thus $\|Sx\|_2 < \|x\|_2$, and by Proposition 3, $S$ is a paracontraction with respect to $\|\cdot\|_2$.

Proof of Proposition 5: From the definition of the mixed vector norm it is not difficult to see that for any vector $x \in \mathbb{R}^{mn}$, $\|x\|_{2,2} = \|x\|_2$. Additionally, $S \otimes I_n$ is an $mn \times mn$ doubly stochastic matrix with positive diagonal entries. So, this proposition follows directly from Lemma 1.
With these tools in hand, we may now prove our main result in this special case. Here, for simplicity, we also assume that each neighbor graph is strongly connected.
Proof of Theorem 2 for the Special Case: Each neighbor graph in the sequence has self-arcs at each vertex because each agent is assumed to be a neighbor of itself. So each matrix $S(t)$ must have positive diagonal entries. From Proposition 4 and Proposition 5, the map $F$ and each $S(t) \otimes I_n$, $t \geq 1$, is a paracontraction with respect to $\|\cdot\|_{2,2}$. Since there is at least one common fixed point of the maps $f_i$, the maps $F$ and $S(t) \otimes I_n$, $t \geq 1$, must share at least one common fixed point. By Proposition 1, each composed map $F \circ (S(t) \otimes I_n)$, $t \geq 1$, is a paracontraction and $\mathcal{F}(F \circ (S(t) \otimes I_n)) = \mathcal{F}(F) \cap \mathcal{F}(S(t) \otimes I_n)$ for each $t \geq 1$. Note that there are only a finite number of such composed maps, since the entries of each $S(t)$, namely the $1/m_i(t)$, may take only a finite number of possible values. Applying Theorem 1 to the iteration defined by (6) ensures that $x(t)$ will converge to a fixed point in the intersection of the sets of fixed points of those $F \circ (S(t) \otimes I_n)$ which occur infinitely often. Since $\mathcal{F}(F \circ (S(t) \otimes I_n)) = \mathcal{F}(F) \cap \mathcal{F}(S(t) \otimes I_n)$, $x(t)$ must converge to a vector which is in $\mathcal{F}(F)$ and also in $\mathcal{F}(S(t) \otimes I_n)$ for those $S(t)$ which occur infinitely often. However, under the assumption that each neighbor graph is strongly connected, the graph of each $S(t)$ is strongly connected. By the Perron-Frobenius theorem, it follows that $\mathcal{F}(S(t) \otimes I_n) = \mathcal{C}$ for $t \geq 1$. So, regardless of which particular $S(t)$ occur infinitely often, $x(t)$ must converge to a vector in $\mathcal{F}(F) \cap \mathcal{C}$.
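The special case can be simulated end to end. In the sketch below (a toy instance of our own: three interval projectors with a common fixed-point set, and a symmetric doubly stochastic weight matrix whose graph is strongly connected with self-arcs), the combined iteration of the form (6) drives all agents' states to a single common fixed point.

```python
import numpy as np

m = 3
# Closed intervals with nonempty intersection [0, 0.5]; f_i = projection.
intervals = [(-2.0, 0.5), (-0.5, 3.0), (0.0, 1.0)]

# Doubly stochastic matrix with positive diagonal; its graph is the
# complete graph with self-arcs, hence strongly connected.
S = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

x = np.array([-5.0, 7.0, 2.0])       # scalar states (n = 1)
for _ in range(300):
    mixed = S @ x                    # x(t+1) = F((S kron I_n) x(t))
    x = np.array([min(max(v, lo), hi)
                  for v, (lo, hi) in zip(mixed, intervals)])

# All states agree, and the common limit is fixed by every f_i.
assert np.ptp(x) < 1e-6
assert all(lo - 1e-6 <= x[0] <= hi + 1e-6 for lo, hi in intervals)
```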
It is not difficult to relax the assumption that each neighbor graph is strongly connected to the more general condition that the sequence of neighbor graphs is repeatedly jointly strongly connected, as in Theorem 2. The key step is to prove that the intersection of the sets of fixed points of those $S(t) \otimes I_n$ which occur infinitely often is just the consensus set $\mathcal{C}$. In fact, this can be shown under a graphical condition even weaker than repeatedly jointly strongly connected. The definition of repeatedly jointly strongly connected includes a uniformity condition which requires that, for some fixed finite number $l$, each successive composition of $l$ graphs is strongly connected. Instead, the main result can be shown even if this number varies with each successive composition of graphs.
While the above proof is a straightforward application of Theorem 1, it does not generalize to the case in which the matrices $S(t)$ are not doubly stochastic. Instead, it has proven necessary to reason about composed maps of sufficient length and prove that those maps are paracontractions with the requisite set of fixed points, as discussed in the following section.
IV-C General Case
There are two differences which make the proof of Theorem 2 more challenging than the analysis previously presented for the special case. First, while the matrices $S(t)$ are stochastic, they need not be doubly stochastic, so Proposition 5 may not apply. In order to establish convergence with stochastic matrices without requiring these matrices to be doubly stochastic, it has proven useful to instead focus on the mixed vector norm $\|\cdot\|_{p,\infty}$. However, this leads to a second difficulty. Unlike the case with $\|\cdot\|_{2,2}$ as shown in Proposition 4, even if each $f_i$ is a paracontraction with respect to some norm $\|\cdot\|_p$, the map $F$ need not be a paracontraction with respect to $\|\cdot\|_{p,\infty}$.
As an example, consider $m = 2$ and $n = 1$, with $f_1(x) = x/2$ and $f_2(x) = x$, each of which is a paracontraction with respect to $|\cdot|$ (the latter vacuously, since every point is a fixed point); here $\mathcal{F}(f_1) = \{0\}$, so the maps share the common fixed point $0$, and $F(x) = [\, x_1/2 \;\; x_2 \,]'$. Suppose that $y = [\, 0 \;\; 1 \,]'$, so that $F(y) = y$, and that $x = [\, 1 \;\; 2 \,]'$, so that $F(x) \neq x$. Then

(10)  $\|x - y\|_{p,\infty} = \max\{|1 - 0|, |2 - 1|\} = 1$

Additionally,

(11)  $|f_1(x_1) - y_1| = 1/2 < 1 = |x_1 - y_1|$

since $f_1$ is a paracontraction. Because $f_2(x_2) = x_2$, $|f_2(x_2) - y_2| = |x_2 - y_2| = 1$. From this, it follows that

(12)  $\|F(x) - y\|_{p,\infty} = \max\{1/2, 1\} = 1$

Using (10) and (12), $\|F(x) - y\|_{p,\infty} = \|x - y\|_{p,\infty}$ even though $F(x) \neq x$. So, in this case, $F$ is not a paracontraction with respect to $\|\cdot\|_{p,\infty}$. However, the map $F$ is always quasi-nonexpansive in this norm.
Proposition 6
Suppose each $f_i$ is a paracontraction with respect to $\|\cdot\|_p$. Let $F$ be the map as defined in (8). Then $F$ is quasi-nonexpansive with respect to $\|\cdot\|_{p,\infty}$.
Proof: Suppose $y \in \mathcal{F}(F)$ and $x \in \mathbb{R}^{mn}$. So $f_i(y_i) = y_i$ for $i \in \mathbf{m}$. Since each $f_i$, $i \in \mathbf{m}$, is a paracontraction with respect to $\|\cdot\|_p$, it follows that $\|f_i(x_i) - y_i\|_p \leq \|x_i - y_i\|_p$, $i \in \mathbf{m}$. As a result,

(13)  $\|F(x) - y\|_{p,\infty} = \max_{i \in \mathbf{m}} \|f_i(x_i) - y_i\|_p \leq \max_{i \in \mathbf{m}} \|x_i - y_i\|_p = \|x - y\|_{p,\infty}$

and therefore $\|F(x) - y\|_{p,\infty} \leq \|x - y\|_{p,\infty}$. Thus, $F$ is quasi-nonexpansive with respect to $\|\cdot\|_{p,\infty}$.
A similar result follows for stochastic matrices. While the following results are stated for matrices of the form $S \otimes I_n$ with respect to $\|\cdot\|_{p,\infty}$, by taking the dimension of the identity matrix to be $n = 1$, these results also apply to general stochastic matrices with respect to $\|\cdot\|_\infty$.
Proposition 7
Suppose $S$ is an $m \times m$ stochastic matrix. Then $S \otimes I_n$ is quasi-nonexpansive with respect to $\|\cdot\|_{p,\infty}$.
Proof: Suppose $x \in \mathbb{R}^{mn}$. By the triangle inequality, $\|\sum_{j=1}^{m} s_{ij}x_j\|_p \leq \sum_{j=1}^{m} s_{ij}\|x_j\|_p$ for each $i \in \mathbf{m}$. But since $S$ is stochastic, $\sum_{j=1}^{m} s_{ij} = 1$ for each $i \in \mathbf{m}$. Thus,

$\|(S \otimes I_n)x\|_{p,\infty} = \max_{i \in \mathbf{m}} \Big\|\sum_{j=1}^{m} s_{ij}x_j\Big\|_p \leq \max_{i \in \mathbf{m}} \sum_{j=1}^{m} s_{ij}\|x_j\|_p \leq \max_{j \in \mathbf{m}} \|x_j\|_p = \|x\|_{p,\infty}$

Therefore $\|(S \otimes I_n)x\|_{p,\infty} \leq \|x\|_{p,\infty}$ for any $x \in \mathbb{R}^{mn}$, so $S \otimes I_n$ is nonexpansive with respect to $\|\cdot\|_{p,\infty}$. By Proposition 3, $S \otimes I_n$ is quasi-nonexpansive with respect to $\|\cdot\|_{p,\infty}$.
Proposition 8
Suppose $S$ is an $m \times m$ positive stochastic matrix. Then for any real value $p$ satisfying $1 < p < \infty$, $S \otimes I_n$ is a paracontraction with respect to $\|\cdot\|_{p,\infty}$.
Proof: Since $S$ is positive, by Perron's theorem the set of fixed points of the map $x \mapsto (S \otimes I_n)x$ is the consensus set $\mathcal{C}$. Let $x$ be any vector in $\mathbb{R}^{mn}$ which is not a fixed point of $S \otimes I_n$. Then there must exist integers $k$ and $l$ such that $x_k \neq x_l$. Suppose first that $x_l$ is a scalar multiple of $x_k$; i.e., $x_l = c\,x_k$ for some scalar $c$. Without loss of generality assume $\|x_k\|_p \geq \|x_l\|_p$, so $|c| \leq 1$ and $c \neq 1$. Clearly $x_k \neq 0$ and $|s_{ik} + c\,s_{il}| < s_{ik} + s_{il}$ for all $i \in \mathbf{m}$. Then for each $i \in \mathbf{m}$,

$\Big\|\sum_{j=1}^{m} s_{ij}x_j\Big\|_p \leq |s_{ik} + c\,s_{il}|\,\|x_k\|_p + \sum_{j \neq k,l} s_{ij}\|x_j\|_p < \sum_{j=1}^{m} s_{ij} \max_{q \in \mathbf{m}} \|x_q\|_p$

This strict inequality holds because $S$ is positive, which ensures that $s_{ik} > 0$ and $s_{il} > 0$. But $\sum_{j=1}^{m} s_{ij} = 1$ because $S$ is stochastic, so

(14)  $\Big\|\sum_{j=1}^{m} s_{ij}x_j\Big\|_p < \|x\|_{p,\infty}, \quad i \in \mathbf{m}$

Now suppose that $x_l$ is not a scalar multiple of $x_k$. Then for each $i \in \mathbf{m}$, $s_{il}x_l$ is not a scalar multiple of $s_{ik}x_k$. By Minkowski's inequality, $\|s_{ik}x_k + s_{il}x_l\|_p < s_{ik}\|x_k\|_p + s_{il}\|x_l\|_p$, since $s_{ik}$ and $s_{il}$ are both positive. So

(15)  $\|s_{ik}x_k + s_{il}x_l\|_p < (s_{ik} + s_{il}) \max\{\|x_k\|_p, \|x_l\|_p\}$

By the triangle inequality,

$\Big\|\sum_{j=1}^{m} s_{ij}x_j\Big\|_p \leq \|s_{ik}x_k + s_{il}x_l\|_p + \sum_{j \neq k,l} s_{ij}\|x_j\|_p$

Thus, using (15),

$\Big\|\sum_{j=1}^{m} s_{ij}x_j\Big\|_p < \sum_{j=1}^{m} s_{ij}\,\|x\|_{p,\infty} = \|x\|_{p,\infty}$

so (14) holds for this case as well. But

$\|(S \otimes I_n)x\|_{p,\infty} = \max_{i \in \mathbf{m}} \Big\|\sum_{j=1}^{m} s_{ij}x_j\Big\|_p$

so

(16)  $\|(S \otimes I_n)x\|_{p,\infty} < \|x\|_{p,\infty}$

So, from Proposition 3, $S \otimes I_n$ is a paracontraction as claimed.
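The claim is easy to spot-check numerically. A sketch (random positive stochastic matrix of our own, with $p = 2$): for every sampled non-consensus vector, the mixed-norm strict contraction holds.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 2

# Random positive stochastic matrix: positive entries, rows summing to 1.
S = rng.random((m, m)) + 0.1
S = S / S.sum(axis=1, keepdims=True)
K = np.kron(S, np.eye(n))            # the map S kron I_n on R^{mn}

def mixed_2_inf(x):
    # ||x||_{2,inf}: max over subvectors of their 2-norms.
    return max(np.linalg.norm(x[i*n:(i+1)*n]) for i in range(m))

for _ in range(100):
    x = rng.standard_normal(m * n)
    if np.allclose(K @ x, x):        # skip (consensus) fixed points
        continue
    # Strict contraction on non-fixed points, as in the proposition.
    assert mixed_2_inf(K @ x) < mixed_2_inf(x)
```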
The previous condition, that $S$ be a positive stochastic matrix in order for $S \otimes I_n$ to be a paracontraction, is rather strong. In a certain sense, it is a necessary condition as well. (See Proposition 3.6 of [15] for a related statement characterizing the complex-valued matrices which are paracontractions with respect to $\|\cdot\|_\infty$.)
Proposition 9
Suppose $S$ is an $m \times m$ stochastic matrix and assume that the set of fixed points of the map $x \mapsto (S \otimes I_n)x$ is the consensus set $\mathcal{C}$. If $S \otimes I_n$ is a paracontraction with respect to $\|\cdot\|_{p,\infty}$, then $S$ is a positive matrix.
Proof: Suppose $S \otimes I_n$ is a paracontraction with respect to $\|\cdot\|_{p,\infty}$ and $\mathcal{F}(S \otimes I_n) = \mathcal{C}$. Assume, to the contrary, that $S$ is not a positive matrix, which means there must be indices $k$ and $l$ such that $s_{kl} = 0$. Consider the stacked vector $x$ whose subvectors are given by

$x_j = \begin{cases} 0 & j = l \\ v & j \neq l \end{cases}$

where $v$ is any vector in $\mathbb{R}^n$ such that $\|v\|_p = 1$. Note that $x$ is not a fixed point of $S \otimes I_n$ since not all subvectors are equal, and so $(S \otimes I_n)x \neq x$. Now, consider the $k$-th subvector of $(S \otimes I_n)x$,

$\sum_{j=1}^{m} s_{kj}x_j = \sum_{j \neq l} s_{kj}\,v = v$

Therefore,

$\|(S \otimes I_n)x\|_{p,\infty} \geq \Big\|\sum_{j=1}^{m} s_{kj}x_j\Big\|_p = \|v\|_p = 1$

But $\|x\|_{p,\infty} = 1$, so $\|(S \otimes I_n)x\|_{p,\infty} \geq \|x\|_{p,\infty}$. However, since $(S \otimes I_n)x \neq x$, from Proposition 3 this contradicts the assumption that $S \otimes I_n$ is a paracontraction with respect to $\|\cdot\|_{p,\infty}$.
One approach to proving the main result would be to require that each $S(t)$ be a positive matrix. This is far too restrictive, as it would correspond to the requirement that each of the neighbor graphs