Results
Belief System with Logic Constraints
Friedkin et al.^{13, 12} describe a belief system with logic constraints as a group of agents that periodically exchange and update their opinions about a set of truth statements with logical dependencies among them. After each social interaction, the agents use the shared opinions, as well as the underlying logical dependencies, to update their beliefs. The agents exchange their opinions by interacting over a social network captured by a graph G_A = (A, E_A), where A is the set of n agents and E_A is the set of edges. A directed edge towards an agent indicates that it receives the opinion of another agent, i.e., a directed flow of information. Analogously, the logical dependencies among the m truth statements are modeled by a graph G_S = (S, E_S), where an edge between two statements exists if the belief in one statement affects the belief in the other. The generalized dynamics of a belief system are defined as follows. First, every agent aggregates its opinions on every truth statement according to the imposed logic constraints (i.e., it modifies its opinions to take into account the dependencies on the other truth statements). The aggregation steps consist of weighted (convex) combinations of the available values, where the weights represent the relative influence. Second, the agents share their opinions over the social network, where the opinions are aggregated again to take into account those coming from the neighboring agents (i.e., social interactions). Finally, a new opinion is formed as a combination of the most recent aggregation and the initial opinion, which models an aversion to deviating from the initial beliefs, or stubbornness. This model is described by the following equations (1) for an arbitrary agent i and an arbitrary statement s:
(Aggregation by logic constraints)   z_i^s(t) = Σ_{p=1}^{m} c_{sp} x_i^p(t)   (1a)
(Aggregation by social network)   y_i^s(t) = Σ_{j=1}^{n} w_{ij} z_j^s(t)   (1b)
(Influence of initial beliefs)   x_i^s(t+1) = λ_i y_i^s(t) + (1 − λ_i) x_i^s(0)   (1c)
where x_i^s(t) represents the opinion of agent i at time t on a certain statement s, while z_i^s(t) and y_i^s(t) are the intermediate aggregation steps. The opinion of an agent on a specific statement being true or false is modeled by a scalar value between zero and one. A value of zero indicates that the given agent strongly believes the statement is false, whereas a value of one indicates that the agent believes the statement is true. Similarly, a value of 1/2 indicates maximal uncertainty about a statement. The intermediate aggregated opinion z_i^s(t) of agent i on statement s is formed by using the opinions of the same agent about the other statements. The parameters c_{sp} are compliant with the graph G_S that models the logic constraints, in the sense that c_{sp} is nonzero if the statement s depends on the statement p, and c_{sp} = 0 otherwise. These parameters represent the strength of the logic constraints, i.e., the influence that an opinion on one statement has on the opinions on other statements. Subsequently, the intermediate aggregated opinion y_i^s(t) of agent i on statement s is formed by combining all the intermediate opinions of the neighboring agents. In this update, the parameters w_{ij} represent the weights that agent i assigns to the information coming from its neighbor j; for example, w_{12} is how agent 1 weights the opinions shared by agent 2. These parameters are compliant with the network G_A in the sense that if there is an incoming edge to agent i from agent j in the graph, then the corresponding weight w_{ij} is nonzero. The last update in equation (1) indicates that, at time t + 1, the new opinion of agent i on statement s is obtained as a weighted combination of its intermediate aggregated opinion at time t and its initial opinion x_i^s(0) on statement s. The parameter λ_i ∈ [0, 1] that agent i uses models its stubbornness. If λ_i < 1 we say the agent is stubborn, where λ_i = 0 indicates that the agent is maximally closed to the influence of others. If λ_i = 1, agent i is said to be maximally open to the influence of others, and oblivious if additionally it is not influenced by stubborn agents.
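To make the three aggregation steps concrete, the following sketch simulates one round of the update. This is our minimal illustration, not code from the study; the array shapes and parameter names are assumptions chosen to mirror the description above.

```python
import numpy as np

def belief_update(X, X0, C, W, lam):
    """One round of the belief-system update.

    X   : n x m array, X[i, s] is the current opinion of agent i on statement s
    X0  : n x m array of initial opinions
    C   : m x m row-stochastic logic-constraint weights (entry (s, p) couples
          statement s to statement p)
    W   : n x n row-stochastic social-influence weights
    lam : length-n array of openness parameters, 0 = fully stubborn, 1 = fully open
    """
    Z = X @ C.T                      # aggregation by logic constraints
    Y = W @ Z                        # aggregation over the social network
    L = lam[:, None]
    return L * Y + (1.0 - L) * X0    # blend with the initial beliefs
```

Because C and W are row-stochastic and lam lies in [0, 1], every step is a convex combination, so opinions remain in [0, 1]; an agent with lam = 0 never moves from its initial opinions.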
We can group the parameters w_{ij} into an n by n matrix W, known as the social influence structure, and the parameters c_{sp} into an m by m matrix C, known as the multi-issue dependence structure^{12}. We assume these matrices are nonnegative. Furthermore, the weights assigned by an agent to its neighbors sum up to one, i.e., the sum of the entries in each row of the matrix W is 1; likewise, the sum of the entries in each row of the matrix C is 1. Thus, the matrices W and C are row-stochastic. Figure 1 illustrates a belief system of n agents and m truth statements; moreover, it gives examples for the choice of the matrices W and C. Figure 1(c) shows the belief system generated by the network of agents in Fig. 1(a) and the set of logic constraints in Fig. 1(b). This new graph depicted in Fig. 1(c) is much larger than the network of agents or the network of statements taken separately; effectively, it has 2nm nodes. The belief of each agent on each truth statement is a separate node; in addition, the initial beliefs are separate nodes. The model of this larger graph of the belief system can be compactly restated as
v(t+1) = B v(t),   (2)

where v(t) is a state that stacks the current beliefs of all agents on all topics alongside the initial beliefs, i.e.,

v(t) = [x(t)^T, x(0)^T]^T,

and

B = [ Λ(W ⊗ C)   I_{nm} − Λ ]
    [ 0_{nm}      I_{nm}    ]

where 0_{nm} is a zero matrix of size nm × nm, I_{nm} is an identity matrix of size nm × nm, ⊗ indicates the Kronecker product (see Supplementary Definition 1), Λ = diag(λ_1, …, λ_n) ⊗ I_m is a diagonal matrix whose diagonal entries carry the parameters λ_i, and x^T denotes the transpose of a vector or matrix x. This allows for the definition of the belief system graph G_B, which is compliant with the matrix B, where an edge from node u to node v exists if the corresponding entry of B is nonzero. See Supplementary equation (4) for an example of the matrix B for the belief system in Fig. 1(c), assuming the same stubbornness parameter for all agents. Figure 2 shows an example where the network of agents forms a cycle graph, given in Fig. 2(a), the set of logic constraints forms a directed path, given in Fig. 2(b), and all agents are oblivious. The belief system graph is shown in Fig. 2(c). Figure 2(d) shows the dynamics of the belief vector as the number of social interactions increases. The opinion on all topics converges to a single value for all agents. Figure 2(e) shows the dynamics of the belief vector when no logic constraints are considered. In this case, the agents reach an agreement on the final value, but this consensual value is different for each of the statements. See Supplementary Fig. 11 for an additional example of the influence of the logic constraints on the resulting belief system and Supplementary Fig. 12 for a variation of the example discussed in Fig. 2 when the network of agents is a complete graph.

When does a Belief System Converge?
The convergence of the belief system can be stated as a question of the existence of a limit of the beliefs as the social interactions continue with time. That is, whether or not there exists a vector of opinions v* such that lim_{t→∞} v(t) = v*
for any initial value v(0). Friedkin et al.^{13, 12} showed that a belief system with logic constraints will converge to equilibrium if and only if either there are no oblivious agents, or there are oblivious agents and the limit of the powers of the matrix W ⊗ C restricted to them exists. Moreover, if we represent the matrices W and Λ with a block structure as

W = [ W_{11}  W_{12} ],   Λ = [ Λ_{11}  0 ]
    [ 0       W_{22} ]        [ 0       I ]

where W_{22} is the subgraph of oblivious agents, then the belief system is convergent if and only if ρ(Λ_{11}(W_{11} ⊗ C)) < 1 and lim_{t→∞} (W_{22} ⊗ C)^t exists. We next consider how these conditions may be interpreted in terms of the topology of the network of agents and the set of logic constraints. The belief system in equation (2) converges to equilibrium if and only if every closed strongly connected component of the graph G_B is aperiodic^{4, 18}. Recall that a strongly connected component is closed if it has no incoming links from outside the component; otherwise, it is called open, see Fig. 3. In general, the set of strongly connected components can be computed efficiently for large complex networks^{19}. The matrix B has two diagonal blocks, one corresponding to the initial beliefs and one involving the product W ⊗ C. The initial belief nodes are aperiodic closed strongly connected components, each consisting of a single node. Therefore, the diagonal block in B corresponding to the initial beliefs induces an aperiodic graph. Moreover, strongly connected components with stubborn agents do not affect the convergence of the belief system. Thus, one can focus on the closed strongly connected components of the graph induced by W ⊗ C. The product W ⊗ C can be written in its block upper triangular form, where each of the blocks on the diagonal is the product of one strongly connected component from the graph induced by W and one from C (see Supplementary Lemma 2). McAndrew^{20} showed that the period of a product graph is the lowest common multiple of the periods of the two factor graphs (see Supplementary Definition 2 and Supplementary Theorem 1). If the factor graphs are not coprime, the resulting product graph is a disconnected set of components. Nevertheless, each of the resulting components will have the same period as defined above. Therefore, in order for a product graph to be aperiodic, we require the factors to be aperiodic as well.
An immediate conclusion drawn from this fact is that the process (2) converges to equilibrium if and only if every closed strongly connected component of the graph G_S is aperiodic and every closed strongly connected component of the subgraph of G_A composed of oblivious agents only is aperiodic. This is a graph-theoretic interpretation of the algebraic criteria derived by Friedkin et al.^{13, 12}. In Fig. 1, the network of agents has a single closed strongly connected component, which consists of a single node. The network of truth statements also has a single closed strongly connected component, consisting of a single node. Thus, the belief system will converge to a set of final beliefs. In Fig. 2, the belief system has one closed strongly connected component, shown in green, with the topology of a cycle graph. This strongly connected component corresponds to the product of the cycle graph of agents and the green node of the logic constraints. The cycle graph is aperiodic if and only if the number of nodes is odd. Thus, if the cycle network of agents has an even number of nodes, the belief system will not converge.
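This aperiodicity test is easy to automate. The sketch below (our illustration; the helper names are ours) computes the period of a strongly connected graph as the greatest common divisor of the quantities level(u) + 1 − level(v) over all edges (u, v), with levels taken from a breadth-first search tree; this gcd equals the gcd of all cycle lengths.

```python
from math import gcd
from collections import deque

def period(adj):
    """Period of a strongly connected directed graph.

    adj[u] lists the out-neighbors of node u. For a BFS tree rooted at node 0,
    the period equals the gcd over all edges (u, v) of level[u] + 1 - level[v].
    """
    level = [None] * len(adj)
    level[0] = 0
    queue, g = deque([0]), 0
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if level[v] is None:
                level[v] = level[u] + 1
                queue.append(v)
            else:
                g = gcd(g, level[u] + 1 - level[v])
    return g

def undirected_cycle(k):
    # adjacency list of the k-node cycle, traversable in both directions
    return [[(i - 1) % k, (i + 1) % k] for i in range(k)]
```

For the undirected cycle, the period is 2 when the number of nodes is even and 1 when it is odd, matching the convergence criterion discussed above.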
How long does a Belief System take to Converge?
We seek to characterize the time required by the process in equation (2) to be arbitrarily close to its limiting value in terms of properties of the graphs G_A and G_S, such as the number of agents and truth statements, and the topology of the graphs. We provide an estimate of the number of iterations required for the beliefs to be at a distance of at most ε from their final value (assuming they converge). This estimate is expressed in terms of the total variation distance (for its definition see the Methods section). For this, we define the convergence time T(ε) as the first time after which the beliefs remain within total variation distance ε of their limiting value, where the beliefs evolve according to equation (2). Informally, the value T(ε) shows the minimum number of social interactions required for the belief system to be arbitrarily close to its final value as a function of the initial disagreement. The dynamics of the belief system in equation (2) are closely related to the dynamics of a Markov chain with transition matrix B^{21}, specifically, to the ergodic properties of a random walk on the graph G_B. Particularly, consider a random walk on the set of nodes of G_B which, at each time step, jumps to a random neighbor of its current state. The relation between a random walk on a graph and the convergence properties of systems of the form of the belief system in (2) has been previously explored by Olshevsky and Tsitsiklis^{21}. In both cases, we are interested in the convergence properties of B^t as t goes to infinity. If there is a limiting distribution for a Markov chain with transition probability matrix B, then the belief system converges. Moreover, bounds on the convergence time based on the mixing properties of this Markov chain provide rates of convergence for the belief system. The convergence time of a belief system is proportional to the maximum time required for a random walk with transition probability matrix B to get absorbed into a closed strongly connected component, plus the time needed for such a component to mix sufficiently. Figure 4 illustrates this by considering two random walks with the same transition matrix. Denote by T_mix the maximum expected mixing time among all closed strongly connected components, and by T_abs the maximum expected time to get absorbed into a closed component. Assuming the belief system converges, it will be close to its limiting distribution after O((T_mix + T_abs) log(1/ε)) steps (see Supplementary Theorem 3 for a formal statement and a proof of this result). Therefore, not only do we have an estimate of the convergence time of the belief system in terms of the topology of the graph G_B, but we also know this convergence happens exponentially fast. Additionally, Lemma 2 in the Supplementary Material shows that each of the strongly connected components of the graph G_B is the product of two such components, one from the graph G_A and the other from the graph G_S. Moreover, the expected mixing (or absorbing) time for a random walk on a product graph is the maximum of the expected mixing (or absorbing) times of the individual factor graphs (see Supplementary Lemma 4). Thus, we have an explicit characterization of the convergence time in terms of the components of the network of agents and the network of logic constraints. For example, in Fig. 2, the expected absorbing time is of the order of the number of nodes in the path, that is O(m), while the expected mixing time of a cycle graph is of the order of the number of nodes squared^{22, 23, 24}, which is O(n²) in this example. Thus, the expected convergence time for the belief system is of the order of max(m, n²). Figure 5 depicts simulation results that demonstrate the validity of this bound. In particular, Fig. 5(a) shows how the convergence time changes when the number of nodes in the cycle graph increases, while Fig. 5(b) shows how the convergence time changes when the number of truth statements in the directed path graph increases. Moreover, Fig. 5(c) shows that the convergence to the final beliefs is exponentially fast. Table 1 presents the estimates for the expected convergence time for belief systems composed of well-known classic graphs; see Supplementary Fig. 13 for plots of some of these common graphs. We use existing results about the mixing time for these graphs (see Supplementary Table 3 for a detailed list of references on each of the studied graphs) to provide an estimate of the convergence time of the resulting belief system when all agents are oblivious. Particularly, our method allows the direct estimation of the dynamics of a belief system when large-scale complex networks are involved. For example, we provide convergence time bounds for the case where the networks follow random graph models, namely: geometric random graphs, Erdős-Rényi random graphs, and Newman-Watts small-world networks. These graphs are usually considered for their ability to represent the behavior of complex networks encountered in a variety of fields^{25, 26, 27, 28} (see Supplementary Fig. 14). Figure 6 shows experimental results for the convergence time of a belief system for a subset of the graphs given in Table 1. For every pair of graphs, we show how the convergence time increases as the number of agents or the number of truth statements changes. One can particularly observe the maximum-like behavior of the convergence time as predicted by the theoretical bounds, see Theorem 5 in the Supplementary Material. See Supplementary Fig. 15 and Supplementary Fig. 17 for additional numerical results on other combinations of graphs from Table 1, and Supplementary Fig. 16 and Supplementary Fig. 18 for their linear convergence rates, respectively.

Where Does a Belief System Converge?
So far we have discussed the conditions for convergence of a belief system and the corresponding convergence time. Convergence implies the existence of a vector where the set of beliefs settles as the number of interactions increases. Particularly, Proskurnikov and Tempo^{18} characterize the limiting beliefs as a solution of the linear system

(I − Λ(W ⊗ C)) x* = (I − Λ) x(0),

which can be intractable to compute when the matrices W and C are large. We are interested in a characterization of this limit vector that admits a rapid computation of its value. Supplementary Lemma 2 shows that one can always group the nodes in the graph G_B into open and closed strongly connected components. In order to guarantee convergence, we assume that every closed strongly connected component is aperiodic. Therefore, take any closed strongly connected component V_c and let B_c be the minor of the matrix B obtained by taking into account only the nodes in the set V_c. Then, B_c corresponds to the transition matrix of an irreducible and aperiodic Markov chain with a stationary distribution π_c, where π_c^T B_c = π_c^T. The vector π_c is effectively the left eigenvector of the matrix B_c corresponding to the eigenvalue 1. Let v_c(t) be the vector obtained from the state vector v(t) by taking only the components of v(t) corresponding to the nodes in the set V_c. Then,

lim_{t→∞} v_c(t) = 1_{|V_c|} π_c^T v_c(0),

where |V_c| is the cardinality of the set V_c, and 1_{|V_c|} is the vector of size |V_c| with all entries equal to 1^{18, 24}. Additionally, recall that every strongly connected component of G_B is the product of two strongly connected components, one from the network of agents and one from the logic constraint network. Thus, B_c = W_c ⊗ C_c for some matrices W_c and C_c (submatrices of W and C, respectively), which implies that π_c = π_{W_c} ⊗ π_{C_c}, i.e., the vectors π_{W_c} and π_{C_c} are the corresponding left eigenvectors of the factor components of B_c associated with the eigenvalue 1. Therefore, the final beliefs of those nodes in the closed strongly connected component are a weighted average of their initial beliefs, and the weights (sometimes referred to as the social power) are determined by the product of the left eigenvectors of the factors W_c and C_c. Particularly, the value π_c(u) indicates the limit distribution of a random walk on V_c, that is, it gives the probability that a random walk visits a particular node u of V_c after a long time. On the other hand, now consider a node u in an open strongly connected component with access to nodes w whose limiting values are v_w*; in this case, the belief of node u will converge to

lim_{t→∞} v_u(t) = Σ_w p_{uw} v_w*,

where p_{uw} is the probability of absorption of a random walk starting at node u into a node w with limiting value v_w* (see Supplementary Note 3). Therefore, the limiting value of the nodes in an open strongly connected component is a convex combination of the limiting values of the nodes it is connected to.
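For a single closed, aperiodic strongly connected component with all agents oblivious, this characterization can be turned into a short computation: take the left eigenvector of each factor matrix at eigenvalue 1 and form their Kronecker product. The sketch below is our numerical illustration under those assumptions, with hypothetical row-stochastic matrices, not the authors' code.

```python
import numpy as np

def stationary(P):
    """Left eigenvector of a row-stochastic matrix P at eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()

def limiting_beliefs(W, C, x0):
    """Predicted limit of x(t+1) = (W kron C) x(t) when the product chain is
    irreducible and aperiodic: every belief converges to the same weighted
    average of the initial beliefs x0, with weights pi_W kron pi_C."""
    pi = np.kron(stationary(W), stationary(C))
    return np.full(len(x0), pi @ x0)
```

The weight vector np.kron(stationary(W), stationary(C)) is exactly the social power referred to above: the long-run visit probabilities of a random walk on the product graph.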
Numerical Analysis of Social Networks
Next, we provide a numerical analysis of the evolution of belief systems with social network structures from large-scale networks in the Stanford Network Analysis Project (SNAP)^{29}, see Fig. 7, and logic constraints built from random graph generating models. Random graph generating models, such as the Erdős-Rényi graphs, the Newman-Watts graphs, and the geometric random graphs, have been proposed to model the dynamics and the properties of real large-scale complex networks, for example, relatively fast mixing or linear convergence of the beliefs. We use the wiki-Vote^{30}, ca-GrQc^{31}, and ego-Facebook^{32} graphs as social networks, and a binary tree, a Newman-Watts graph, and an Erdős-Rényi graph as logic constraints. The wiki-Vote network represents the aggregation of elections where Wikipedia contributors assign votes to each other to select administrators. This generates a directed social network where the edges are the votes given by the users. The ca-GrQc network represents the general relativity and quantum cosmology collaboration network for e-prints on arXiv. The nodes are authors, and edges represent co-authorship of a manuscript between two authors. Finally, the ego-Facebook network represents an anonymized set of Facebook users as nodes, and edges indicate friendships among them on the Facebook platform. Table 2 shows the description of the networks used. In the three cases, we select the largest strongly connected component of the graph and use it as a representative of the network structure and the mixing properties of the graph. Furthermore, we assume that the agents use equal weights for all their (in-)neighbors. Figure 8 shows the convergence time of a belief system when the network of agents is each of the three large-scale complex networks described in Table 2. Figure 8 considers a simplified scenario where the social network of agents and the network of logic constraints each consist of a single closed strongly connected component.
Therefore, the absorbing time is effectively zero, and the mixing time of the belief system is the maximum of the mixing time of the social network and the mixing time of the network of logic constraints. Convergence is guaranteed since both networks are made aperiodic by introducing positive self-weights for every agent. The results show that the predicted maximum-type behavior holds; that is, the convergence time of the belief system is upper bounded by the maximum mixing time of a random walk on the graph of agents and the graph of logic constraints. The convergence time remains constant, and of the order of the convergence time of the network of agents, until the mixing time of the network formed by the logic constraints becomes larger. Then, the total convergence time increases based on the specific topology of the graph of logic constraints. Figure 9 shows the exponential convergence rate of the belief system described in Fig. 8. Figure 10 shows the cumulative influence of the nodes in each of the graphs, i.e., the weight an ordered subset of the nodes has on the final value of the beliefs. In this case, since we are considering a single strongly connected component, the weights are determined by the left eigenvector of the weight matrix corresponding to the eigenvalue 1.
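The maximum-like behavior observed here has a simple spectral reading (our remark, using the standard fact that the eigenvalues of a Kronecker product are the pairwise products of the factors' eigenvalues): since both factors have leading eigenvalue 1, the second-largest eigenvalue modulus of W ⊗ C, which governs the mixing rate, is the larger of the two factors' second-largest moduli. A minimal numerical check with hypothetical row-stochastic matrices:

```python
import numpy as np

def slem(P):
    """Second-largest eigenvalue modulus of a stochastic matrix."""
    mags = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
    return mags[1]

rng = np.random.default_rng(2)
W = rng.random((6, 6)); W /= W.sum(axis=1, keepdims=True)   # hypothetical agent network
C = rng.random((4, 4)); C /= C.sum(axis=1, keepdims=True)   # hypothetical logic constraints

# the product chain is only as slow as its slowest factor
gap_product = slem(np.kron(W, C))
```

So the slower of the two networks dictates the spectral gap of the belief system, mirroring the plateau-then-increase shape of the convergence-time curves in Fig. 8.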
Discussion
In a recent paper, Friedkin et al.^{13} proposed a new model that integrates logic constraints into the evolution of the opinions of a group of agents in a belief system. Logic constraints among truth statements have a significant impact on the evolution of opinion dynamics. Such restrictions can be modeled as graphs that represent the favorable or unfavorable influence the beliefs on specific topics have on others. Starting from this context, we have approached this model from its extended representation of a belief system, where the opinions of all agents on all topics, as well as their corresponding initial values, are nodes in a larger graph. This larger graph is composed of the Kronecker product of the graphs corresponding to the network of agents and the network of logical constraints, respectively. In this study, we have provided graph-theoretic arguments for the characterization of the convergence properties of such opinion dynamics models, based on the extensive existing knowledge of the convergence and mixing times of random walks on graphs from the theory of Markov chains. We have shown that convergence occurs if every closed strongly connected component of the network of logic constraints is aperiodic and every closed strongly connected component of oblivious agents is aperiodic as well. Moreover, for the beliefs to be arbitrarily close to their limiting value we require O((T_mix + T_abs) log(1/ε)) time steps. The parameter T_mix is the maximum coupling time for a random walk among the closed strongly connected components of the product graph, and T_abs is the maximum time required for a random walk that starts in an open component to get absorbed by a closed component. Our analysis applies to broad classes of networks of agents and logic constraints, for which we have provided bounds in terms of the number of nodes in the graphs. Finally, we show that the limiting opinion value is a convex combination of the limiting values of the nodes in the closed strongly connected components, and that this convergence happens exponentially fast.
Our framework offers analytical tools that deepen our abilities for the modeling, control, and synthesis of complex network systems, mainly human-made ones, and can inspire further research in domains where opinion formation and networks interact naturally, such as neuroscience and the social sciences. Finally, extending this analysis to other opinion formation models that use different aggregation strategies may require further study of Markov processes and random walks.
Methods
Directed Graphs^{20}
We define a directed graph G = (V, E) as a set of nodes V and a set of edges E, where the elements of E are ordered pairs (u, v) with u, v ∈ V. A path of G is a finite sequence of nodes v_1, v_2, …, v_k such that (v_i, v_{i+1}) ∈ E for i = 1, …, k − 1. Moreover, define the length of a path as its number of edges. A pair of nodes u, v is strongly connected if there is a path from u to v and from v to u. We say a directed graph is strongly connected if each pair of nodes of V is strongly connected. A cycle of a graph is a path such that v_1 = v_k, i.e., the start and end nodes of the path are the same. We denote the period of a directed graph G as p(G), and define it as the greatest common divisor of the lengths of all cycles in the graph G.

Random Walks, Mixing and Markov Chains
Consider a finite directed graph G = (V, E) composed of a set of nodes V with a set of edges E, and a compliant associated row-stochastic matrix P, called the transition matrix. A random walk on the graph G is the event of a token moving from one node to an out-neighbor according to a probability distribution determined by the transition matrix. The dynamics of the random walk are modeled by a Markov chain X_t such that Pr[X_{t+1} = v | X_t = u] = P_{uv}, with (u, v) ∈ E. This Markov chain is called ergodic if it is irreducible and aperiodic. For an ergodic Markov chain, there exists a unique stationary distribution π, which describes the probability that a random walk visits a particular node in the graph as the time goes to infinity, that is, Pr[X_t = u] → π_u as t → ∞. The stationary distribution is invariant for the transition matrix, that is, π^T P = π^T. It follows that analyzing the convergence of a random walk to the stationary distribution reduces to analyzing powers of P (see Levin et al.^{24}). The distance to stationarity at a time t, i.e., after t transitions of the Markov chain, or t steps in the random walk, is defined as

d(t) = max_{u ∈ V} ‖P^t(u, ·) − π‖_TV,

where ‖μ − ν‖_TV is the total variation distance between two probability distributions μ and ν, defined as

‖μ − ν‖_TV = (1/2) Σ_{u ∈ V} |μ(u) − ν(u)|.

Moreover, the mixing time of the Markov chain is

t_mix(ε) = min{ t : d(t) ≤ ε },

and we say the Markov chain has (relatively) rapid mixing if t_mix(ε) is polynomial in log(1/ε) and the size of the graph. Finally, the mixing time can be bounded in terms of the eigenvalues of the matrix P as

t_mix(ε) ≤ (1 / (1 − |λ_2|)) log(1 / (ε min_u π_u)),   (3)

where λ_2 is the eigenvalue of the transition matrix with the second-largest absolute value^{33}.
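These definitions can be evaluated directly on a small chain. The sketch below (our illustration; the lazy walk on a 5-node cycle is a hypothetical example) computes d(t) from powers of P and reads off the mixing time by linear search:

```python
import numpy as np

def tv_distance(mu, nu):
    """Total variation distance between two probability vectors."""
    return 0.5 * np.abs(mu - nu).sum()

def distance_to_stationarity(P, pi, t):
    """d(t) = max over starting nodes of || P^t(u, .) - pi ||_TV."""
    Pt = np.linalg.matrix_power(P, t)
    return max(tv_distance(Pt[u], pi) for u in range(len(P)))

def mixing_time(P, pi, eps):
    """Smallest t with d(t) <= eps (linear search)."""
    t = 0
    while distance_to_stationarity(P, pi, t) > eps:
        t += 1
    return t

# lazy random walk on the 5-node cycle: stay w.p. 1/2, else step to a uniform neighbor
n = 5
P = np.zeros((n, n))
for i in range(n):
    P[i, i] = 0.5
    P[i, (i - 1) % n] = 0.25
    P[i, (i + 1) % n] = 0.25
pi = np.full(n, 1.0 / n)   # uniform is stationary for this doubly stochastic chain
```

The chain is doubly stochastic, so the uniform distribution is stationary; making the walk lazy (holding probability 1/2) guarantees aperiodicity regardless of the cycle length.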
The Coupling Method
The technical advances in this paper are mostly made by using the coupling method, which is a way to bound the mixing time of Markov chains. Consider two independent Markov chains X_t and Y_t with the same transition matrix P. Then, define the coupling time T_c as the smallest t such that X_t = Y_t, that is,

T_c = min{ t ≥ 0 : X_t = Y_t }.

Note that T_c is a random variable and it depends on P as well as on the initial distributions of the processes X_t and Y_t. Finally, define the quantity T_couple as the maximum expected coupling time of a Markov chain with transition matrix P over all possible initial distributions of the processes X_t and Y_t, i.e.,

T_couple = max_{X_0, Y_0} E[T_c].

In words, this is the maximum expected time it takes for two random walks, with the same transition matrix and arbitrary initial states, to intersect. Now assume X_t starts from the stationary distribution π, and Y_t from some other arbitrary stochastic vector, and couple the processes X_t and Y_t by defining a new process Z_t such that

Z_t = Y_t for t < T_c, and Z_t = X_t for t ≥ T_c.

The key insight of the coupling method is that Z_t is identically distributed to Y_t; this follows by conditioning on the events {T_c ≤ t} and {T_c > t}. Therefore, questions about the distribution of Y_t can be answered by considering Z_t instead. By starting the chain X_t in the stationary distribution, these considerations imply that

‖Pr[Y_t = ·] − π‖_TV ≤ Pr[T_c > t],

because if T_c ≤ t then Z_t = X_t; for more details, see Lindvall^{34}. Thus, it follows by the Markov inequality that

Pr[T_c > t] ≤ E[T_c] / t ≤ T_couple / t.

Setting t = 2 T_couple implies that the distance to stationarity is at most 1/2. Thus, it follows that after O(T_couple log(1/ε)) steps, it holds that ‖Pr[Y_t = ·] − π‖_TV ≤ ε, for any ε > 0, with π being the stationary distribution of the Markov chain. Since the mixing time is bounded by the expected coupling time^{24}, the same applies to the quantity t_mix. The coupling method is the primary technical tool we use in this work. In Supplementary Note 3, we use the coupling method to bound the convergence time of equation (1) in terms of the coupling times on the underlying social network and on the logic constraint graph. Because the coupling time over the Kronecker product is, up to a multiplicative constant, the maximum of the coupling times of the factors, this allows us to analyze the effect of the social network and the logic constraint graph on the convergence time separately.
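The coupling argument can also be simulated. In the sketch below (our illustration with a hypothetical chain: a lazy walk on the complete graph with 6 nodes), two independent walks with the same transition matrix run until they first meet, and the empirical meeting times estimate the expected coupling time that appears in the bound above.

```python
import numpy as np

def coupling_time(P, i, j, rng, t_max=10_000):
    """Steps until two independent walks, started at i and j, first occupy the same node."""
    x, y = i, j
    for t in range(1, t_max + 1):
        x = rng.choice(len(P), p=P[x])
        y = rng.choice(len(P), p=P[y])
        if x == y:
            return t
    return t_max

# lazy random walk on the complete graph K_6: stay with probability 1/2,
# otherwise jump to one of the 5 other nodes uniformly
n = 6
P = np.full((n, n), 0.5 / (n - 1))
np.fill_diagonal(P, 0.5)

rng = np.random.default_rng(3)
times = [coupling_time(P, 0, 1, rng) for _ in range(200)]
```

For this chain, two walks at distinct nodes meet at the next step with probability 0.14, so the coupling time is geometric with mean about 7; averaging the sampled times recovers this.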
Network of Agents | Logic Constraints | Maximum Expected Convergence Time
Complete | Directed Path |
Cycle | Directed Path |
Cycle | Path |
Dumbbell Graph | Complete Binary Tree |
d-Hypercube | Complete Binary Tree |
d-Grid | Star |
d-Grid | Two Joined Stars |
d-Grid | Star |
d-Torus | d-Grid |
d-Torus | Star |
d-Torus | d-Grid |
Lollipop | Star |
Dumbbell | Star |
Eulerian: degree and expansion | Dumbbell |
Eulerian: degree, max-degree weights | Dumbbell |
Lazy Eulerian with degree | Dumbbell |
Lamplighter on Hypercube | Bolas |
Lamplighter on Torus | Bolas |
Geometric Random: | Bolas |
Geometric Random: | Bolas |
Erdős-Rényi: | Dumbbell |
Erdős-Rényi: | Newman-Watts |
Erdős-Rényi: | Dumbbell |
Erdős-Rényi: | Dumbbell |
Newman-Watts: | Path |
Expander | Path |
Any Connected Undirected Graph with Metropolis Weights | Expander |
Any Connected Undirected Graph | Expander |
Graph | Nodes | Edges | Type | Upper Bound on Mixing Time | Description
wiki-Vote^{30} | | | Directed | | Wikipedia who-votes-on-whom network
ca-GrQc^{31} | | | Undirected | | Collaboration network of arXiv General Relativity
ego-Facebook^{32} | | | Undirected | | Social circles from Facebook
References
 1 Converse, P. E. & Apter, D. E. Ideology and its Discontents (Free Press, 1964).
 2 Feldman, S. Structure and consistency in public opinion: The role of core beliefs and values. American Journal of Political Science 416–440 (1988).
 3 Acemoglu, D., Dahleh, M. A., Lobel, I. & Ozdaglar, A. Bayesian learning in social networks. The Review of Economic Studies 78, 1201–1236 (2011).
 4 Jackson, M. O. Social and Economic Networks (Princeton University Press, 2010).
 5 Hegselmann, R. & Krause, U. Opinion dynamics driven by various ways of averaging. Computational Economics 25, 381–405 (2005).
 6 Mirtabatabaei, A. & Bullo, F. Opinion dynamics in heterogeneous networks: convergence conjectures and theorems. SIAM Journal on Control and Optimization 50, 2763–2785 (2012).
 7 Friedkin, N. E. The problem of social control and coordination of complex systems in sociology: A look at the community cleavage problem. IEEE Control Systems 35, 40–51 (2015).
 8 Cartwright, D. E. Studies in Social Power. (Univer. Michigan, 1959).
 9 Friedkin, N. E. & Johnsen, E. C. Social Influence Network Theory: A Sociological Examination of Small Group Dynamics, vol. 33 (Cambridge University Press, 2011).
 10 DeGroot, M. H. Reaching a consensus. Journal of the American Statistical Association 69, 118–121 (1974).
 11 Abelson, R. P. Mathematical models of the distribution of attitudes under controversy. Contributions to Mathematical Psychology 14, 1–160 (1964).
 12 Parsegov, S. E., Proskurnikov, A. V., Tempo, R. & Friedkin, N. E. Novel multidimensional models of opinion dynamics in social networks. IEEE Transactions on Automatic Control (2016).
 13 Friedkin, N. E., Proskurnikov, A. V., Tempo, R. & Parsegov, S. E. Network science on belief system dynamics under logic constraints. Science 354, 321–326 (2016).
 14 Butts, C. T. Why I know but don’t believe. Science 354, 286–287 (2016).
 15 Amblard, F. & Deffuant, G. The role of network topology on extremism propagation with the relative agreement opinion dynamics. Physica A: Statistical Mechanics and its Applications 343, 725–738 (2004).
 16 Fortunato, S. Damage spreading and opinion dynamics on scale-free networks. Physica A: Statistical Mechanics and its Applications 348, 683–690 (2005).
 17 van der Linden, S. Determinants and measurement of climate change risk perception, worry, and concern. In Nisbet, M. et al. (eds.) The Oxford Encyclopedia of Climate Change Communication (Oxford University Press, Oxford, UK, 2017).
 18 Proskurnikov, A. V. & Tempo, R. A tutorial on modeling and analysis of dynamic social networks. Part I. Annual Reviews in Control 43, 65–79 (2017).
 19 Tarjan, R. Depth-first search and linear graph algorithms. SIAM Journal on Computing 1, 146–160 (1972).
 20 McAndrew, M. H. On the product of directed graphs. Proceedings of the American Mathematical Society 14, 600–606 (1963).
 21 Olshevsky, A. & Tsitsiklis, J. N. Degree fluctuations and the convergence time of consensus algorithms. In Proc. 50th IEEE Conf. Decision and Control and European Control Conf, 6602–6607 (2011).
 22 Gerencsér, B. Markov chain mixing time on cycles. Stochastic Processes and Their Applications 121, 2553–2570 (2011).
23 Tahbaz-Salehi, A. & Jadbabaie, A. Small world phenomenon, rapidly mixing Markov chains, and average consensus algorithms. In 46th IEEE Conference on Decision and Control, 276–281 (IEEE, 2007).
 24 Levin, D. A., Peres, Y. & Wilmer, E. L. Markov Chains and Mixing Times (American Mathematical Soc., 2009).
 25 Watts, D. J. Small Worlds: The Dynamics of Networks between Order and Randomness (Princeton University Press, 1999).
26 Barabási, A.-L. Linked: How Everything is Connected to Everything Else and What It Means (Plume, 2003).
 27 Ganguly, N., Deutsch, A. & Mukherjee, A. Dynamics On and Of Complex Networks: Applications to Biology, Computer Science, and the Social Sciences (Springer, 2009).
 28 Bornholdt, S. & Schuster, H. G. Handbook of Graphs and Networks: from the Genome to the Internet (John Wiley & Sons, 2006).
 29 Leskovec, J. & Krevl, A. SNAP Datasets: Stanford large network dataset collection. http://snap.stanford.edu/data (2014).
30 Leskovec, J., Huttenlocher, D. & Kleinberg, J. Signed networks in social media. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1361–1370 (ACM, 2010).
 31 Leskovec, J., Kleinberg, J. & Faloutsos, C. Graph evolution: Densification and shrinking diameters. ACM Transactions on Knowledge Discovery from Data (TKDD) 1, 2 (2007).
 32 Leskovec, J. & McAuley, J. J. Learning to discover social circles in ego networks. In Advances in Neural Information Processing Systems, 539–547 (2012).
 33 Diaconis, P. & Stroock, D. Geometric bounds for eigenvalues of Markov chains. The Annals of Applied Probability 36–61 (1991).
 34 Lindvall, T. Lectures on the Coupling Method (Courier Corporation, 2002).
 35 Penrose, M. Random Geometric Graphs (Oxford University Press, 2003).
36 Erdős, P. & Rényi, A. On the evolution of random graphs. Publ. Math. Inst. Hung. Acad. Sci 5, 17–60 (1960).
37 Newman, M. E. & Watts, D. J. Renormalization group analysis of the small-world network model. Physics Letters A 263, 341–346 (1999).
 38 Ikeda, S., Kubo, I. & Yamashita, M. The hitting and cover times of random walks on finite graphs using local degree information. Theoretical Computer Science 410, 94–100 (2009).
 39 Beveridge, A. & Wang, M. Exact mixing times for random walks on trees. Graphs and Combinatorics 29, 757–772 (2013).
 40 Kannan, R., Lovász, L. & Montenegro, R. Blocking conductance and mixing in random walks. Combinatorics, Probability and Computing 15, 541–570 (2006).
 41 Aldous, D. & Fill, J. Reversible Markov chains and random walks on graphs (2002).
42 Avin, C. & Ercal, G. Bounds on the mixing time and partial cover of ad-hoc and sensor networks. In EWSN, 1–12 (2005).
 43 Chandra, A. K., Raghavan, P., Ruzzo, W. L., Smolensky, R. & Tiwari, P. The electrical resistance of a graph captures its commute and cover times. Computational Complexity 6, 312–340 (1996).
44 Montenegro, R. The simple random walk and max-degree walk on a directed graph. Random Structures & Algorithms 34, 395–407 (2009).
 45 Boczkowski, L., Peres, Y. & Sousi, P. Sensitivity of mixing times in Eulerian digraphs. arXiv preprint arXiv:1603.05639 (2016).
 46 Boyd, S. P., Ghosh, A., Prabhakar, B. & Shah, D. Mixing times for random walks on geometric random graphs. In ALENEX/ANALCO, 240–249 (2005).
 47 Avin, C. & Ercal, G. On the cover time and mixing time of random geometric graphs. Theoretical Computer Science 380, 2–22 (2007).
 48 Benjamini, I., Kozma, G. & Wormald, N. The mixing time of the giant component of a random graph. Random Structures & Algorithms 45, 383–407 (2014).
 49 Fountoulakis, N. & Reed, B. The evolution of the mixing rate. arXiv preprint math/0701474 (2007).
 50 Ding, J., Kim, J. H., Lubetzky, E. & Peres, Y. Anatomy of a young giant component in the random graph. Random Structures & Algorithms 39, 139–178 (2011).
51 Ding, J., Lubetzky, E., Peres, Y. et al. Mixing time of near-critical random graphs. The Annals of Probability 40, 979–1008 (2012).
 52 Nachmias, A. & Peres, Y. Critical random graphs: diameter and mixing time. The Annals of Probability 1267–1286 (2008).
53 Addario-Berry, L. & Lei, T. The mixing time of the Newman–Watts small-world model. Advances in Applied Probability 47, 37–56 (2015).
 54 Durrett, R. Random Graph Dynamics (Cambridge University Press, UK, 2007).
 55 Olshevsky, A. Linear time average consensus and distributed optimization on fixed graphs. SIAM Journal on Control and Optimization 55, 3990–4014 (2017).
 56 Weichsel, P. M. The Kronecker product of graphs. Proceedings of the American Mathematical Society 13, 47–52 (1962).
57 Cormen, T. H., Leiserson, C. E., Rivest, R. L. & Stein, C. Introduction to Algorithms (The MIT Press, 2009), 3rd edn.
 58 Kemeny, J. G. & Snell, J. L. Finite Markov chains (Springer, 1983).
Acknowledgements
This research is supported partially by the National Science Foundation under grants no. CPS 1544953 and no. CMMI-1463262, and by the Office of Naval Research under grant no. N00014-12-1-0998.
Author Contributions
AN, AO, and CAU conceived the project, derived the analytical results and wrote the manuscript. CAU performed the numerical simulations and analyzed the data.
Additional Information
The authors declare that they have no competing financial interests. Correspondence and requests for materials should be addressed to CAU (cauribe@mit.edu).
Supplementary Material
Supplementary Equation
(4) 
Network Topology  Mixing Time

Complete
Cycle
Path ^{38, 39}
Star Graph ^{39}
Two Joined Star Graphs
Dumbbell Graph ^{40}
Lollipop ^{41}
Bolas Graph ^{41}
Complete Binary Tree ^{24}
d-Hypercube ^{24}
d-Grid ^{42, 43}
d-Grid ^{42, 43}
d-Grid ^{42, 43}
d-Torus ^{24}
d-Torus ^{24}
d-Torus ^{24}
Eulerian Graph ^{44}
Lazy Eulerian with degree ^{45}
Eulerian: degree, max-degree weights and expansion ^{44}
Geometric Random Graph: ^{46}
Geometric Random Graph: ^{47}
Erdős–Rényi: , ^{48, 49}
Erdős–Rényi: , ^{50, 51}
Erdős–Rényi: ^{52}
Newman–Watts (small-world) Graph ^{53}
Expander Graph ^{54}
Any Connected Undirected Graph with Metropolis weights ^{55}
Any Connected Graph
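The qualitative separation among the topologies in the table above can be probed numerically. The sketch below is an illustrative assumption rather than the authors' code: it brute-forces the total-variation mixing time of the lazy simple random walk on small cycle and complete graphs, showing the cycle's mixing time growing with the number of vertices while the complete graph mixes in a constant number of steps.

```python
import numpy as np

def lazy_walk(adj):
    """Transition matrix of the lazy simple random walk, P = (I + D^{-1} A) / 2."""
    deg = adj.sum(axis=1)
    return 0.5 * (np.eye(len(adj)) + adj / deg[:, None])

def mixing_time(P, eps=0.25, t_max=100_000):
    """Smallest t such that max_x || P^t(x, .) - pi ||_TV <= eps, by brute force."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])  # leading left eigenvector
    pi = pi / pi.sum()                               # normalize to a distribution
    Pt = np.eye(len(P))
    for t in range(1, t_max + 1):
        Pt = Pt @ P
        if 0.5 * np.abs(Pt - pi).sum(axis=1).max() <= eps:
            return t
    return None

def cycle(n):
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1.0
    return A

def complete(n):
    return np.ones((n, n)) - np.eye(n)

for n in (8, 16, 32):
    # Cycle mixing time grows with n; the complete graph's stays constant.
    print(n, mixing_time(lazy_walk(cycle(n))), mixing_time(lazy_walk(complete(n))))
```

The threshold eps = 1/4 is the conventional choice in the mixing-time literature; any fixed eps < 1/2 gives the same asymptotic scalings.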
Supplementary Note 1: The Kronecker Product of Graphs
In this note, we define the Kronecker product of two matrices and the Kronecker product of two graphs. We also present the properties that we will use in the proofs of our main results on the convergence, convergence time, and limiting values of belief systems.
Definition 1.
^{56} Let $A$ be an $m \times n$ matrix and $B$ be a $p \times q$ matrix. The Kronecker product $A \otimes B$ is the $mp \times nq$ matrix defined as:
$$A \otimes B = \left[ a_{ij} B \right],$$
or explicitly
$$A \otimes B = \begin{bmatrix} a_{11} B & \cdots & a_{1n} B \\ \vdots & \ddots & \vdots \\ a_{m1} B & \cdots & a_{mn} B \end{bmatrix}.$$
Next, we will enumerate some useful properties of the Kronecker product.

Bilinearity and associativity: for matrices $A$, $B$ and $C$ of compatible dimensions, and a scalar $k$, it holds that:
$$(kA) \otimes B = k (A \otimes B), \qquad (A + B) \otimes C = A \otimes C + B \otimes C, \qquad (A \otimes B) \otimes C = A \otimes (B \otimes C).$$

Non-commutativity: in general, $A \otimes B \neq B \otimes A$. However, there exist commutation matrices $P$ and $Q$ such that:
$$A \otimes B = P (B \otimes A) Q,$$
and if $A$ and $B$ are square matrices then $Q = P^\top$.

Mixed-product property: for matrices $A$, $B$, $C$ and $D$ of compatible dimensions:
$$(A \otimes B)(C \otimes D) = (AC) \otimes (BD).$$
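These identities are easy to check numerically. The short sketch below (illustrative only, not part of the paper) verifies bilinearity, associativity, and the mixed-product property with NumPy's `np.kron` on random rectangular matrices of compatible dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((4, 2))
C = rng.standard_normal((3, 5))
D = rng.standard_normal((2, 4))
k = 2.5

# Bilinearity: scalars factor out and the product distributes over sums.
assert np.allclose(np.kron(k * A, B), k * np.kron(A, B))
assert np.allclose(np.kron(A + A, B), np.kron(A, B) + np.kron(A, B))

# Associativity.
assert np.allclose(np.kron(np.kron(A, B), C), np.kron(A, np.kron(B, C)))

# Mixed-product property: (A x B)(C x D) = (AC) x (BD).
# Shapes: kron(A, B) is 8x6, kron(C, D) is 6x20, so the product is 8x20.
assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))
print("all Kronecker identities hold")
```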
Next, we introduce the Kronecker product of graphs and some of its properties.
Definition 2 (Definition 1 in Weichsel^{56}).
The Kronecker (also known as categorical, direct, cardinal, relational, tensor, weak direct or conjunction) product $G \otimes H$ of two graphs $G = (V_G, E_G)$ and $H = (V_H, E_H)$ is a graph with vertex set $V_{G \otimes H} = V_G \times V_H$, where $((g, h), (g', h')) \in E_{G \otimes H}$ if and only if $(g, g') \in E_G$ and $(h, h') \in E_H$. Moreover, the adjacency matrix of the graph $G \otimes H$ is the Kronecker product of the adjacency matrices of $G$ and $H$.
Theorem 1 (Theorem 1 in McAndrew^{20}).
Let $G$ and $H$ be strongly connected graphs. Let $d_G$ and $d_H$ denote the greatest common divisors of the cycle lengths of $G$ and $H$, respectively, and let $d = \gcd(d_G, d_H)$. Then, the number of components in $G \otimes H$ is $d$. Moreover, for any component $C$ of $G \otimes H$, $d_C = \operatorname{lcm}(d_G, d_H)$.
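The component count in Theorem 1 can be sanity-checked on directed cycles, whose period equals their length: the product of a $p$-cycle and a $q$-cycle should decompose into $\gcd(p, q)$ strongly connected components. The sketch below is an illustrative check (not the authors' code), using SciPy's strongly connected component routine.

```python
import numpy as np
from math import gcd
from scipy.sparse.csgraph import connected_components

def directed_cycle(n):
    """Adjacency matrix of the directed n-cycle; every cycle length is a multiple of n."""
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, (i + 1) % n] = 1
    return A

for p, q in [(4, 6), (3, 5), (6, 9)]:
    # Adjacency matrix of the Kronecker product graph.
    Ak = np.kron(directed_cycle(p), directed_cycle(q))
    n_comp, _ = connected_components(Ak, directed=True, connection='strong')
    print(f"p={p}, q={q}: {n_comp} strongly connected components, gcd = {gcd(p, q)}")
```

Each component here is a single directed cycle of length $\operatorname{lcm}(p, q)$, consistent with the theorem.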
Supplementary Note 2: Main Technical Results
We now describe the proof of our main result, namely the number of interactions required for a belief system to be arbitrarily close to its limiting set of beliefs. We start with a technical lemma about the strongly connected components of the product of two graphs.
Lemma 2.
Given two graphs $G$ and $H$, every strongly connected component of the Kronecker product graph $G \otimes H$ is the result of the Kronecker product of a strongly connected component of $G$ and a strongly connected component of $H$.
Proof.
Let $A_G$ and $A_H$ denote the adjacency matrices of the graphs $G$ and $H$, respectively. We can construct a condensation of each graph by contracting every strongly connected component to a single vertex, resulting in a directed acyclic graph. Thus, a topological ordering is possible (see Cormen et al.^{57}), and there always exist two permutation matrices $P_G$ and $P_H$ such that we can rearrange the matrices $A_G$ and $A_H$ into a block upper triangular form where each diagonal block corresponds to a strongly connected component, that is,
$$P_G A_G P_G^\top = \begin{bmatrix} A_G^{1} & \ast & \cdots & \ast \\ 0 & A_G^{2} & & \vdots \\ \vdots & & \ddots & \ast \\ 0 & \cdots & 0 & A_G^{k} \end{bmatrix}, \qquad P_H A_H P_H^\top = \begin{bmatrix} A_H^{1} & \ast & \cdots & \ast \\ 0 & A_H^{2} & & \vdots \\ \vdots & & \ddots & \ast \\ 0 & \cdots & 0 & A_H^{m} \end{bmatrix}.$$
Moreover, define $P = P_G \otimes P_H$; by the properties of the Kronecker product, cf. Definition 1, it follows that
$$P \left( A_G \otimes A_H \right) P^\top = \left( P_G A_G P_G^\top \right) \otimes \left( P_H A_H P_H^\top \right),$$
where $P$ is also a permutation matrix and
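The conjugation identity used in this step can be checked numerically. The sketch below (illustrative, with randomly chosen matrices) applies the mixed-product property twice to confirm that conjugating $A_G \otimes A_H$ by $P_G \otimes P_H$ acts block-wise, and that the Kronecker product of two permutation matrices is again a permutation matrix.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_permutation_matrix(n, rng):
    """A 0/1 matrix with exactly one 1 per row and per column."""
    P = np.zeros((n, n), dtype=int)
    P[np.arange(n), rng.permutation(n)] = 1
    return P

Ag = rng.integers(0, 2, size=(4, 4))   # adjacency matrix of a small graph G
Ah = rng.integers(0, 2, size=(3, 3))   # adjacency matrix of a small graph H
Pg = random_permutation_matrix(4, rng)
Ph = random_permutation_matrix(3, rng)

# Mixed-product property applied twice:
# (Pg x Ph)(Ag x Ah)(Pg x Ph)^T = (Pg Ag Pg^T) x (Ph Ah Ph^T)
P = np.kron(Pg, Ph)
lhs = P @ np.kron(Ag, Ah) @ P.T
rhs = np.kron(Pg @ Ag @ Pg.T, Ph @ Ah @ Ph.T)
assert np.allclose(lhs, rhs)

# P = Pg x Ph is itself a permutation matrix: P P^T = I.
assert np.allclose(P @ P.T, np.eye(12))
print("conjugation identity verified")
```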