Appendix
8. Compute Stationary Probabilities. The vector of stationary probabilities $\pi$ is the eigenvector of $P^{\mathsf T}$ corresponding to the eigenvalue $1$, where $P$ is the transition matrix. Since the underlying graph is strongly connected, Perron–Frobenius theory guarantees this eigenvalue is simple. The number of other eigenvalues of modulus 1 is equal to the periodicity of the graph or random walk; for instance, a bipartite graph will have an eigenvalue $-1$. If per is the period of the graph and we use $p \ge \mathrm{per}$ vectors in the following algorithm, then the algorithm is guaranteed to converge at a rate bounded by the modulus of the $(p+1)$-st largest eigenvalue [29], since the eigenvalue $1$ is known and is a simple eigenvalue of largest modulus.
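The periodicity claim can be checked numerically. The following sketch (assuming NumPy; the 4-cycle test graph is our own example, not from the text) verifies that a bipartite graph's random-walk transition matrix has both $+1$ and $-1$ as eigenvalues of modulus 1:

```python
import numpy as np

# A 4-cycle is bipartite (period 2), so its random-walk transition matrix
# should have exactly two eigenvalues of modulus 1, namely -1 and +1.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

eigvals = np.linalg.eigvals(P)
mod_one = sorted(ev.real for ev in eigvals if abs(abs(ev) - 1.0) < 1e-9)
print(mod_one)   # the two eigenvalues of modulus 1 are -1 and +1
```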
Algorithm 3. Modified Subspace Iteration [29].
Input: matrix $P$.
Output: eigenvector $\pi$ of $P^{\mathsf T}$ corresponding to the eigenvalue 1.

1. Set $Q_0 = [q_1, \dots, q_p]$, an $n \times p$ matrix with orthonormal columns, where $q_1$ is all nonnegative.

2. Repeat for $k = 1, 2, \dots$ until convergence:

(a) Set $Z_k = P^{\mathsf T} Q_{k-1}$.

(b) Compute the Schur decomposition $B_k = U_k T_k U_k^{\mathsf T}$ of the small projected matrix $B_k = Q_{k-1}^{\mathsf T} Z_k$, with the diagonal entries of $T_k$ ordered to put the entry closest to 1 in the $(1,1)$ position.

(c) Set $Q_k$ to an orthonormal basis for the columns of $Z_k U_k$ (e.g., by a QR factorization). Ensure the first column of $Q_k$ is all nonnegative (flipping the sign of that column, if necessary).

3. Return the first column of $Q_k$, normalized so its entries sum to 1.
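The iteration above can be sketched in NumPy. This is a hedged simplification: it substitutes an ordinary eigendecomposition of the small projected matrix for the ordered Schur decomposition of [29], and the function name, default subspace size, and iteration budget are our own choices:

```python
import numpy as np

def stationary_subspace_iteration(P, p=2, iters=200, seed=0):
    """Sketch of Algorithm 3: block power iteration with P^T, extracting the
    Ritz vector whose Ritz value is closest to 1."""
    n = P.shape[0]
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(np.abs(rng.standard_normal((n, p))))
    for _ in range(iters):
        Z = P.T @ Q                      # power step with P^T
        Q, _ = np.linalg.qr(Z)           # re-orthonormalize the block
    B = Q.T @ P.T @ Q                    # small p x p projected matrix
    w, U = np.linalg.eig(B)              # stand-in for the ordered Schur step
    k = int(np.argmin(np.abs(w - 1.0)))  # Ritz value closest to 1
    pi = (Q @ U[:, k]).real
    pi = pi if pi.sum() > 0 else -pi     # fix the overall sign
    return pi / pi.sum()                 # normalize to a probability vector
```

For an irreducible aperiodic chain, the returned vector satisfies $\pi P = \pi$ to machine precision.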
9. Restarted GMRES. The heart of the computation of the pseudoinverse is the use of the Lemma in the main text to convert a pseudoinverse computation into an ordinary inverse computation. The restarted GMRES algorithm has received much attention in the literature (see [26] and the references therein), with many enhancements for numerical stability that do not increase the cost by more than a constant factor. For the purpose of establishing the overall cost of the algorithm, we show a simplified sketch of the basic algorithm. By using restarted GMRES we bound the cost of each iteration.
Algorithm 4. Arnoldi-based Restarted GMRES.
Input: matrix $A$, right-hand side $b$;
hyperparameters: restart count $m$, outer iteration limit maxit, tolerance tol, initial vector $x_0$.
Output: solution $x$ such that $\|Ax - b\|_2 \le \mathrm{tol} \cdot \|b\|_2$.

1. Compute $r_0 = b - A x_0$ and set $x = x_0$.

2. For $j = 1, \dots, \mathrm{maxit}$:

(a) Set $\beta = \|r_0\|_2$ and set $v_1 = r_0 / \beta$.

(b) If $\beta \le \mathrm{tol} \cdot \|b\|_2$, return solution $x$.

(c) Generate an orthonormal Arnoldi basis $V_m = [v_1, \dots, v_m]$ for the Krylov space $\mathcal{K}_m(A, r_0) = \operatorname{span}\{r_0, A r_0, \dots, A^{m-1} r_0\}$, and an $(m+1) \times m$ upper Hessenberg matrix $\widehat{H}_m$ such that $A V_m = V_{m+1} \widehat{H}_m$.

(d) Compute $y_m = \arg\min_y \|\beta e_1 - \widehat{H}_m y\|_2$.

(e) Set $x \leftarrow x + V_m y_m$ and $r_0 \leftarrow b - A x$.
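The algorithm above can be sketched in NumPy as follows. This is a simplified, unpreconditioned version: the function name and default hyperparameters are our choices, and the small least squares problem in step (d) is solved with a generic dense solver rather than the Givens-rotation update used in production codes [26]:

```python
import numpy as np

def gmres_restarted(A, b, m=10, maxit=50, tol=1e-10, x0=None):
    """Sketch of Algorithm 4: restarted GMRES(m) via the Arnoldi process."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x
    for _ in range(maxit):
        beta = np.linalg.norm(r)                      # step (a)
        if beta <= tol * np.linalg.norm(b):           # step (b): converged
            return x
        V = np.zeros((n, m + 1))
        H = np.zeros((m + 1, m))
        V[:, 0] = r / beta
        for j in range(m):                            # step (c): Arnoldi
            w = A @ V[:, j]
            for i in range(j + 1):                    # modified Gram-Schmidt
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] > 0:
                V[:, j + 1] = w / H[j + 1, j]
        e1 = np.zeros(m + 1)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H, e1, rcond=None)    # step (d): small LS solve
        x += V[:, :m] @ y                             # step (e): update iterate
        r = b - A @ x
    return x
```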
The cost of one outer step 2 of restarted GMRES is $O(m \cdot c_{mv} + m^2 n)$ [26]. Here $c_{mv}$ is the cost of one matrix-vector product involving the sparse matrix $A$. This takes one floating-point multiply and one floating-point add for each nonzero matrix element, so the cost is $c_{mv} = O(\operatorname{nnz}(A))$. The number of outer iterations required is controlled by the eigenstructure of the symmetric part $(A + A^{\mathsf T})/2$, which is related to the separability of the underlying graph [6]. Note that step (d) is an $(m+1) \times m$ least squares problem costing only $O(m^2)$ to solve, due to the special Hessenberg structure of $\widehat{H}_m$. Regarding the number of GMRES iterations, we have the following bound, which yields the Theorem in the main text as an immediate consequence.
Theorem 5 ([11, 10, 21, 17]). Let $A$ be a matrix such that its Hermitian part $H = (A + A^{\mathsf H})/2$ is Hermitian positive definite, and let $\lambda_{\min}(H)$ denote the smallest eigenvalue of $H$. The residual $r_k$ obtained by GMRES [27] after $k$ steps applied to the linear system $Ax = b$ satisfies
$$\|r_k\|_2 \;\le\; \left(1 - \frac{\lambda_{\min}(H)^2}{\|A\|_2^2}\right)^{k/2} \|r_0\|_2.$$
We give a sketch of a proof, referring to [11, 10, 21, 17] for detailed proofs, including several tighter bounds. First we need the following lemma.
Lemma 6. Let $A \in \mathbb{C}^{n \times n}$ with positive definite Hermitian part $H = (A + A^{\mathsf H})/2$, and let $r \ne 0$ be given. Let $\alpha = \dfrac{(Ar)^{\mathsf H} r}{(Ar)^{\mathsf H}(Ar)}$. Then
$$\|r - \alpha A r\|_2^2 \;\le\; \left(1 - \lambda_{\min}(H)\,\lambda_{\min}(\widetilde H)\right)\|r\|_2^2 \;\le\; \left(1 - \frac{\lambda_{\min}(H)^2}{\|A\|_2^2}\right)\|r\|_2^2,$$
where $\widetilde H = (A^{-1} + A^{-\mathsf H})/2$ is the Hermitian part of $A^{-1}$.
Proof. $\alpha$ is the value achieving the minimum in the scalar least squares problem $\min_\alpha \|r - \alpha A r\|_2$ and hence satisfies the Galerkin condition $(r - \alpha A r) \perp A r$. So we have
$$\|r - \alpha A r\|_2^2 \;=\; \|r\|_2^2 - \frac{|r^{\mathsf H} A r|^2}{\|A r\|_2^2} \;=\; \left(1 - \frac{|r^{\mathsf H} A r|}{\|r\|_2^2}\cdot\frac{|r^{\mathsf H} A r|}{\|A r\|_2^2}\right)\|r\|_2^2.$$
A well-known result on the field of values, for any matrix $B$ whose Hermitian part $H(B)$ is positive definite, is the inequality [16]
$$\operatorname{Re}\,(x^{\mathsf H} B x) \;\ge\; \lambda_{\min}(H(B))\; x^{\mathsf H} x \qquad \text{for all } x.$$
Applied to $B = A$ with $x = r$, this bounds the first factor below by $\lambda_{\min}(H)$; applied to $B = A^{-1}$ with $x = A r$, it bounds the second factor below by $\lambda_{\min}(\widetilde H)$, where $\widetilde H = (A^{-1} + A^{-\mathsf H})/2$. Hence the first inequality follows. The remaining inequality follows from the identity
$$\widetilde H \;=\; \frac{A^{-1} + A^{-\mathsf H}}{2} \;=\; A^{-1}\,\frac{A + A^{\mathsf H}}{2}\,A^{-\mathsf H} \;=\; A^{-1} H A^{-\mathsf H}.$$
Inverting both sides and taking norms yields
$$\frac{1}{\lambda_{\min}(\widetilde H)} \;=\; \|\widetilde H^{-1}\|_2 \;=\; \|A^{\mathsf H} H^{-1} A\|_2 \;\le\; \frac{\|A\|_2^2}{\lambda_{\min}(H)},$$
that is, $\lambda_{\min}(\widetilde H) \ge \lambda_{\min}(H)/\|A\|_2^2$. $\square$
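A quick numerical sanity check of this type of field-of-values bound (assuming the standard residual-minimizing step length $\alpha = r^{\mathsf T} A r / \|Ar\|_2^2$ in the real case; the random test matrix is our own choice):

```python
import numpy as np

# Check both inequalities of the lemma on a random matrix whose
# symmetric (Hermitian) part is positive definite.
rng = np.random.default_rng(3)
n = 25
A = 2 * np.eye(n) + 0.2 * rng.standard_normal((n, n))
r = rng.standard_normal(n)

H = (A + A.T) / 2                                   # Hermitian part of A
Ht = (np.linalg.inv(A) + np.linalg.inv(A).T) / 2    # Hermitian part of A^{-1}
lam_H = np.linalg.eigvalsh(H).min()
lam_Ht = np.linalg.eigvalsh(Ht).min()
assert lam_H > 0 and lam_Ht > 0                     # both parts positive definite

alpha = (r @ A @ r) / np.linalg.norm(A @ r) ** 2    # Galerkin-optimal step
lhs = np.linalg.norm(r - alpha * (A @ r)) ** 2
mid = (1 - lam_H * lam_Ht) * np.linalg.norm(r) ** 2
rhs = (1 - lam_H ** 2 / np.linalg.norm(A, 2) ** 2) * np.linalg.norm(r) ** 2
assert lhs <= mid + 1e-12 <= rhs + 1e-12            # both inequalities hold
```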
Sketch of Proof of Theorem 5. GMRES on a matrix $A$ with initial residual $r_0$ will find in $k$ steps a solution with a residual satisfying $\|r_k\|_2 = \min_{p \in \mathcal{P}_k}\|p(A)\, r_0\|_2$, where $\mathcal{P}_k$ is the set of all polynomials $p$ of degree up to $k$ satisfying $p(0) = 1$. In particular, after a single step $r_1 = (I - \alpha A)\, r_0$, where $\alpha$ is chosen to minimize $\|r_1\|_2$. Hence we have the bound from Lemma 6: $\|r_1\|_2 \le \rho\,\|r_0\|_2$ with $\rho = \left(1 - \lambda_{\min}(H)^2/\|A\|_2^2\right)^{1/2}$. This amounts to a single step of a variant of the classical Richardson iteration. Repeating this Richardson iteration yields a residual of the form $r_k = \prod_{i=1}^{k} (I - \alpha_i A)\, r_0$, whose residual polynomial lies in $\mathcal{P}_k$. The norm of the residual after $k$ Richardson steps would be bounded above by this convergence rate and bounded below by the norm of the GMRES residual:
$$\|r_k^{\mathrm{GMRES}}\|_2 \;\le\; \|r_k^{\mathrm{Richardson}}\|_2 \;\le\; \rho^k\, \|r_0\|_2. \qquad \square$$
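The argument can also be checked numerically: running the residual-minimizing Richardson iteration from the sketch above on a random matrix with positive definite symmetric part (the test matrix and iteration count are our own choices) confirms the per-step contraction by $\rho$:

```python
import numpy as np

# Verify ||r_k|| <= rho^k ||r_0|| for the residual-minimizing Richardson
# iteration, with rho = sqrt(1 - lambda_min(H)^2 / ||A||_2^2).
rng = np.random.default_rng(0)
n = 40
A = 3 * np.eye(n) + 0.2 * rng.standard_normal((n, n))   # symmetric part pos. def.
b = rng.standard_normal(n)

Hpart = (A + A.T) / 2
lam_min = np.linalg.eigvalsh(Hpart).min()
rho = np.sqrt(1 - lam_min ** 2 / np.linalg.norm(A, 2) ** 2)

x = np.zeros(n)
r = b.copy()
r0 = np.linalg.norm(r)
for k in range(1, 31):
    alpha = (r @ A @ r) / np.linalg.norm(A @ r) ** 2    # step length from Lemma 6
    x += alpha * r
    r = b - A @ x
    assert np.linalg.norm(r) <= rho ** k * r0 + 1e-12   # per-step contraction
print(np.linalg.norm(r) / r0, rho ** 30)
```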
References
 [1] Réka Albert and Albert-László Barabási. Statistical mechanics of complex networks. Rev. Mod. Phys., 74:47–97, Jan 2002.
 [2] A. Berman and R. J. Plemmons. Nonnegative Matrices in the Mathematical Sciences. Academic Press, New York, 1979. Reprinted by SIAM, Philadelphia, 1994.
 [3] Daniel Boley, Alejandro Buendia, and Golshan Golnari. Random walk laplacian and network centrality measures. arxiv.org/abs/1808.02912, 2018.
 [4] Daniel Boley, Gyan Ranjan, and Zhi-Li Zhang. Commute times for a directed graph using an asymmetric Laplacian. Lin. Alg. & Appl., 435:224–242, 2011.
 [5] Alejandro Buendia and Daniel Boley. Optimized graph-based trust mechanisms using hitting times. In AAMAS Intl Workshop on Trust in Agent Societies, May 2017.
 [6] Fan Chung. Laplacians and the Cheeger inequality for directed graphs. Annals of Combinatorics, 9(1):1–19, April 2005.
 [7] Fan R. K. Chung. Spectral Graph Theory. American Mathematical Society, 1997.
 [8] Michael B. Cohen, Jon Kelner, John Peebles, Richard Peng, Aaron Sidford, and Adrian Vladu. Faster algorithms for computing the stationary distribution, simulating random walks, and more. In IEEE 57th Annual Symp. on Found. Comput. Sci. (FOCS), pages 583–592, Oct 2016.
 [9] Michael B. Cohen, Jonathan Kelner, Rasmus Kyng, John Peebles, Richard Peng, Anup B. Rao, and Aaron Sidford. Solving directed Laplacian systems in nearly-linear time through sparse LU factorizations. arxiv.org/abs/1811.10722, 2018.
 [10] M. Eiermann and O. G. Ernst. Geometric aspects of the theory of Krylov subspace methods. Acta Numer., 10:251–312, 2001.
 [11] Howard C. Elman. Iterative methods for large sparse nonsymmetric systems of linear equations. PhD thesis, Yale University, New Haven, 1982.
 [12] François Fouss, Alain Pirotte, Jean-Michel Renders, and Marco Saerens. Random-walk computation of similarities between nodes of a graph with application to collaborative recommendation. IEEE Transactions on Knowledge and Data Engineering, 19(3):355–369, 2007.
 [13] David Gleich, Leonid Zhukov, and Pavel Berkhin. Fast parallel pagerank: A linear system approach. In WWW 2005, 2005.

 [14] Golshan Golnari, Zhi-Li Zhang, and Daniel Boley. Markov fundamental tensor and its applications to network analysis. Linear Algebra and Appl., 564:126–158, 2019.
 [15] Charles M. Grinstead and J. Laurie Snell. Introduction to Probability. American Mathematical Society, 2nd edition, 2006. www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/book.html.
 [16] R. A. Horn and C. R. Johnson. Topics in Matrix Analysis. Cambridge University Press, Cambridge, 1991.
 [17] Wayne Joubert. On the convergence behavior of the restarted GMRES algorithm for solving nonsymmetric linear systems. Num. Lin. Alg. Appl., 1(5):427–447, 1994.

 [18] J. G. Kemeny and J. L. Snell. Finite Markov Chains. Springer-Verlag, 1976.
 [19] Yanhua Li and Zhi-Li Zhang. Random walks on digraphs: A theoretical framework for estimating transmission costs in wireless routing. In The 29th IEEE Conference on Computer Communications (IEEE INFOCOM 2010), March 2010.
 [20] Yanhua Li and Zhi-Li Zhang. Random walks on digraphs, the generalized digraph Laplacian and the degree of asymmetry. In 7th Workshop on Algorithms and Models for Webgraphs WAW'10, Dec 13–17, 2010. (co-located with WINE'10), www.cs.umn.edu/ ~zhzhang/ Papers/ Yanhua Li WAW10 .pdf.
 [21] Jörg Liesen and Petr Tichý. The field of values bound on ideal GMRES. arxiv.org/abs/1211.5969, 2012.
 [22] Brandon Liu, David Parkes, and Sven Seuken. Personalized hitting time for informative trust mechanisms despite sybils. In Int’l Conf. Auto. Agents & Multiagent Sys. (AAMAS), 2016.
 [23] J. Norris. Markov Chains. Cambridge Univ. Press, 1997.
 [24] L. Page, S. Brin, R. Motwani, and T. Winograd. The PageRank citation ranking: Bringing order to the web. TR SIDL-WP-1999-0120, Computer Systems Laboratory, Stanford Univ., 1998. ilpubs.stanford.edu:8090/422/.
 [25] M. Richardson, R. Agrawal, and P. Domingos. Trust management for the semantic web. In ISWC, 2003. (Data from https://snap.stanford.edu/data/soc-Epinions1.html).
 [26] Y. Saad. Iterative Methods for Sparse Linear Systems. SIAM, 2nd edition, 2003.
 [27] Y. Saad and M. Schultz. GMRES: A generalized minimal residual algorithm for solving nonsymmetric linear systems. SIAM J. Sci. and Stat. Comput., 7:856–869, 1986.
 [28] D. A. Spielman and S.-H. Teng. Nearly linear time algorithms for preconditioning and solving symmetric, diagonally dominant linear systems. SIAM J. Matrix Anal., 35:835–885, 2014.
 [29] G. W. Stewart. Simultaneous iteration for computing invariant subspaces of non-Hermitian matrices. Numer. Math., 25:123–136, 1976.