Spectral Sparsification and Regret Minimization Beyond Matrix Multiplicative Updates

06/16/2015
by   Zeyuan Allen-Zhu, et al.

In this paper, we provide a novel construction of the linear-sized spectral sparsifiers of Batson, Spielman and Srivastava [BSS14]. While previous constructions required Ω(n^4) running time [BSS14, Zou12], our sparsification routine can be implemented in almost-quadratic running time O(n^{2+ε}). The fundamental conceptual novelty of our work is leveraging a strong connection between sparsification and a regret minimization problem over density matrices. This connection was known to provide an interpretation of the randomized sparsifiers of Spielman and Srivastava [SS11] via the application of matrix multiplicative weight updates (MWU) [CHS11, Vis14]. In this paper, we explain how matrix MWU naturally arises as an instance of the Follow-the-Regularized-Leader framework and generalize this approach to yield a larger class of updates. This new class allows us to accelerate the construction of linear-sized spectral sparsifiers, and gives novel insights into the motivation behind Batson, Spielman and Srivastava [BSS14].
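
As background for the connection the abstract describes, the sketch below illustrates the standard matrix multiplicative weight update over density matrices (the [CHS11, Vis14] interpretation), i.e. the solution of Follow-the-Regularized-Leader with the von Neumann entropy regularizer: the iterate is the matrix exponential of the negated, scaled cumulative loss, normalized to unit trace. This is a minimal illustration only, not the paper's new class of updates or its sparsification routine; the function name, step size, and toy data are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import expm

def matrix_mwu_density(loss_matrices, eta=0.1):
    """Standard matrix MWU over density matrices (illustrative sketch).

    At step t the prediction is X_t = exp(-eta * sum_{k<t} F_k) / Tr(...),
    which is the FTRL iterate with the von Neumann entropy regularizer.
    """
    n = loss_matrices[0].shape[0]
    cumulative = np.zeros((n, n))
    densities = []
    for F in loss_matrices:
        X = expm(-eta * cumulative)
        X /= np.trace(X)          # normalize to a unit-trace density matrix
        densities.append(X)
        cumulative += F           # observe the symmetric loss matrix, accumulate
    return densities

# Toy usage with random symmetric loss matrices
rng = np.random.default_rng(0)
losses = [(lambda A: (A + A.T) / 2)(rng.standard_normal((4, 4))) for _ in range(5)]
Xs = matrix_mwu_density(losses, eta=0.2)
print(np.trace(Xs[-1]))  # approximately 1.0
```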
