Low-Rank Updates of Matrix Square Roots

01/31/2022
by Shany Shumeli, et al.

Models in which the covariance matrix has the structure of a sparse matrix plus a low-rank perturbation are ubiquitous in machine learning applications. It is often desirable for learning algorithms to take advantage of such structure, avoiding costly matrix computations that typically require cubic time and quadratic storage. This is often accomplished by performing operations that preserve the structure, e.g. matrix inversion via the Sherman-Morrison-Woodbury formula. In this paper we consider the matrix square root and inverse square root operations. Given a low-rank perturbation to a matrix, we argue that a low-rank approximate correction to the (inverse) square root exists. We do so by establishing a geometric decay bound on the true correction's eigenvalues. We then frame the correction as the solution of an algebraic Riccati equation, and discuss how a low-rank solution to that equation can be computed. We analyze the approximation error incurred when the algebraic Riccati equation is solved only approximately, providing forward and backward error bounds in the spectral and Frobenius norms. Finally, we describe several applications of our algorithms, and demonstrate their utility in numerical experiments.
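As a rough numerical illustration of the two ideas in the abstract (not the paper's algorithm), the NumPy sketch below checks the Sherman-Morrison-Woodbury identity for a rank-k update, and then observes that the true correction to the matrix square root induced by that update has rapidly decaying singular values, i.e. it is numerically low rank. The matrix sizes and the way `A` and `U` are generated are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3

# Symmetric positive definite base matrix A and a rank-k update U U^T.
Q = rng.standard_normal((n, n))
A = Q @ Q.T + n * np.eye(n)
U = rng.standard_normal((n, k))
B = A + U @ U.T

# Sherman-Morrison-Woodbury: (A + U U^T)^{-1}
#   = A^{-1} - A^{-1} U (I_k + U^T A^{-1} U)^{-1} U^T A^{-1},
# which preserves the "inverse plus low rank" structure.
A_inv = np.linalg.inv(A)
K = np.linalg.inv(np.eye(k) + U.T @ A_inv @ U)
B_inv_smw = A_inv - A_inv @ U @ K @ U.T @ A_inv

def sqrtm_psd(M):
    """Principal square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

# True correction E = sqrt(A + U U^T) - sqrt(A). It satisfies the quadratic
# (Riccati-type) equation  sqrt(A) E + E sqrt(A) + E^2 = U U^T.
E = sqrtm_psd(B) - sqrtm_psd(A)

# The singular values of E decay quickly past the first k, so a low-rank
# approximate correction to the square root exists.
s = np.linalg.svd(E, compute_uv=False)
print("normalized singular values of the correction:", s[: 2 * k] / s[0])
```

The printed ratios drop sharply after the first few entries, which is the empirical counterpart of the geometric eigenvalue-decay bound the paper establishes.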
