Krylov Methods are (nearly) Optimal for Low-Rank Approximation

04/06/2023
by   Ainesh Bakshi, et al.

We consider the problem of rank-1 low-rank approximation (LRA) in the matrix-vector product model under various Schatten norms: $\min_{\|u\|_2 = 1} \|A(I - uu^\top)\|_{\mathcal{S}_p}$, where $\|M\|_{\mathcal{S}_p}$ denotes the $\ell_p$ norm of the singular values of $M$. Given $\varepsilon > 0$, our goal is to output a unit vector $v$ such that $\|A(I - vv^\top)\|_{\mathcal{S}_p} \le (1+\varepsilon) \min_{\|u\|_2 = 1} \|A(I - uu^\top)\|_{\mathcal{S}_p}$. Our main result shows that Krylov methods (nearly) achieve the information-theoretically optimal number of matrix-vector products for Spectral ($p=\infty$), Frobenius ($p=2$), and Nuclear ($p=1$) LRA. In particular, for Spectral LRA, we show that any algorithm requires $\Omega(\log(n)/\varepsilon^{1/2})$ matrix-vector products, exactly matching the upper bound obtained by Krylov methods [MM15, BCW22]. Our lower bound addresses Open Question 1 in [Woo14], providing evidence for the lack of progress on algorithms for Spectral LRA, and resolves Open Question 1.2 in [BCW22]. Next, we show that for any fixed constant $p$, i.e., $1 \le p = O(1)$, there is an upper bound of $O(\log(1/\varepsilon)/\varepsilon^{1/3})$ matrix-vector products, implying that the complexity does not grow as a function of the input size. This improves the $O(\log(n/\varepsilon)/\varepsilon^{1/3})$ bound recently obtained in [BCW22] and matches their $\Omega(1/\varepsilon^{1/3})$ lower bound up to a $\log(1/\varepsilon)$ factor.
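For concreteness, the Krylov upper bound for Spectral LRA can be realized with a short iteration. The sketch below is a minimal illustration of the standard Krylov (Lanczos-style) approach to rank-1 LRA, not the exact algorithm of [MM15, BCW22]; the callables `matvec`, `rmatvec`, and the iteration count `q` are illustrative assumptions. Since the best unit vector $u$ is the top right singular vector of $A$, the sketch builds a Krylov space for $B = A^\top A$ and extracts a candidate via Rayleigh-Ritz; each iteration costs two matrix-vector products, so $q = \Theta(\log(n)/\varepsilon^{1/2})$ iterations correspond to the $O(\log(n)/\varepsilon^{1/2})$ product bound discussed above.

```python
import numpy as np

def krylov_rank1(matvec, rmatvec, n, q, rng=None):
    """Approximate the top right singular vector of an implicit matrix A.

    matvec(x)  -> A @ x    (one matrix-vector product)
    rmatvec(y) -> A.T @ y  (one matrix-vector product with the transpose)
    n          -> number of columns of A
    q          -> number of Krylov iterations (2 matvecs each)
    """
    rng = np.random.default_rng() if rng is None else rng
    g = rng.standard_normal(n)
    V = [g / np.linalg.norm(g)]          # orthonormal Krylov basis for B = A^T A
    for _ in range(q):
        w = rmatvec(matvec(V[-1]))       # one application of B = A^T A
        for v in V:                      # full reorthogonalization for stability
            w -= (v @ w) * v
        nw = np.linalg.norm(w)
        if nw < 1e-12:                   # Krylov space became invariant; stop early
            break
        V.append(w / nw)
    Q = np.stack(V, axis=1)              # shape (n, k), orthonormal columns
    # Rayleigh-Ritz: top eigenvector of Q^T (A^T A) Q, computed from A @ Q.
    AQ = np.column_stack([matvec(Q[:, j]) for j in range(Q.shape[1])])
    T = AQ.T @ AQ
    _, evecs = np.linalg.eigh(T)         # eigenvalues in ascending order
    return Q @ evecs[:, -1]              # candidate unit vector v
```

As a usage sketch, for an explicit matrix `A` one would pass `matvec=lambda x: A @ x` and `rmatvec=lambda y: A.T @ y`; the returned `v` is then plugged into $\|A(I - vv^\top)\|_{\mathcal{S}_\infty}$ to check the $(1+\varepsilon)$ guarantee.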
