KSRFILS
Krylov Subspace Recycling for Fast Iterative Least Squares in Machine Learning using Data Sparse Matrix Computations
Solving symmetric positive definite linear systems is a fundamental computational task in machine learning. The exact solution is, famously, cubically expensive in the size of the matrix. To alleviate this cost, several linear-time approximations, such as spectral and inducing-point methods, have been proposed and are now in wide use. These are low-rank approximations that fix the low-rank space a priori and do not refine it over time. While this allows cost linear in the data-set size, it also incurs a finite, uncorrected approximation error. Researchers in numerical linear algebra have explored ways to iteratively refine such low-rank approximations at the cost of a small number of matrix-vector multiplications. This idea is particularly attractive in the many situations in machine learning where one must solve a sequence of related symmetric positive definite linear systems. From the machine learning perspective, such deflation methods can be interpreted as transfer learning of a low-rank approximation across a time series of numerical tasks. We study the use of these methods for our field. Our empirical results show that, on regression and classification problems of intermediate size, this approach can interpolate between low computational cost and numerical precision.
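The deflation idea described in the abstract can be illustrated with a minimal NumPy sketch (the function names, the synthetic test problem, and the choice of plain conjugate gradients are illustrative assumptions, not the paper's actual implementation): a low-rank subspace `W`, recycled from earlier solves, supplies a Galerkin starting guess whose residual is orthogonal to `W`, so the subsequent Krylov iteration effectively sees a deflated spectrum and converges in far fewer matrix-vector products.

```python
import numpy as np

def cg(A, b, x0, tol=1e-6, maxiter=500):
    """Plain conjugate gradients for SPD A; returns (solution, iterations)."""
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for k in range(maxiter):
        if np.sqrt(rs) < tol * np.linalg.norm(b):
            return x, k
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, maxiter

def deflated_start(A, b, W):
    """Galerkin projection of the solution onto the recycled subspace W,
    leaving a starting residual orthogonal to W (the deflation step)."""
    AW = A @ W
    return W @ np.linalg.solve(W.T @ AW, W.T @ b)

# Illustrative SPD problem: three tiny eigenvalues make plain CG slow.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((200, 200)))
lam = np.concatenate([np.array([1e-3, 2e-3, 3e-3]),
                      np.linspace(1.0, 200.0, 197)])
A = (Q * lam) @ Q.T          # A = Q diag(lam) Q^T
b = rng.standard_normal(200)

# Pretend W was recycled from earlier, related solves: it (approximately)
# spans the troublesome small-eigenvalue directions.
W = Q[:, :3]
x_plain, it_plain = cg(A, b, np.zeros(200))
x_defl, it_defl = cg(A, b, deflated_start(A, b, W))
```

On this toy problem the deflated start removes the ill-conditioned directions, so `it_defl` is much smaller than `it_plain`; in the recycling setting the subspace would be updated from the Krylov vectors of each solve rather than taken from a known eigendecomposition.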