Krylov Subspace Recycling for Fast Iterative Least-Squares in Machine Learning

06/01/2017
by Filip de Roos, et al.

Solving symmetric positive definite linear problems is a fundamental computational task in machine learning. The exact solution is, famously, cubically expensive in the size of the matrix. To alleviate this cost, several linear-time approximations, such as spectral and inducing-point methods, have been suggested and are now in wide use. These are low-rank approximations that fix the low-rank space a priori and do not refine it over time. While this allows cost linear in the dataset size, it also incurs a finite, uncorrected approximation error. Researchers in numerical linear algebra have explored ways to iteratively refine such low-rank approximations, at the cost of a small number of matrix-vector multiplications. This idea is particularly interesting in the many situations in machine learning where one has to solve a sequence of related symmetric positive definite linear problems. From the machine learning perspective, such deflation methods can be interpreted as transfer learning of a low-rank approximation across a time series of numerical tasks. We study the use of such methods for our field. Our empirical results show that, on regression and classification problems of intermediate size, this approach can interpolate between low computational cost and numerical precision.
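To make the deflation idea concrete, below is a minimal NumPy sketch of deflated conjugate gradients, in which a low-rank subspace span(W) recycled from an earlier, related solve is projected out of the Krylov iteration. This is an illustrative simplification under stated assumptions, not the paper's exact algorithm; the function name deflated_cg and the toy problem are hypothetical choices for exposition.

```python
import numpy as np


def deflated_cg(A, b, W, tol=1e-8, maxiter=500):
    """Solve the SPD system A x = b with deflated conjugate gradients,
    recycling the subspace span(W) from an earlier, related solve.

    A : (n, n) symmetric positive definite matrix
    b : (n,) right-hand side
    W : (n, k) deflation basis with linearly independent columns
    """
    AW = A @ W                         # k matrix-vector products, paid once
    G = W.T @ AW                       # small (k, k) Galerkin matrix W^T A W

    def project(v):
        # Make v A-orthogonal to span(W): v - W (W^T A W)^{-1} (A W)^T v
        return v - W @ np.linalg.solve(G, AW.T @ v)

    # Galerkin starting guess on span(W); this guarantees W^T r0 = 0
    x = W @ np.linalg.solve(G, W.T @ b)
    r = b - A @ x
    p = project(r)
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        # Keep every new search direction A-orthogonal to span(W)
        p = project(r) + (rs_new / rs) * p
        rs = rs_new
    return x


# Hypothetical sequence of two related SPD problems: recycle the slow
# eigendirections of the first matrix when solving the second.
rng = np.random.default_rng(0)
n, k = 500, 10
X = rng.standard_normal((n, n))
A1 = X @ X.T / n + 1e-2 * np.eye(n)   # first SPD system
A2 = A1 + 1e-3 * np.eye(n)            # closely related follow-up system
b = rng.standard_normal(n)

_, V = np.linalg.eigh(A1)             # eigenvalues in ascending order
W = V[:, :k]                          # smallest modes slow plain CG most
x = deflated_cg(A2, b, W)
print(np.linalg.norm(A2 @ x - b))     # small residual
```

In this sketch the deflation basis is built from the smallest eigenvectors of the previous matrix, since those directions dominate CG's convergence; in practice such a basis would instead be accumulated cheaply from the Krylov iterates of earlier solves.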


research
01/28/2021

Two-level Nyström–Schur preconditioner for sparse symmetric positive definite matrices

Randomized methods are becoming increasingly popular in numerical linear...
research
11/30/2020

Low rank approximation of positive semi-definite symmetric matrices using Gaussian elimination and volume sampling

Positive semi-definite matrices commonly occur as normal matrices of lea...
research
01/24/2021

Low-rank signal subspace: parameterization, projection and signal estimation

The paper contains several theoretical results related to the weighted n...
research
10/06/2021

Randomized Nyström Preconditioning

This paper introduces the Nyström PCG algorithm for solving a symmetric ...
research
10/27/2019

Spectral Algorithm for Low-rank Multitask Regression

Multitask learning, i.e. taking advantage of the relatedness of individu...
research
06/12/2020

Linear Time Sinkhorn Divergences using Positive Features

Although Sinkhorn divergences are now routinely used in data sciences to...
research
12/24/2022

Reconstructing Kernel-based Machine Learning Force Fields with Super-linear Convergence

Kernel machines have sustained continuous progress in the field of quant...
