Fast and stable deterministic approximation of general symmetric kernel matrices in high dimensions

02/10/2021, by Difeng Cai, et al.

Kernel methods are used frequently in many machine learning applications. At large scale, their success hinges on the ability to operate with a large, dense kernel matrix K. To reduce the computational cost, Nyström methods can efficiently compute a low-rank approximation to a symmetric positive semi-definite (SPSD) matrix K through landmark points, and many variants have been developed in recent years. For indefinite kernels, however, it has not even been established whether Nyström approximations are applicable. In this paper, we study for the first time, both theoretically and numerically, the Nyström method for approximating general symmetric kernels, including indefinite ones. We first develop a unified theoretical framework for analyzing Nyström approximations that is valid for both SPSD and indefinite kernels and is independent of the specific scheme for selecting landmark points. To address the accuracy and numerical stability issues in Nyström approximation, we then study the impact of data geometry on the spectral properties of the corresponding kernel matrix and leverage discrepancy theory to propose the anchor net method for computing Nyström approximations. The anchor net method operates entirely on the dataset, without requiring access to K or its matrix-vector products, and scales linearly for both SPSD and indefinite kernel matrices. Extensive numerical experiments suggest that indefinite kernels are much more challenging than SPSD kernels, and that most existing methods suffer from numerical instability. Results on various kernels and machine learning datasets demonstrate that the new method resolves the numerical instability and achieves better accuracy at lower computational cost than state-of-the-art Nyström methods.
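The Nyström construction the abstract refers to is simple to state: given m landmark points, the n-by-n kernel matrix K is approximated from its n-by-m cross block C and its m-by-m landmark block W as K ≈ C W⁺ Cᵀ. The sketch below is a minimal illustration of this classical construction, not the paper's anchor net method; the Gaussian kernel, the uniform landmark sampling, and all function names are illustrative assumptions. The pseudoinverse of W is used so the same formula applies formally to indefinite kernels, though, as the abstract notes, stability then depends heavily on how the landmarks are chosen.

    import numpy as np

    def nystrom_approximation(X, kernel, landmark_idx):
        # Classical Nystrom: K ~ C @ pinv(W) @ C.T, where
        # C = kernel(X, landmarks) is n x m and
        # W = kernel(landmarks, landmarks) is m x m.
        landmarks = X[landmark_idx]
        C = kernel(X, landmarks)
        W = kernel(landmarks, landmarks)
        # pinv handles rank-deficient and (formally) indefinite W.
        return C @ np.linalg.pinv(W) @ C.T

    def gaussian_kernel(A, B, sigma=1.0):
        # SPSD Gaussian kernel; illustrative choice only.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 10))
    idx = rng.choice(500, size=20, replace=False)  # uniform landmarks
    K_approx = nystrom_approximation(X, gaussian_kernel, idx)

By contrast, the anchor net method described in the abstract selects landmarks deterministically, guided by discrepancy theory and the geometry of the dataset, rather than by random sampling.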


Related research

- Revisiting Memory Efficient Kernel Approximation: An Indefinite Learning Perspective (12/18/2021). Matrix approximations are a key element in large-scale algebraic machine...
- Sparse and low-rank approximations of large symmetric matrices using biharmonic interpolation (05/30/2017). Symmetric matrices are widely used in machine learning problems such as ...
- Towards a Unified Quadrature Framework for Large-Scale Kernel Machines (11/03/2020). In this paper, we develop a quadrature framework for large-scale kernel ...
- Kernels on Sample Sets via Nonparametric Divergence Estimates (02/01/2012). Most machine learning algorithms, such as classification or regression, ...
- Improved Fixed-Rank Nyström Approximation via QR Decomposition: Practical and Theoretical Aspects (08/08/2017). The Nyström method is a popular technique for computing fixed-rank appro...
- Nyström landmark sampling and regularized Christoffel functions (05/29/2019). Selecting diverse and important items from a large set is a problem of i...
- Computationally Efficient Approximations for Matrix-based Renyi's Entropy (12/27/2021). The recently developed matrix based Renyi's entropy enables measurement ...
