
On the numerical rank of radial basis function kernels in high dimension

06/23/2017
by   Ruoxi Wang, et al.

Low-rank approximations are popular methods for reducing the high computational cost of algorithms involving large-scale kernel matrices. The success of low-rank methods hinges on the rank of the matrix, and in practice these methods prove effective even for high-dimensional datasets. This practical success motivates the theoretical analysis in this paper of the function rank, which is an upper bound on the matrix rank. We introduce the concept of function rank, defined as the number of terms in a minimal separable form of a kernel function. Focusing on radial basis function (RBF) kernels in particular, we approximate the kernel by a low-rank representation given by a finite sum of separable products, and provide explicit upper bounds on the function rank and on the L_∞ error of this approximation. Our three main results are as follows. First, for a fixed precision, the function rank of RBFs grows, in the worst case, polynomially with the data dimension. Second, precise error bounds for the low-rank approximations in the L_∞ norm are derived in terms of the function smoothness and the domain diameter. Last, a group pattern in the magnitudes of the singular values of RBF kernel matrices is observed, analyzed, and explained by a grouping of the expansion terms in the kernel's low-rank representation. Empirical results verify the theoretical findings.
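The phenomenon the abstract describes can be seen numerically. The sketch below (not the paper's code; the sample size, dimension, bandwidth, and precision threshold are arbitrary choices for illustration) builds a Gaussian RBF kernel matrix on random points and counts the singular values above a relative tolerance, i.e. the numerical rank at that precision:

```python
import numpy as np

# Illustrative example: the numerical rank of a Gaussian RBF kernel
# matrix is typically far smaller than its size n.
rng = np.random.default_rng(0)
n, d = 500, 3                       # n points in dimension d (arbitrary)
X = rng.uniform(size=(n, d))        # points in the unit cube [0, 1]^d

# Pairwise squared distances, then K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
sq = np.sum(X**2, axis=1)
D2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
sigma = 1.0
K = np.exp(-D2 / (2.0 * sigma**2))

# Numerical rank at relative precision eps: count singular values
# exceeding eps times the largest singular value.
s = np.linalg.svd(K, compute_uv=False)
eps = 1e-6
num_rank = int(np.sum(s > eps * s[0]))
print(f"matrix size n = {n}, numerical rank at eps = {eps}: {num_rank}")
```

Plotting the singular values `s` on a log scale for small `d` also makes the grouped decay pattern discussed in the paper visible: the magnitudes drop in plateaus corresponding to groups of expansion terms.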

