Learning the kernel matrix via predictive low-rank approximations

01/17/2016
by Martin Stražar, et al.

Efficient and accurate low-rank approximations of multiple data sources are essential in the era of big data. The scaling of kernel-based learning algorithms to large datasets is limited by the O(n^2) computation and storage complexity of the full kernel matrix, which is required by most recent kernel learning algorithms. We present the Mklaren algorithm, which simultaneously approximates multiple kernel matrices and learns a regression model, and is based entirely on geometrical concepts. The algorithm does not require access to the full kernel matrices, yet it accounts for the correlations between all kernels. It uses incomplete Cholesky decomposition, where pivot selection is based on least-angle regression in the combined, low-dimensional feature space. The algorithm has linear complexity in the number of data points and kernels. When the explicit feature space induced by a kernel can be constructed, a mapping from the dual to the primal Ridge regression weights is used for model interpretation. The Mklaren algorithm was tested on eight standard regression datasets. It outperforms contemporary kernel matrix approximation approaches when learning with multiple kernels, and it identifies relevant kernels, achieving higher explained variance than other multiple kernel learning methods for the same number of iterations. Test accuracy equivalent to that obtained with full kernel matrices was reached at significantly lower approximation ranks. A difference in run times of two orders of magnitude was observed when either the number of samples or the number of kernels exceeded 3000.
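To make the abstract's pipeline concrete, here is a minimal, self-contained sketch (not the paper's implementation): it runs standard greedy incomplete Cholesky on each kernel matrix and fits Ridge regression on the stacked low-rank features. The function names, the largest-residual-diagonal pivot rule, and the use of precomputed kernel matrices are simplifying assumptions; Mklaren selects pivots via least-angle regression in the combined feature space and needs only the kernel columns at the chosen pivots rather than the full matrices.

import numpy as np

def incomplete_cholesky(K, rank):
    """Greedy incomplete Cholesky decomposition of a PSD kernel matrix K.
    Returns G (n x rank) with K ~= G @ G.T, plus the selected pivot indices.
    Pivots here are chosen by the largest residual diagonal entry; Mklaren
    instead selects pivots by least-angle regression in the combined
    low-dimensional feature space (not reproduced in this sketch)."""
    n = K.shape[0]
    G = np.zeros((n, rank))
    d = np.diag(K).copy()              # residual diagonal of K - G @ G.T
    pivots = []
    for j in range(rank):
        i = int(np.argmax(d))          # greedy pivot: largest residual variance
        pivots.append(i)
        G[:, j] = (K[:, i] - G[:, :j] @ G[i, :j]) / np.sqrt(d[i])
        d = np.clip(d - G[:, j] ** 2, 0.0, None)   # guard against round-off
    return G, pivots

def low_rank_mkl_ridge(kernel_matrices, y, rank, lam=1.0):
    """Illustrative multiple-kernel Ridge regression on stacked low-rank
    features Z = [G_1, ..., G_p]; returns fitted values and primal weights."""
    Z = np.hstack([incomplete_cholesky(K, rank)[0] for K in kernel_matrices])
    w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
    return Z @ w, w

With, say, several RBF kernel matrices of different bandwidths passed as kernel_matrices, this stacks one rank-limited factor per kernel and solves a single Ridge system, which mirrors the combined low-dimensional feature space in which Mklaren performs least-angle regression.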


research
07/17/2017

Non-Linear Subspace Clustering with Learned Low-Rank Kernels

In this paper, we present a kernel subspace clustering method that can h...
research
05/25/2019

Fast and Accurate Gaussian Kernel Ridge Regression Using Matrix Decompositions for Preconditioning

This paper presents a method for building a preconditioner for a kernel ...
research
09/23/2019

Scalable Kernel Learning via the Discriminant Information

Kernel approximation methods have been popular techniques for scalable k...
research
10/26/2021

Tensor Network Kalman Filtering for Large-Scale LS-SVMs

Least squares support vector machines are a commonly used supervised lea...
research
02/22/2016

Preconditioning Kernel Matrices

The computational and storage complexity of kernel machines presents the...
research
06/13/2020

A New Algorithm for Tessellated Kernel Learning

The accuracy and complexity of machine learning algorithms based on kern...
research
04/15/2023

Efficient Convex Algorithms for Universal Kernel Learning

The accuracy and complexity of machine learning algorithms based on kern...
