Large-scale Kernel-based Feature Extraction via Budgeted Nonlinear Subspace Tracking

01/28/2016
by Fateme Sheikholeslami, et al.

Kernel-based methods enjoy powerful generalization capabilities in handling a variety of learning tasks. When such methods are provided with sufficient training data, broadly applicable classes of nonlinear functions can be approximated with desired accuracy. Nevertheless, inherent to the nonparametric nature of kernel-based estimators are computational and memory requirements that become prohibitive with large-scale datasets. In response to this formidable challenge, the present work puts forward a low-rank, kernel-based feature extraction approach that is particularly tailored for online operation, where data streams need not be stored in memory. A novel generative model is introduced to approximate high-dimensional (possibly infinite) features via a low-rank nonlinear subspace, the learning of which leads to a direct kernel function approximation. Offline and online solvers are developed for the subspace learning task, along with affordable versions in which the number of stored data vectors is confined to a predefined budget. Analytical results provide performance bounds on how well the kernel matrix, as well as kernel-based classification and regression tasks, can be approximated by leveraging budgeted online subspace learning and feature extraction schemes. Tests on synthetic and real datasets demonstrate and benchmark the efficiency of the proposed method when linear classification and regression are applied to the extracted features.
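
To make the budgeted online idea concrete, below is a minimal sketch of streaming kernel feature extraction under a fixed storage budget. It is not the paper's algorithm: the RBF kernel, the Nyström-style feature map, the first-in-first-out eviction rule, and all names (BudgetedKernelFeatures, rbf_kernel, partial_fit, transform) are illustrative assumptions; the paper's subspace updates and budget-maintenance rules differ in detail.

```python
# Minimal sketch of budgeted online kernel feature extraction (illustrative only).
# Assumptions, not from the paper: an RBF kernel, a budget of B stored vectors
# kept first-in-first-out, and Nystrom-style features z(x) = K_B^{-1/2} k_B(x),
# where K_B is the kernel matrix of the budget set and k_B(x) holds kernel
# evaluations of x against it.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Pairwise RBF kernel matrix between rows of X and rows of Y."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

class BudgetedKernelFeatures:
    def __init__(self, budget=50, gamma=1.0, reg=1e-8):
        self.budget, self.gamma, self.reg = budget, gamma, reg
        self.dictionary = []   # stored data vectors (at most `budget` of them)
        self.W = None          # maps kernel evaluations to low-rank features

    def _refit(self):
        """Recompute the feature map W = K_B^{-1/2} from the budget set."""
        D = np.vstack(self.dictionary)
        K = rbf_kernel(D, D, self.gamma) + self.reg * np.eye(len(D))
        vals, vecs = np.linalg.eigh(K)
        self.W = vecs / np.sqrt(vals)   # so that W.T @ K @ W = I

    def partial_fit(self, x):
        """Admit one streaming sample; evict the oldest vector if over budget."""
        self.dictionary.append(np.asarray(x, dtype=float))
        if len(self.dictionary) > self.budget:
            self.dictionary.pop(0)      # FIFO eviction -- a placeholder rule
        self._refit()
        return self

    def transform(self, X):
        """Map samples to low-rank nonlinear features for a linear model."""
        D = np.vstack(self.dictionary)
        return rbf_kernel(np.atleast_2d(X), D, self.gamma) @ self.W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = rng.normal(size=(200, 5))      # synthetic data stream
    fx = BudgetedKernelFeatures(budget=30, gamma=0.5)
    for x in stream:
        fx.partial_fit(x)
    Z = fx.transform(stream)                # features fed to a linear model
    print(Z.shape)                          # (200, 30)
```

For clarity the sketch refits the map from scratch at every sample, which costs on the order of the budget cubed per update; a genuine online subspace tracker would instead update its low-rank factors incrementally as each sample arrives.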


research 07/17/2017
Non-Linear Subspace Clustering with Learned Low-Rank Kernels
In this paper, we present a kernel subspace clustering method that can h...

research 04/19/2018
Large-scale Nonlinear Variable Selection via Kernel Random Features
We propose a new method for input variable selection in nonlinear regres...

research 02/20/2020
Nyström Subspace Learning for Large-scale SVMs
As an implementation of the Nyström method, Nyström computational regula...

research 12/28/2017
Random Feature-based Online Multi-kernel Learning in Environments with Unknown Dynamics
Kernel-based methods exhibit well-documented performance in various nonl...

research 04/06/2022
Consensual Aggregation on Random Projected High-dimensional Features for Regression
In this paper, we present a study of a kernel-based consensual aggregati...

research 12/24/2022
Reconstructing Kernel-based Machine Learning Force Fields with Super-linear Convergence
Kernel machines have sustained continuous progress in the field of quant...

research 10/31/2018
Low-Precision Random Fourier Features for Memory-Constrained Kernel Approximation
We investigate how to train kernel approximation methods that generalize...
