
Large-scale Nonlinear Variable Selection via Kernel Random Features

by Magda Gregorova et al.

We propose a new method for input variable selection in nonlinear regression. The method is embedded into a kernel regression machine that can model general nonlinear functions and is not a priori limited to additive models. This is the first kernel-based variable selection method applicable to large datasets. It sidesteps the typically poor scaling of kernel methods by mapping the inputs into a relatively low-dimensional space of random features. The algorithm discovers the variables relevant to the regression task while learning the prediction model, through learning the appropriate nonlinear random feature maps. We demonstrate the outstanding performance of our method on a set of large-scale synthetic and real datasets.
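The core idea can be illustrated with a small sketch: approximate an RBF kernel with random Fourier features, attach a per-variable scale to each input dimension, and tune those scales so that irrelevant variables are driven to zero. This is a simplified stand-in for the paper's method, under assumptions of my own: the toy data, the coordinate-wise grid search over scales (the paper learns the feature maps jointly), and all names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem (not the paper's data): y depends only on
# the first two of ten input variables.
n, d, D = 400, 10, 200                  # samples, input dims, random features
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)
Xtr, Xva, ytr, yva = X[:300], X[300:], y[:300], y[300:]

W = rng.normal(size=(d, D))             # spectral frequencies of an RBF kernel
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(X, gamma):
    """Random Fourier features of gamma-rescaled inputs.

    Setting gamma[j] = 0 removes variable j from the approximate kernel,
    so the scales double as variable-selection parameters.
    """
    return np.sqrt(2.0 / D) * np.cos((X * gamma) @ W + b)

def val_mse(gamma, lam=1e-2):
    """Ridge regression on the feature map; validation MSE as the score."""
    Ztr, Zva = rff(Xtr, gamma), rff(Xva, gamma)
    alpha = np.linalg.solve(Ztr.T @ Ztr + lam * np.eye(D), Ztr.T @ ytr)
    r = yva - Zva @ alpha
    return float(r @ r) / len(yva)

# Crude coordinate-wise grid search over the per-variable scales,
# standing in for the paper's joint learning of the feature map.
gamma = np.ones(d)
for _ in range(2):
    for j in range(d):
        trials = []
        for g in (0.0, 0.5, 1.0):       # current value is always in the grid
            gtry = gamma.copy()
            gtry[j] = g
            trials.append((val_mse(gtry), g))
        gamma[j] = min(trials)[1]       # keep the best-scoring scale

selected = np.flatnonzero(gamma > 0)
print("selected variables:", selected)
```

Because the current scale is always among the candidates, each coordinate update can only decrease the validation error, so the search is monotone; the surviving nonzero scales indicate the variables the model deems relevant.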



Structured nonlinear variable selection

We investigate structured sparsity methods for variable selection in reg...

RFFNet: Scalable and interpretable kernel methods via Random Fourier Features

Kernel methods provide a flexible and theoretically grounded approach to...

The SKIM-FA Kernel: High-Dimensional Variable Selection and Nonlinear Interaction Discovery in Linear Time

Many scientific problems require identifying a small set of covariates t...

Bayesian Approximate Kernel Regression with Variable Selection

Nonlinear kernel regression models are often used in statistics and mach...

Nonlinear variable selection with continuous outcome: a nonparametric incremental forward stagewise approach

We present a method of variable selection for the situation where some p...

Large-scale Kernel-based Feature Extraction via Budgeted Nonlinear Subspace Tracking

Kernel-based methods enjoy powerful generalization capabilities in handl...

Kernels and Ensembles: Perspectives on Statistical Learning

Since their emergence in the 1990's, the support vector machine and the ...