The Statistical Cost of Robust Kernel Hyperparameter Tuning

06/14/2020
by Raphael A. Meyer, et al.

This paper studies the statistical complexity of kernel hyperparameter tuning in the setting of active regression under adversarial noise. We consider the problem of finding the best interpolant from a class of kernels with unknown hyperparameters, assuming only that the noise is square-integrable. We provide finite-sample guarantees for the problem, characterizing how increasing the complexity of the kernel class increases the complexity of learning kernel hyperparameters. For common kernel classes (e.g. squared-exponential kernels with unknown lengthscale), our results show that hyperparameter optimization increases sample complexity by just a logarithmic factor, in comparison to the setting where optimal parameters are known in advance. Our result is based on a subsampling guarantee for linear regression under multiple design matrices, combined with an ϵ-net argument for discretizing kernel parameterizations.
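As a toy illustration of the hyperparameter-tuning setup (not the paper's actual algorithm or analysis), the sketch below fits squared-exponential kernel ridge interpolants over a finite grid of candidate lengthscales, a crude stand-in for the ϵ-net discretization mentioned above, and keeps the candidate with the lowest held-out error. The function names, synthetic data, and regularization constant are assumptions made purely for this example.

    # Illustrative sketch only: grid search over the lengthscale of a
    # squared-exponential kernel, mimicking an epsilon-net over the kernel class.
    import numpy as np

    def sq_exp_kernel(X, Z, lengthscale):
        # Squared-exponential kernel: K[i, j] = exp(-||x_i - z_j||^2 / (2 * l^2)).
        d2 = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Z**2, axis=1)[None, :]
              - 2.0 * X @ Z.T)
        return np.exp(-np.maximum(d2, 0.0) / (2.0 * lengthscale**2))

    def fit_predict(X_tr, y_tr, X_te, lengthscale, reg=1e-6):
        # Kernel ridge regression with a small regularizer for numerical stability.
        K = sq_exp_kernel(X_tr, X_tr, lengthscale)
        alpha = np.linalg.solve(K + reg * np.eye(len(X_tr)), y_tr)
        return sq_exp_kernel(X_te, X_tr, lengthscale) @ alpha

    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)  # noisy target

    # Held-out error stands in for the unknown risk when comparing hyperparameters.
    X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

    # Finite grid of candidate lengthscales (the discretized kernel class).
    candidates = np.geomspace(0.05, 5.0, num=20)
    errors = [np.mean((fit_predict(X_tr, y_tr, X_va, l) - y_va) ** 2)
              for l in candidates]
    best = candidates[int(np.argmin(errors))]
    print(f"best lengthscale on the grid: {best:.3f}")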


Related research
- 09/04/2022: On Kernel Regression with Data-Dependent Kernels
- 01/17/2022: Efficient Hyperparameter Tuning for Large Scale Kernel Ridge Regression
- 11/06/2020: Efficient Hyperparameter Tuning with Dynamic Accuracy Derivative-Free Optimization
- 07/11/2012: On-line Prediction with Kernels and the Complexity Approximation Principle
- 01/25/2019: On the Statistical Efficiency of Optimal Kernel Sum Classifiers
- 09/01/2018: Hyperparameter Learning for Conditional Mean Embeddings with Rademacher Complexity Bounds
- 07/16/2022: Hyperparameter Tuning in Echo State Networks
