A short note on extension theorems and their connection to universal consistency in machine learning

04/15/2016
by Andreas Christmann et al.

Statistical machine learning plays an important role in modern statistics and computer science. One main goal of statistical machine learning is to provide universally consistent algorithms, i.e., algorithms whose estimators converge in probability, or in some stronger sense, to the Bayes risk or to the Bayes decision function. Kernel methods, which minimize the regularized risk over a reproducing kernel Hilbert space (RKHS), belong to this class of statistical machine learning methods. It is in general unknown which kernel yields optimal results for a particular data set or for the unknown probability measure. Hence various kernel learning methods have been proposed to choose the kernel, and therefore also its RKHS, in a data-adaptive manner. Nevertheless, many practitioners use the classical Gaussian RBF kernel or certain Sobolev kernels with good success. The goal of this short note is to offer one possible theoretical explanation for this empirical fact.
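To make the setting concrete, here is a minimal NumPy sketch of regularized risk minimization over the RKHS of the Gaussian RBF kernel. Under squared loss this reduces to kernel ridge regression via the representer theorem; the toy data, `gamma`, and `lam` values are illustrative assumptions, not taken from the note.

```python
import numpy as np

def gaussian_rbf_kernel(X, Z, gamma=1.0):
    # k(x, z) = exp(-gamma * ||x - z||^2), the classical Gaussian RBF kernel
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def fit_regularized_kernel_estimator(X, y, lam=1e-2, gamma=1.0):
    # Minimize (1/n) * sum_i L(y_i, f(x_i)) + lam * ||f||_H^2 over the RKHS H.
    # For squared loss L, the representer theorem gives f = sum_i alpha_i k(x_i, .)
    # with alpha solving the linear system (K + n * lam * I) alpha = y.
    n = X.shape[0]
    K = gaussian_rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda X_new: gaussian_rbf_kernel(X_new, X, gamma) @ alpha

# Toy regression problem: noisy samples of sin(x) (illustrative assumption).
rng = np.random.default_rng(0)
X_train = rng.uniform(-3.0, 3.0, size=(80, 1))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=80)

f_hat = fit_regularized_kernel_estimator(X_train, y_train, lam=1e-3, gamma=2.0)

X_test = np.linspace(-3.0, 3.0, 50)[:, None]
mse = np.mean((f_hat(X_test) - np.sin(X_test[:, 0])) ** 2)
```

As the sample size grows and `lam` decays at a suitable rate, estimators of this form are exactly the ones whose risk convergence to the Bayes risk is studied under the heading of universal consistency.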


