On-line Prediction with Kernels and the Complexity Approximation Principle

07/11/2012
by   Alex Gammerman, et al.

The paper describes an application of the Aggregating Algorithm to the problem of regression. It generalizes earlier results on plain linear regression to kernel methods and presents an on-line algorithm that performs nearly as well as any oblivious kernel predictor. The paper derives an estimate of this algorithm's performance, which is then used to apply the Complexity Approximation Principle to kernel methods.
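The kernelized Aggregating Algorithm for Regression can be sketched in a few lines: to predict the label of a new example, the new point is appended to the history with a provisional label of 0, and the prediction is read off from a ridge-regularized kernel system. The sketch below is a minimal illustration under these assumptions; the function names, the RBF kernel choice, and the parameter `a` (the ridge constant) are illustrative, not taken from the paper.

```python
import numpy as np

def rbf(u, v, sigma=1.0):
    # Gaussian (RBF) kernel; any positive-definite kernel could be used instead
    return np.exp(-np.sum((u - v) ** 2) / (2 * sigma ** 2))

def kernel_aar_predict(X_past, y_past, x_new, kernel=rbf, a=1.0):
    """One on-line prediction step of a kernel Aggregating Algorithm
    Regression sketch: the new example is included in the kernel matrix
    with a provisional label of 0, and the prediction solves a
    ridge-regularized linear system in the kernel matrix."""
    X = np.vstack([X_past, x_new]) if len(X_past) else np.array([x_new])
    n = len(X)
    # Kernel (Gram) matrix over all examples seen so far, plus the new one
    K = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
    y = np.append(y_past, 0.0)          # provisional label 0 for the new example
    k = K[:, -1]                        # kernel values against the new example
    return y @ np.linalg.solve(a * np.eye(n) + K, k)
```

With repeated identical examples labeled 1, the prediction shrinks toward 1 as evidence accumulates but never reaches it, reflecting the regularization that underlies the algorithm's performance guarantee.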


