Lepskii Principle in Supervised Learning

05/26/2019
by   Gilles Blanchard, et al.

In the setting of supervised learning with reproducing kernel methods, we propose a data-dependent rule for selecting the regularization parameter that adapts to the unknown regularity of the target function and is optimal both for the least-squares (prediction) error and for the reproducing kernel Hilbert space (reconstruction) norm error. The rule is based on a modified Lepskii balancing principle using a varying family of norms.
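To illustrate the general idea, here is a minimal sketch of a Lepskii-style balancing principle for kernel ridge regression. It is not the paper's exact rule: the noise-level proxy `sigma`, the constant `C = 4`, the Gaussian kernel, and the geometric grid of regularization parameters are all illustrative assumptions, and the comparison is done in the empirical L2 norm only.

```python
import numpy as np

def krr_fit(K, y, lam):
    """Kernel ridge regression coefficients: alpha = (K + n*lam*I)^{-1} y."""
    n = K.shape[0]
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def lepskii_select(K, y, lams, sigma, C=4.0):
    """Balancing principle over a decreasing grid lams[0] > lams[1] > ...

    Returns the largest index k such that, for every j < k, the fitted
    functions satisfy ||f_k - f_j||_n <= C * sigma(lams[j]), where
    ||.||_n is the empirical L2 norm on the training points.
    """
    preds = [K @ krr_fit(K, y, lam) for lam in lams]  # in-sample fits
    k_star = 0
    for k in range(1, len(lams)):
        balanced = all(
            np.sqrt(np.mean((preds[k] - preds[j]) ** 2)) <= C * sigma(lams[j])
            for j in range(k)
        )
        if balanced:
            k_star = k
        else:
            break  # stop at the first violation of the balancing condition
    return k_star

# Synthetic example (all choices below are illustrative).
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1.0, 1.0, n)
y = np.sin(3 * X) + 0.1 * rng.standard_normal(n)
K = np.exp(-(X[:, None] - X[None, :]) ** 2 / 0.5)  # Gaussian kernel

lams = [2.0 ** (-i) for i in range(1, 15)]  # decreasing geometric grid
# Assumed variance proxy: grows as lam shrinks (one common parametric form).
sigma = lambda lam: 0.1 / (np.sqrt(n) * np.sqrt(lam))

k = lepskii_select(K, y, lams, sigma)
lam_hat = lams[k]
```

The key point of the balancing principle is that it needs only a computable upper bound on the stochastic error at each grid point, not the unknown regularity of the target function: the selected `lam_hat` is the smallest regularization level at which all pairwise estimator distances are still explained by noise.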


Related research

04/15/2018 · Adaptivity for Regularized Kernel Methods by Lepskii's Principle
We address the problem of adaptivity in the framework of reproducing ke...

01/24/2011 · Reproducing Kernel Banach Spaces with the l1 Norm II: Error Analysis for Regularized Least Square Regression
A typical approach in estimating the learning rate of a regularized lear...

10/21/2022 · Learning in RKHM: a C^*-Algebraic Twist for Kernel Machines
Supervised learning in reproducing kernel Hilbert space (RKHS) and vecto...

08/28/2022 · Statistical Inverse Problems in Hilbert Scales
In this paper, we study the Tikhonov regularization scheme in Hilbert sc...

08/28/2015 · Regularized Kernel Recursive Least Square Algorithm
In most adaptive signal processing applications, system linearity is ass...

03/28/2020 · Reproducing Kernel Hilbert Spaces Approximation Bounds
We find probability error bounds for approximations of functions f in a ...

01/04/2019 · On Reproducing Kernel Banach Spaces: Generic Definitions and Unified Framework of Constructions
Recently, there has been emerging interest in constructing reproducing k...
