Convergence rates of Kernel Conjugate Gradient for random design regression

07/08/2016
by Gilles Blanchard, et al.

We prove statistical rates of convergence for kernel-based least squares regression from i.i.d. data using a conjugate gradient algorithm, where regularization against overfitting is obtained by early stopping. This method is related to Kernel Partial Least Squares, a regression method that combines supervised dimensionality reduction with least squares projection. Following the setting introduced in earlier related literature, we study so-called "fast convergence rates" depending on the regularity of the target regression function (measured by a source condition in terms of the kernel integral operator) and on the effective dimensionality of the data mapped into the kernel space. We obtain upper bounds, essentially matching known minimax lower bounds, for the L^2 (prediction) norm as well as for the stronger Hilbert norm, if the true regression function belongs to the reproducing kernel Hilbert space. If the latter assumption is not fulfilled, we obtain similar convergence rates for appropriate norms, provided additional unlabeled data are available.
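The abstract describes the method at a high level: run conjugate gradient on the kernel least squares problem and regularize by stopping the iteration early, so that the number of iterations plays the role the penalty parameter plays in ridge regression. As a hedged illustration (not the paper's algorithm, whose stopping rule is data-driven and whose CG variant is tied to Kernel Partial Least Squares), here is a minimal sketch: plain CG on the kernel system K alpha = y with a Gaussian kernel, where the kernel width sigma and the stopping iteration t_max are hypothetical choices for a toy example.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_cg(K, y, t_max):
    # Run t_max conjugate gradient iterations on K alpha = y,
    # starting from alpha = 0. Early stopping (small t_max) is
    # the regularization: fewer iterations, smoother fit.
    alpha = np.zeros(len(y))
    r = y.copy()   # residual y - K alpha
    p = r.copy()   # search direction
    for _ in range(t_max):
        Kp = K @ p
        gamma = (r @ r) / (p @ Kp)
        alpha += gamma * p
        r_new = r - gamma * Kp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return alpha

# Toy usage: fit noisy sine data, then predict on a grid.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
K = gaussian_kernel(X, X)
alpha = kernel_cg(K, y, t_max=8)  # t_max is the regularization knob
X_test = np.linspace(-3, 3, 100)[:, None]
y_hat = gaussian_kernel(X_test, X) @ alpha
```

More iterations fit the training data more closely, so choosing t_max trades bias against variance; this is the early stopping regularization the abstract refers to, with larger t_max corresponding to weaker regularization.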


Related research

09/29/2010 · Optimal learning rates for Kernel Conjugate Gradient regression
We prove rates of convergence in the statistical sense for kernel-based ...

11/12/2016 · Kernel regression, minimax rates and effective dimensionality: beyond the regular case
We investigate if kernel regularization methods can achieve minimax conv...

11/20/2022 · Statistical Optimality of Divide and Conquer Kernel-based Functional Linear Regression
Previous analysis of regularized functional linear regression in a repro...

11/05/2018 · Kernel Conjugate Gradient Methods with Random Projections
We propose and study kernel conjugate gradient methods (KCGM) with rando...

08/11/2016 · Distributed learning with regularized least squares
We study distributed learning with the least squares regularization sche...

02/01/2021 · Fast rates in structured prediction
Discrete supervised learning problems such as classification are often t...

04/15/2018 · Adaptivity for Regularized Kernel Methods by Lepskii's Principle
We address the problem of adaptivity in the framework of reproducing ke...
