Online nonparametric regression with Sobolev kernels

02/06/2021 · by Oleksandr Zadorozhnyi, et al.
In this work we investigate a variant of the online kernelized ridge regression algorithm in the setting of d-dimensional adversarial nonparametric regression. We derive regret upper bounds over the classes of Sobolev spaces W_p^Ī²(š’³), pā‰„ 2, Ī²>d/p. The upper bounds are complemented by a minimax regret analysis, which reveals that in the cases Ī²> d/2 or p=āˆž these rates are (essentially) optimal. Finally, we compare the kernelized ridge regression forecaster to known nonparametric forecasters, both in terms of regret rates and computational complexity, and to the excess risk rates attainable in the setting of statistical (i.i.d.) nonparametric regression.
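The forecaster studied in the abstract can be illustrated with a short sketch of online kernel ridge regression: at each round t, the learner fits ridge regression in the RKHS of a kernel on the previously observed pairs and predicts the next label. This is a minimal illustrative sketch, not the paper's exact algorithm; the Laplacian (Matérn-1/2) kernel is chosen here only because Matérn-type kernels have Sobolev-type RKHSs, and all function names and parameters (`laplacian_kernel`, `online_kernel_ridge`, `lam`, `scale`) are this sketch's own.

```python
import numpy as np

def laplacian_kernel(x, y, scale=1.0):
    # Matern-1/2 (Laplacian) kernel; Matern kernels induce Sobolev-type RKHSs.
    return np.exp(-np.linalg.norm(x - y) / scale)

def online_kernel_ridge(X, Y, lam=1.0, kernel=laplacian_kernel):
    """Sequential prediction: forecast y_t from (x_1,y_1),...,(x_{t-1},y_{t-1}).

    At round t the forecaster outputs f_t(x_t), where f_t is the kernel ridge
    regression solution on the first t-1 observations with regularization lam.
    """
    T = len(X)
    preds = np.zeros(T)
    for t in range(T):
        if t == 0:
            preds[t] = 0.0  # no past data: predict zero
            continue
        # Gram matrix of the past t points and kernel vector at the new point
        K = np.array([[kernel(X[i], X[j]) for j in range(t)] for i in range(t)])
        k_t = np.array([kernel(X[t], X[i]) for i in range(t)])
        # Ridge solution alpha = (K + lam I)^{-1} y_{1:t-1}
        alpha = np.linalg.solve(K + lam * np.eye(t), Y[:t])
        preds[t] = k_t @ alpha
    return preds
```

Rebuilding the Gram matrix each round costs O(t^3) per step; practical variants update the inverse incrementally or use sketching, which is part of the computational-complexity comparison the abstract alludes to.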

Related research

02/11/2014 · Online Nonparametric Regression
We establish optimal rates for online regression for arbitrary classes o…

05/25/2018 · How Many Machines Can We Use in Parallel Computing for Kernel Ridge Regression?
This paper attempts to solve a basic problem in distributed statistical …

02/17/2018 · Nonparametric Testing under Random Projection
A common challenge in nonparametric inference is its high computational …

02/26/2015 · A Chaining Algorithm for Online Nonparametric Regression
We consider the problem of online nonparametric regression with arbitrar…

11/14/2021 · Minimax Optimal Regression over Sobolev Spaces via Laplacian Eigenmaps on Neighborhood Graphs
In this paper we study the statistical properties of Principal Component…

12/03/2020 · Online Forgetting Process for Linear Regression Models
Motivated by the EU's "Right To Be Forgotten" regulation, we initiate a …

10/20/2022 · Local SGD in Overparameterized Linear Regression
We consider distributed learning using constant stepsize SGD (DSGD) over…
