Adaptivity for Regularized Kernel Methods by Lepskii's Principle

04/15/2018
by Nicole Mücke, et al.

We address the problem of adaptivity in the framework of reproducing kernel Hilbert space (RKHS) regression. More precisely, we analyze estimators arising from a linear regularization scheme g_λ. In practical applications, an important task is to choose the regularization parameter appropriately, i.e. based only on the given data and independently of unknown structural assumptions on the regression function. An attractive approach avoiding data-splitting is the Lepskii Principle (LP), also known as the Balancing Principle in this setting. We show that a modified parameter choice based on (LP) is minimax optimal adaptive, up to a log log(n) factor. A convenient result is the fact that balancing in the L^2(ν)-norm, which is easiest, automatically gives optimal balancing in all stronger norms interpolating between L^2(ν) and the RKHS. An analogous result is open for other classical approaches to data-dependent choices of the regularization parameter, e.g. for Hold-Out.
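Since the abstract only names the procedure, here is a minimal sketch of how a balancing-principle parameter choice typically looks for kernel ridge regression. Everything concrete below is an illustrative assumption rather than the paper's modified rule: the Gaussian kernel, the stochastic-error proxy σ(λ) ≈ 1/√(nλ), the constant C = 4, and the use of the empirical L^2-norm at the training points as a stand-in for the L^2(ν)-norm.

```python
import numpy as np

def gaussian_kernel(X, Y, width=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between the rows of X and Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * width ** 2))

def krr_fit(K, y, lam):
    # Kernel ridge regression (Tikhonov, the simplest scheme g_lam),
    # evaluated at the training points: f_lam = K (K + n*lam*I)^{-1} y.
    n = K.shape[0]
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return K @ alpha

def balancing_choice(K, y, lambdas, C=4.0):
    """Choose lambda by the balancing (Lepskii) principle.

    The grid is scanned in increasing order of lambda (decreasing
    variance); a candidate lambda_i is accepted if its estimator stays
    within C * sigma_j of every rougher estimator f_j with
    lambda_j < lambda_i, and the largest accepted lambda is returned.
    sigma(lam) = 1/sqrt(n*lam) is an assumed stand-in for the true
    stochastic-error bound.
    """
    n = len(y)
    lambdas = np.sort(np.asarray(lambdas))
    fits = [krr_fit(K, y, lam) for lam in lambdas]
    sigma = 1.0 / np.sqrt(n * lambdas)
    best = 0
    for i in range(len(lambdas)):
        # empirical L^2-norm of the difference of estimators
        if all(np.sqrt(np.mean((fits[i] - fits[j]) ** 2)) <= C * sigma[j]
               for j in range(i)):
            best = i
    return lambdas[best], fits[best]

# Usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
lam_hat, f_hat = balancing_choice(gaussian_kernel(X, X), y,
                                  np.geomspace(1e-6, 1.0, 25))
print(f"balanced lambda: {lam_hat:.2e}")
```

Returning the largest λ consistent with all rougher fits reflects the intuition behind balancing: it is the most smoothing the data allow, i.e. the point where the (unknown) bias and the (estimable) stochastic error are of the same order, all without splitting the data.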
