The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning

03/03/2011
by Marius Kloft, et al.

We derive an upper bound on the local Rademacher complexity of ℓ_p-norm multiple kernel learning, which yields a tighter excess risk bound than global approaches. Previous local approaches analyzed only the case p = 1, while our analysis covers all cases 1 ≤ p ≤ ∞, assuming that the feature mappings corresponding to the different kernels are uncorrelated. We also prove a lower bound showing that the upper bound is tight, and derive consequences regarding excess loss, namely fast convergence rates of the order O(n^{-α/(1+α)}), where α is the minimum eigenvalue decay rate of the individual kernels.
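To make the rate concrete, here is a worked restatement of the bound in LaTeX. The polynomial eigenvalue decay assumption λ_j^{(m)} = O(j^{-α_m}) and the symbol α_m are one standard reading of "eigenvalue decay rate" and should be checked against the paper's exact conditions:

    % Sketch of the rate statement under polynomial spectral decay;
    % constants and precise conditions are as in the paper.
    \[
      \lambda_j^{(m)} = O\!\left(j^{-\alpha_m}\right)
      \ \text{for each kernel } k_m, \qquad
      \alpha := \min_{1 \le m \le M} \alpha_m
      \quad\Longrightarrow\quad
      \text{excess loss} = O\!\left(n^{-\frac{\alpha}{1+\alpha}}\right).
    \]

As a sanity check on the displayed rate, α = 1 recovers the familiar n^{-1/2} rate, while α → ∞ approaches the fast rate O(n^{-1}).

For reference, the learner under analysis combines M kernels with weights constrained to the ℓ_p ball. The following is a minimal NumPy sketch of ℓ_p-norm MKL via alternating kernel ridge regression and a closed-form weight update; the function name, the ridge regularizer, and the update rule are illustrative assumptions drawn from the standard ℓ_p-norm MKL literature, not code from this paper.

    import numpy as np

    def lp_mkl_fit(kernels, y, p=2.0, reg=1e-2, n_iter=20):
        """Hypothetical l_p-norm MKL sketch.
        kernels: list of (n, n) Gram matrices; y: (n,) targets."""
        M, n = len(kernels), len(y)
        theta = np.full(M, M ** (-1.0 / p))   # uniform weights, ||theta||_p = 1
        for _ in range(n_iter):
            K = sum(t * Km for t, Km in zip(theta, kernels))
            # Inner step: kernel ridge regression with the combined kernel.
            alpha = np.linalg.solve(K + reg * np.eye(n), y)
            # Squared RKHS norm of each kernel's component of the solution.
            norms = np.array([t ** 2 * alpha @ Km @ alpha
                              for t, Km in zip(theta, kernels)])
            # Closed-form weight update projected onto the l_p unit ball
            # (standard step in the l_p-norm MKL literature).
            theta = norms ** (1.0 / (p + 1))
            theta /= np.linalg.norm(theta, ord=p)
        return theta, alpha

    # Toy usage: a linear and a Gaussian kernel on random data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 5))
    K1 = X @ X.T
    K2 = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1))
    y = rng.normal(size=50)
    theta, alpha = lp_mkl_fit([K1, K2], y, p=1.5)

Smaller p pushes the learned weights toward sparsity over kernels, while p = 2 and above keeps all kernels active; the paper's bounds cover this whole range of p.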


