Fast Learning Rate of Non-Sparse Multiple Kernel Learning and Optimal Regularization Strategies

11/16/2011
by   Taiji Suzuki, et al.

In this paper, we give a new generalization error bound of Multiple Kernel Learning (MKL) for a general class of regularizations, and discuss what kind of regularization gives favorable predictive accuracy. Our main target in this paper is dense-type regularizations, including ℓp-MKL. According to recent numerical experiments, sparse regularization does not necessarily outperform dense-type regularizations. Motivated by this fact, this paper gives a general theoretical tool to derive fast learning rates of MKL that is applicable to arbitrary mixed-norm-type regularizations in a unifying manner. This enables us to compare the generalization performances of various types of regularizations. As a consequence, we observe that the homogeneity of the complexities of the candidate reproducing kernel Hilbert spaces (RKHSs) affects which regularization strategy (ℓ1 or dense) is preferred. In fact, in homogeneous complexity settings, where the complexities of all RKHSs are the same, ℓ1-regularization is optimal among all isotropic norms. On the other hand, in inhomogeneous complexity settings, dense-type regularizations can achieve a faster learning rate than sparse ℓ1-regularization. We also show that our learning rate achieves the minimax lower bound in homogeneous complexity settings.
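To make the ℓp-regularization idea concrete, the following is a minimal sketch of ℓp-MKL for kernel ridge regression, not the authors' algorithm: it alternates between fitting a ridge regressor with the weighted combined kernel and a closed-form ℓp update of the kernel weights (the form used in standard ℓp-MKL solvers). The Gaussian kernels, bandwidths, and function names here are illustrative assumptions.

```python
import numpy as np

def gauss_kernel(X, Z, sigma):
    # Gaussian (RBF) kernel matrix between the rows of X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lp_mkl_ridge(X, y, sigmas, p=2.0, lam=1e-2, iters=20):
    """Illustrative l_p-MKL: alternate between kernel ridge regression
    with the weighted sum of kernels and a closed-form l_p weight update."""
    n, M = len(X), len(sigmas)
    Ks = [gauss_kernel(X, X, s) for s in sigmas]   # candidate kernels
    d = np.full(M, 1.0 / M)                        # start from uniform weights
    for _ in range(iters):
        K = sum(dm * Km for dm, Km in zip(d, Ks))  # combined kernel
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        # squared RKHS norm of each component: ||f_m||^2 = d_m^2 * a' K_m a
        norms = np.array([dm ** 2 * alpha @ Km @ alpha
                          for dm, Km in zip(d, Ks)])
        # closed-form update d_m ∝ ||f_m||^{2/(p+1)}, projected onto the l_p ball
        d = norms ** (1.0 / (p + 1.0))
        d /= (d ** p).sum() ** (1.0 / p)
    return d, alpha
```

Taking p close to 1 drives the learned weights toward a sparse (ℓ1-like) combination, while larger p keeps all kernels active, mirroring the sparse-versus-dense trade-off discussed in the abstract.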


