Fast Convergence Rate of Multiple Kernel Learning with Elastic-net Regularization

03/02/2011
by   Taiji Suzuki, et al.

We investigate the learning rate of multiple kernel learning (MKL) with elastic-net regularization, which combines an ℓ_1-regularizer that induces sparsity with an ℓ_2-regularizer that controls smoothness. We focus on a sparse setting in which the total number of kernels is large but the number of non-zero components of the ground truth is relatively small, and prove that elastic-net MKL achieves the minimax learning rate on the ℓ_2-mixed-norm ball. Our bound is sharper than previously known convergence rates and has the property that the smoother the truth, the faster the convergence.
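The elastic-net regularizer described above can be sketched concretely. The snippet below is a minimal illustration (not the paper's implementation): it evaluates an assumed penalty of the form λ_1 Σ_m ‖f_m‖ + λ_2 Σ_m ‖f_m‖², applied to the per-kernel RKHS norms of a candidate decomposition; the function name, arguments, and example values are all hypothetical.

```python
import numpy as np

def elastic_net_mkl_penalty(block_norms, lam1, lam2):
    """Hypothetical elastic-net penalty on per-kernel RKHS norms:
    lam1 * sum_m ||f_m||   (l1 part: induces sparsity over kernels)
    + lam2 * sum_m ||f_m||^2  (l2 part: controls smoothness)."""
    block_norms = np.asarray(block_norms, dtype=float)
    l1_part = lam1 * np.sum(np.abs(block_norms))
    l2_part = lam2 * np.sum(block_norms ** 2)
    return l1_part + l2_part

# Illustrative values: three candidate kernels, one inactive component.
norms = [0.8, 0.0, 0.3]
print(elastic_net_mkl_penalty(norms, lam1=0.1, lam2=0.05))  # 0.1465
```

The ℓ_1 term drives whole kernel blocks to zero (the sparse setting in the abstract), while the ℓ_2 term keeps the surviving components smooth; the trade-off between the two is what the paper's rate analysis quantifies.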


Related research:

03/02/2012
Fast learning rate of multiple kernel learning: Trade-off between sparsity and smoothness
We investigate the learning rate of multiple kernel learning (MKL) with ...

03/27/2011
Sharp Convergence Rate and Support Consistency of Multiple Kernel Learning with Sparse and Dense Regularization
We theoretically investigate the convergence rate and support consistenc...

03/27/2011
Fast Learning Rate of lp-MKL and its Minimax Optimality
In this paper, we give a new sharp generalization bound of lp-MKL which ...

01/15/2010
Sparsity-accuracy trade-off in MKL
We empirically investigate the best trade-off between sparse and uniform...

11/16/2011
Fast Learning Rate of Non-Sparse Multiple Kernel Learning and Optimal Regularization Strategies
In this paper, we give a new generalization error bound of Multiple Kern...

11/06/2018
Elastic CoCoA: Scaling In to Improve Convergence
In this paper we experimentally analyze the convergence behavior of CoCo...

02/01/2013
Sparse Multiple Kernel Learning with Geometric Convergence Rate
In this paper, we study the problem of sparse multiple kernel learning (...
