A Geometric Algorithm for Scalable Multiple Kernel Learning

06/25/2012
by John Moeller, et al.

We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with novel structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger datasets than most prior methods can handle. Empirical evaluation on eleven datasets shows that our method is significantly faster than prior approaches and even compares favorably with a uniform unweighted combination of kernels.
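As a sketch of the formulation described above (the notation here is introduced for illustration and is not necessarily the paper's own): given base kernels $K_1, \dots, K_m$ combined as $K_\mu = \sum_{t} \mu_t K_t$ with induced feature map $\phi_\mu$, and the two classes indexed by $P$ and $N$, learning the kernel weights amounts to maximizing over $\mu$ the minimum distance between the convex hulls of the two mapped classes:

\[
\max_{\mu \in \Delta_m} \;\; \min_{\alpha \in \Delta_{|P|},\, \beta \in \Delta_{|N|}} \;
\Bigl\| \sum_{i \in P} \alpha_i\, \phi_\mu(x_i) \;-\; \sum_{j \in N} \beta_j\, \phi_\mu(x_j) \Bigr\|_{\mathcal{H}_\mu},
\]

where $\Delta_k$ denotes the probability simplex (one natural choice of constraint set), so each inner sum ranges over a convex polytope: the convex hull of one class in the feature space induced by $K_\mu$. The objective is computable entirely through kernel evaluations, since the squared norm expands into sums of $K_\mu(x, x')$ terms.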
