On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes

02/13/2019
by   Pravesh K. Kothari, et al.

We study the expressive power of kernel methods and the algorithmic feasibility of multiple kernel learning for a special rich class of kernels. Specifically, we define Euclidean kernels, a diverse class that includes most, if not all, families of kernels studied in the literature, such as polynomial kernels and radial basis functions. We then describe the geometric and spectral structure of this family of kernels over the hypercube (and, to some extent, over any compact domain). Our structural results allow us to prove meaningful limitations on the expressive power of the class, as well as to derive several efficient algorithms for learning kernels over different domains.
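To make the setting concrete, here is a minimal sketch (not the paper's construction) of two standard members of the kernel families the abstract names, a polynomial kernel and a Gaussian radial basis function, evaluated on points of the Boolean hypercube {-1, +1}^n. The degree, offset, and bandwidth parameters below are illustrative choices, not values from the paper.

```python
import numpy as np

def polynomial_kernel(x, y, degree=3, c=1.0):
    """Polynomial kernel: k(x, y) = (<x, y> + c)^degree."""
    return (np.dot(x, y) + c) ** degree

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.linalg.norm(x - y) ** 2)

# Sample points on the hypercube {-1, +1}^4.
rng = np.random.default_rng(0)
X = rng.choice([-1.0, 1.0], size=(5, 4))

# Gram matrices; positive semidefiniteness is the defining property
# of a valid kernel, and both families satisfy it.
K_poly = np.array([[polynomial_kernel(x, y) for y in X] for x in X])
K_rbf = np.array([[rbf_kernel(x, y) for y in X] for x in X])

assert np.all(np.linalg.eigvalsh(K_poly) >= -1e-9)
assert np.all(np.linalg.eigvalsh(K_rbf) >= -1e-9)
```

Both kernels depend on their arguments only through the inner product and the Euclidean distance, which is the kind of rotation-invariant structure that makes spectral analysis over symmetric domains like the hypercube tractable.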

