On the Statistical Efficiency of Optimal Kernel Sum Classifiers

01/25/2019
by   Raphael Arkady Meyer, et al.

We propose a novel combination of optimization tools and learning-theory bounds to analyze the sample complexity of optimal kernel sum classifiers. This contrasts with typical learning-theoretic results, which hold for all (potentially suboptimal) classifiers. Our work also justifies assumptions made in prior work on multiple kernel learning. As a byproduct of our analysis, we provide a new form of Rademacher complexity for hypothesis classes containing only optimal classifiers.
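To make the object of study concrete, below is a minimal sketch of a kernel sum classifier: the effective kernel is a weighted sum of base kernels, and the classifier is the SVM trained to optimality on that summed kernel. This is an illustration, not the paper's construction; the choice of base kernels, the mixture weights mu, and the toy dataset are all assumptions made for this example.

    # Minimal sketch of a kernel sum classifier (illustrative only; the
    # kernels, weights, and data are assumptions, not the paper's setup).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=5, random_state=0)

    # Effective kernel: a convex combination of base kernels. In multiple
    # kernel learning the weights mu would be optimized jointly with the
    # classifier; here they are fixed by hand.
    mu = np.array([0.7, 0.3])
    K = mu[0] * rbf_kernel(X, gamma=0.5) + mu[1] * linear_kernel(X)

    # The "optimal" classifier for this kernel: an SVM fit on the
    # precomputed Gram matrix.
    clf = SVC(kernel="precomputed").fit(K, y)
    print("training accuracy:", clf.score(K, y))

The paper's question is then how many samples such an optimality-constrained hypothesis class requires, rather than bounding the risk of every (possibly suboptimal) classifier in the class.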

Related research

Optimal Linear Combination of Classifiers (03/01/2021)
The question of whether to use one classifier or a combination of classi...

On the Sample Complexity of Learning from a Sequence of Experiments (02/12/2018)
We analyze the sample complexity of a new problem: learning from a seque...

Regularization for Multiple Kernel Learning via Sum-Product Networks (02/13/2014)
In this paper, we are interested in constructing general graph-based reg...

The Statistical Cost of Robust Kernel Hyperparameter Tuning (06/14/2020)
This paper studies the statistical complexity of kernel hyperparameter t...

High Dimensional Structured Superposition Models (05/30/2017)
High dimensional superposition models characterize observations using pa...

Learning with Rules (03/08/2018)
Complex classifiers may exhibit "embarrassing" failures in cases that wou...

Learning Mixtures of Linear Classifiers (11/11/2013)
We consider a discriminative learning (regression) problem, whereby the ...
