Rademacher Generalization Bounds for Classifier Chains

07/26/2018
by Moura Simon, et al.

In this paper, we propose a new framework for studying the generalization properties of classifier chains trained on observations associated with multiple, interdependent class labels. The results are based on large deviation inequalities for Lipschitz functions of weakly dependent sequences proposed by Rio in 2000. We believe that the resulting generalization error bound offers several advantages and could be adapted to other frameworks that consider interdependent outputs. First, it explicitly exhibits the dependencies between class labels. Second, it provides insight into the effect of the order of the chain on the algorithm's generalization performance. Finally, the two dependency coefficients that appear in the bound could also be used to design new strategies for choosing the order of the chain.
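To make the setting concrete, the short sketch below illustrates what a classifier chain is and why its order matters: one binary classifier is trained per label, and each classifier also receives the labels predicted earlier in the chain as extra features, so the chosen ordering determines which label dependencies the model can exploit. This is only an illustrative example using scikit-learn's ClassifierChain on synthetic data; it is not the construction analyzed in the paper, and the dataset, base learner, and orderings below are assumptions made purely for demonstration.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multioutput import ClassifierChain
from sklearn.metrics import jaccard_score

# Synthetic multi-label data with interdependent labels (illustrative only).
X, Y = make_multilabel_classification(n_samples=1000, n_features=20,
                                      n_classes=5, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# Fit the same base learner under two different chain orders.
# order=None uses the natural label order 0..4; the second reverses it.
for order in (None, list(reversed(range(Y.shape[1])))):
    chain = ClassifierChain(LogisticRegression(max_iter=1000),
                            order=order, random_state=0)
    chain.fit(X_tr, Y_tr)
    Y_pred = chain.predict(X_te)
    print("order:", order,
          "sample-wise Jaccard:", jaccard_score(Y_te, Y_pred, average="samples"))
```

Comparing the scores of the two orderings gives an empirical counterpart to the paper's point that the order of the chain affects generalization, and that a measure of label dependence could guide how that order is chosen.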
