Controlling Confusion via Generalisation Bounds

02/11/2022
by Reuben Adams, et al.

We establish new generalisation bounds for multiclass classification by abstracting to the more general setting of discretised error types. By extending PAC-Bayes theory, we provide fine-grained bounds on performance for multiclass classification, with applications to other learning problems including the discretisation of regression losses. Tractable training objectives are derived from the bounds. The bounds are uniform over all weightings of the discretised error types, so they can bound weightings not foreseen at training time, including the full confusion matrix in the multiclass classification case.
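
The abstract's key property, bounds that are "uniform over all weightings of the discretised error types", can be illustrated with a simple stand-in. The Python sketch below is an assumption-laden illustration, not the paper's method: it replaces the PAC-Bayes machinery with a crude Hoeffding-plus-union bound over the K^2 confusion-matrix entries, and all function names are hypothetical. What it demonstrates is the point the abstract makes: once every discretised error type is bounded simultaneously, any cost weighting, even one chosen after training, inherits a bound.

```python
import numpy as np

def confusion_counts(y_true, y_pred, n_classes):
    """Counts of each discretised error type: entry (i, j) counts
    examples of true class i predicted as class j."""
    C = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        C[t, p] += 1
    return C

def weighted_risk(C, weights):
    """Empirical risk under an arbitrary non-negative weighting of the
    error types (e.g. an asymmetric misclassification cost matrix)."""
    return float((weights * C).sum() / C.sum())

def entrywise_upper_bounds(C, delta):
    """Simultaneous high-probability upper bounds on all true error-type
    probabilities via Hoeffding plus a union bound over the K^2 entries.
    A placeholder for the paper's PAC-Bayes bound; it only mimics the
    'uniform over weightings' property."""
    n = C.sum()
    slack = np.sqrt(np.log(C.size / delta) / (2 * n))
    return C / n + slack

# Hypothetical usage with random labels and predictions.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 3, size=1000)
y_pred = rng.integers(0, 3, size=1000)
C = confusion_counts(y_true, y_pred, n_classes=3)

# A cost matrix chosen only now, "not foreseen at training time":
# off-diagonal errors cost 1, except predicting class 0 for class 2.
W = 1.0 - np.eye(3)
W[2, 0] = 5.0

# Non-negative weights enter linearly, so the simultaneous entrywise
# bounds immediately price this (or any other) weighting.
print("empirical weighted risk:", weighted_risk(C, W))
print("weighted bound:", float((W * entrywise_upper_bounds(C, 0.05)).sum()))
```

In the paper, PAC-Bayes bounds on the distribution over error types would take the place of entrywise_upper_bounds, holding for randomised classifiers and yielding the tractable training objectives mentioned above; the linear pricing of an after-the-fact weighting works the same way.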

Related research:

PAC-Bayes Iterated Logarithm Bounds for Martingale Mixtures (06/22/2015)
We give tight concentration bounds for mixtures of martingales that are ...

PAC-Bayes with Backprop (08/19/2019)
We explore a method to train probabilistic neural networks by minimizing...

More PAC-Bayes bounds: From bounded losses, to losses with general tail behaviors, to anytime-validity (06/21/2023)
In this paper, we present new high-probability PAC-Bayes bounds for diff...

PAC-Bayes unleashed: generalisation bounds with unbounded losses (06/12/2020)
We present new PAC-Bayesian generalisation bounds for learning problems ...

Online PAC-Bayes Learning (05/31/2022)
Most PAC-Bayesian bounds hold in the batch learning setting where data i...

Tighter risk certificates for neural networks (07/25/2020)
This paper presents empirical studies regarding training probabilistic n...

Risk bounds for aggregated shallow neural networks using Gaussian prior (12/21/2021)
Analysing statistical properties of neural networks is a central topic i...
