Integral Probability Metrics PAC-Bayes Bounds

07/01/2022
by Ron Amit et al.

We present a PAC-Bayes-style generalization bound which enables the replacement of the KL divergence with a variety of Integral Probability Metrics (IPMs). We provide instances of this bound with the IPM being the total variation metric and the Wasserstein distance. A notable feature of the obtained bounds is that they naturally interpolate between classical uniform convergence bounds in the worst case (when the prior and posterior are far from each other) and preferable bounds in better cases (when the posterior and prior are close). This illustrates the possibility of reinforcing classical generalization bounds with algorithm- and data-dependent components, making them more suitable for analyzing algorithms that use a large hypothesis space.
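For context, here is a minimal sketch of the standard objects the abstract refers to; the notation (prior P, posterior Q, risk L, empirical risk on n samples) is assumed here, and this is the classical McAllester/Maurer-style bound and the generic IPM definition, not the paper's exact theorems.

```latex
% A standard PAC-Bayes bound for [0,1]-bounded losses: with probability
% at least 1 - \delta over an i.i.d. sample of size n, simultaneously
% for every posterior Q,
\mathbb{E}_{h \sim Q}\, L(h)
  \;\le\; \mathbb{E}_{h \sim Q}\, \widehat{L}_n(h)
  \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}.

% The integral probability metric (IPM) induced by a function class F,
% which the paper's bounds substitute for KL(Q || P):
d_{\mathcal{F}}(Q, P)
  \;=\; \sup_{f \in \mathcal{F}}
        \left| \mathbb{E}_{h \sim Q} f(h) \;-\; \mathbb{E}_{h \sim P} f(h) \right|.

% Instances: F = { f : \|f\|_\infty \le 1 } yields (twice) the total
% variation distance; F = { f : f \text{ is } 1\text{-Lipschitz} } yields
% the 1-Wasserstein distance, by Kantorovich--Rubinstein duality.
```

The interpolation the abstract describes is consistent with the fact that, unlike KL(Q ‖ P), the total variation distance is bounded by 1, so such a bound degrades gracefully to a uniform-convergence-type bound when the posterior moves far from the prior.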


