Information-theoretic generalization bounds for black-box learning algorithms

10/04/2021
by   Hrayr Harutyunyan, et al.

We derive information-theoretic generalization bounds for supervised learning algorithms based on the information contained in predictions rather than in the output of the training algorithm. These bounds improve over the existing information-theoretic bounds, are applicable to a wider range of algorithms, and solve two key challenges: (a) they give meaningful results for deterministic algorithms and (b) they are significantly easier to estimate. We show experimentally that the proposed bounds closely follow the generalization gap in practical scenarios for deep learning.
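Since the abstract emphasizes that the proposed prediction-based bounds are significantly easier to estimate, a small numerical sketch may help make that concrete. The following is not the authors' code: it is a minimal, assumption-laden illustration of estimating a per-example prediction-based conditional mutual information I(f(z_1), f(z'_1); U_1) in the supersample setup of Steinke and Zakynthinou (2020), which prediction-based bounds of this kind build on. The toy Gaussian data, the nearest-centroid learner, and the plug-in discrete-MI estimator are all illustrative choices, not the paper's experimental setup.

import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

def make_data(n):
    """Toy binary classification: two Gaussian blobs (an assumed dataset)."""
    y = rng.integers(0, 2, size=n)
    x = rng.normal(size=(n, 2)) + 3.0 * y[:, None]
    return x, y

def nearest_centroid_fit_predict(x_tr, y_tr, x_te):
    """A deterministic 'black-box' learner: predict the nearest class centroid."""
    c0 = x_tr[y_tr == 0].mean(axis=0)
    c1 = x_tr[y_tr == 1].mean(axis=0)
    d0 = np.linalg.norm(x_te - c0, axis=1)
    d1 = np.linalg.norm(x_te - c1, axis=1)
    return (d1 < d0).astype(int)

def plug_in_mi(preds, bits):
    """Plug-in estimate of I(predictions; mask bit) for discrete variables."""
    n = len(bits)
    joint = Counter(zip(preds, bits))
    p_pred = Counter(preds)
    p_bit = Counter(bits)
    mi = 0.0
    for (a, b), c in joint.items():
        pj = c / n
        mi += pj * np.log(pj / ((p_pred[a] / n) * (p_bit[b] / n)))
    return mi

n, trials = 30, 2000
x_sup, y_sup = make_data(2 * n)          # supersample of 2n points, paired up
preds, mask_bits = [], []
for _ in range(trials):
    u = rng.integers(0, 2, size=n)       # U_i selects which point of pair i trains
    train_idx = 2 * np.arange(n) + u
    x_tr, y_tr = x_sup[train_idx], y_sup[train_idx]
    # Record the trained model's predictions on both points of the first pair,
    # together with that pair's mask bit.
    p = nearest_centroid_fit_predict(x_tr, y_tr, x_sup[0:2])
    preds.append(tuple(p))
    mask_bits.append(int(u[0]))
mi = plug_in_mi(preds, mask_bits)
print(f"estimated I(f(z_1), f(z'_1); U_1) ~ {mi:.4f} nats")

Because only predicted labels enter the estimate, the mutual information here is between small discrete variables, which is what makes bounds phrased in terms of predictions far easier to estimate than bounds phrased in terms of the learned weights, and meaningful even for the deterministic learner used above.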

Related research

07/01/2022 · On Leave-One-Out Conditional Mutual Information For Generalization
We derive information theoretic generalization bounds for supervised lea...

02/05/2023 · Tighter Information-Theoretic Generalization Bounds from Supersamples
We present a variety of novel information-theoretic generalization bound...

12/07/2017 · Is My Model Flexible Enough? Information-Theoretic Model Check
The choice of model class is fundamental in statistical learning and sys...

12/07/2017 · How consistent is my model with the data? Information-Theoretic Model Check
The choice of model class is fundamental in statistical learning and sys...

12/09/2011 · An Information Theoretic Analysis of Decision in Computer Chess
The basis of the method proposed in this article is the idea that inform...

01/28/2022 · Stochastic Chaining and Strengthened Information-Theoretic Generalization Bounds
We propose a new approach to apply the chaining technique in conjunction...

06/21/2019 · Robustness of Dynamical Quantities of Interest via Goal-Oriented Information Theory
Variational-principle-based methods that relate expectations of a quanti...
