Tighter Information-Theoretic Generalization Bounds from Supersamples

02/05/2023
by Ziqiao Wang, et al.

We present a variety of novel information-theoretic generalization bounds for learning algorithms in the supersample setting of Steinke & Zakynthinou (2020), the setting of the "conditional mutual information" framework. Our development exploits projecting the loss pair (obtained from a training instance and a testing instance) down to a single number and correlating loss values with a Rademacher sequence (and its shifted variants). The presented bounds include square-root bounds, fast-rate bounds (including those based on variance and sharpness), and bounds for interpolating algorithms. We show, theoretically or empirically, that these bounds are tighter than all information-theoretic bounds known to date in the same supersample setting.
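To make the projection-and-correlation idea concrete, here is a minimal NumPy sketch (a toy illustration under assumed ingredients: a Gaussian supersample, squared loss, and a training-mean estimator, none of which come from the paper). It checks the identity underlying this approach: the empirical generalization gap equals the average of the projected loss differences L_{i,1} - L_{i,0} correlated with the Rademacher signs eps_i = 1 - 2*U_i induced by the membership variables.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy supersample setting of Steinke & Zakynthinou (2020): n pairs of
    # instances; the membership bit U_i selects which element of pair i is
    # used for training, while the other element is held out for testing.
    n = 1000
    supersample = rng.normal(loc=1.0, scale=2.0, size=(n, 2))
    U = rng.integers(0, 2, size=n)

    rows = np.arange(n)
    train = supersample[rows, U]

    # A trivial "learning algorithm" for illustration: fit the training
    # mean and evaluate with the squared loss.
    w = train.mean()
    loss = (supersample - w) ** 2      # loss pair (L_{i,0}, L_{i,1}) per row

    # Direct empirical generalization gap: mean test loss minus mean
    # training loss over the n pairs.
    gap_direct = loss[rows, 1 - U].mean() - loss[rows, U].mean()

    # The same quantity via the Rademacher-correlation view: project each
    # loss pair down to the single number L_{i,1} - L_{i,0} and correlate
    # it with the Rademacher sign eps_i = 1 - 2*U_i.
    eps = 1 - 2 * U
    gap_rademacher = (eps * (loss[:, 1] - loss[:, 0])).mean()

    assert np.isclose(gap_direct, gap_rademacher)
    print(gap_direct, gap_rademacher)

The two quantities agree (up to floating-point summation order); the paper's bounds then control this correlation term information-theoretically, and the shifted Rademacher variants mentioned above modify the signs eps.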


Related research

03/26/2023 · On the tightness of information-theoretic bounds on generalization error of learning algorithms
A recent line of works, initiated by Russo and Xu, has shown that the ge...

10/04/2021 · Information-theoretic generalization bounds for black-box learning algorithms
We derive information-theoretic generalization bounds for supervised lea...

05/06/2022 · Fast Rate Generalization Error Bounds: Variations on a Theme
A recent line of works, initiated by Russo and Xu, has shown that the ge...

12/27/2022 · Limitations of Information-Theoretic Generalization Bounds for Gradient Descent Methods in Stochastic Convex Optimization
To date, no "information-theoretic" frameworks for reasoning about gener...

02/04/2022 · Improved Information Theoretic Generalization Bounds for Distributed and Federated Learning
We consider information-theoretic bounds on expected generalization erro...

07/07/2021 · Information-theoretic characterization of the complete genotype-phenotype map of a complex pre-biotic world
How information is encoded in bio-molecular sequences is difficult to qu...

07/01/2022 · On Leave-One-Out Conditional Mutual Information For Generalization
We derive information theoretic generalization bounds for supervised lea...
