Strengthened Information-theoretic Bounds on the Generalization Error

03/09/2019
by Ibrahim Issa, et al.

The following problem is considered: given a joint distribution P_XY and an event E, bound P_XY(E) in terms of P_XP_Y(E) (where P_XP_Y is the product of the marginals of P_XY) and a measure of dependence of X and Y. Such bounds have direct applications in the analysis of the generalization error of learning algorithms, where E represents a large error event and the measure of dependence controls the degree of overfitting. Herein, bounds are demonstrated using several information-theoretic metrics, in particular: mutual information, lautum information, maximal leakage, and J_∞. The mutual information bound can outperform comparable bounds in the literature by an arbitrarily large factor.
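As a concrete illustration of the mutual-information flavor of such bounds (a minimal numerical sketch, not the paper's code or its exact strengthened result): applying the data-processing inequality for KL divergence to the indicator of the event E gives d(P_XY(E) || P_XP_Y(E)) <= D(P_XY || P_XP_Y) = I(X;Y), where d(p||q) denotes the binary KL divergence. Consequently, P_XY(E) is at most the largest p satisfying d(p || P_XP_Y(E)) <= I(X;Y). The script below computes that quantity by bisection; the names binary_kl, mi_bound, and the numbers beta and mi_bits are illustrative assumptions, not values from the paper.

import math

def binary_kl(p: float, q: float) -> float:
    """Binary KL divergence d(p||q) in nats, with the usual 0*log(0) = 0 convention."""
    val = 0.0
    if p > 0:
        val += p * math.log(p / q)
    if p < 1:
        val += (1 - p) * math.log((1 - p) / (1 - q))
    return val

def mi_bound(beta: float, mi_nats: float, tol: float = 1e-12) -> float:
    """Largest p in [beta, 1] with d(p||beta) <= mi_nats; an upper bound on P_XY(E)."""
    lo, hi = beta, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # d(p||beta) is increasing in p for p >= beta, so bisection is valid.
        if binary_kl(mid, beta) <= mi_nats:
            lo = mid
        else:
            hi = mid
    return lo

if __name__ == "__main__":
    beta = 1e-3                   # hypothetical value of P_XP_Y(E)
    mi_bits = 0.5                 # hypothetical I(X;Y), in bits
    bound = mi_bound(beta, mi_bits * math.log(2))
    print(f"P_XP_Y(E) = {beta:.1e}, I(X;Y) = {mi_bits} bits -> P_XY(E) <= {bound:.4f}")

In a learning setting, E would be the event that the generalization gap exceeds a threshold, beta its probability under the product of marginals (i.e., for a fresh, independent sample), and I(X;Y) the mutual information between the training data and the algorithm's output.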
