Towards a Unified Information-Theoretic Framework for Generalization

11/09/2021
by Mahdi Haghifam et al.

In this work, we investigate the expressiveness of the "conditional mutual information" (CMI) framework of Steinke and Zakynthinou (2020) and the prospect of using it to provide a unified framework for proving generalization bounds in the realizable setting. We first demonstrate that this framework can express non-trivial (but sub-optimal) bounds for any learning algorithm that outputs hypotheses from a class of bounded VC dimension. We then prove that the CMI framework yields the optimal bound on the expected risk of Support Vector Machines (SVMs) for learning halfspaces. This result is an application of our general result showing that stable compression schemes (Bousquet et al., 2020) of size k have uniformly bounded CMI of order O(k). We further show that an inherent limitation of proper learning of VC classes contradicts the existence of a proper learner with constant CMI, which yields a negative resolution to an open problem of Steinke and Zakynthinou (2020). We also study the CMI of empirical risk minimizers (ERMs) of a class H and show that it is possible to output all consistent classifiers (the version space) with bounded CMI if and only if H has a bounded star number (Hanneke and Yang, 2015). Moreover, we prove a general reduction showing that "leave-one-out" analysis is expressible in the CMI framework; as a corollary, we investigate the CMI of the one-inclusion-graph algorithm of Haussler et al. (1994). More generally, we show that the CMI framework is universal, in the sense that for every consistent algorithm and data distribution, the expected risk vanishes as the number of samples diverges if and only if the evaluated CMI grows sublinearly in the number of samples.
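For context, the following is a brief, informal recap of the CMI setup of Steinke and Zakynthinou (2020), written in standard notation rather than quoted from the paper above. Fix a data distribution \mathcal{D} and draw a supersample \tilde{Z} \in \mathcal{Z}^{n \times 2} of 2n i.i.d. examples, together with uniform selection bits S = (S_1, \dots, S_n) \in \{0,1\}^n independent of \tilde{Z}. The algorithm A is trained on the selected half \tilde{Z}_S = (\tilde{Z}_{i, S_i})_{i=1}^{n}, and its CMI with respect to \mathcal{D} is

    \mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\big( A(\tilde{Z}_S);\, S \,\big|\, \tilde{Z} \big).

For losses bounded in [0,1], Steinke and Zakynthinou show that the expected generalization gap of A on n samples is at most \sqrt{2\,\mathrm{CMI}_{\mathcal{D}}(A)/n}, and that for interpolating algorithms (zero empirical risk, as in the realizable setting considered here) the expected population risk is of order \mathrm{CMI}_{\mathcal{D}}(A)/n up to universal constants. Read this way, an O(k) bound on the CMI of stable compression schemes of size k translates into an O(k/n) bound on expected risk.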


Related research

Understanding Generalization via Leave-One-Out Conditional Mutual Information (06/29/2022)
Reasoning About Generalization via Conditional Mutual Information (01/24/2020)
Proper Learning, Helly Number, and an Optimal SVM Bound (05/24/2020)
On the Information Complexity of Proper Learners for VC Classes in the Realizable Case (11/05/2020)
Evaluated CMI Bounds for Meta Learning: Tightness and Expressiveness (10/12/2022)
Order Optimal One-Shot Distributed Learning (11/02/2019)
Boundedness for proper conflict-free and odd colorings (07/31/2023)
