On the Information Complexity of Proper Learners for VC Classes in the Realizable Case

11/05/2020
by Mahdi Haghifam, et al.

We provide a negative resolution to a conjecture of Steinke and Zakynthinou (2020a) by showing that their bound on the conditional mutual information (CMI) of proper learners of Vapnik–Chervonenkis (VC) classes cannot be improved from d log n + 2 to O(d), where n is the number of i.i.d. training examples. In fact, we exhibit VC classes for which the CMI of any proper learner cannot be bounded by any real-valued function of the VC dimension alone.
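For context, here is a minimal sketch of the quantity in question, stated under the assumption that the setup matches the CMI framework of Steinke and Zakynthinou (2020a): a supersample Z of 2n i.i.d. examples is drawn and arranged as an n-by-2 array, and a uniformly random selector S picks one example from each row to form the training set Z_S. In LaTeX:

\mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\big( A(Z_S) ;\, S \,\big|\, Z \big), \qquad Z \in \mathcal{Z}^{n \times 2},\quad S \sim \mathrm{Unif}\big(\{0,1\}^n\big),\quad Z_S = (Z_{i, S_i})_{i=1}^{n}.

Read against this definition, the abstract's claim is that the d log n + 2 upper bound on the CMI of proper learners cannot in general be improved to a bound depending on d alone: the dependence on n is unavoidable for proper learners of some VC classes.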
