On the Information Complexity of Proper Learners for VC Classes in the Realizable Case

11/05/2020
by Mahdi Haghifam et al.

We provide a negative resolution to a conjecture of Steinke and Zakynthinou (2020a) by showing that their bound on the conditional mutual information (CMI) of proper learners of Vapnik–Chervonenkis (VC) classes cannot be improved from d log n + 2 to O(d), where n is the number of i.i.d. training examples. In fact, we exhibit VC classes for which the CMI of any proper learner cannot be bounded by any real-valued function of the VC dimension only.
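For context, a brief sketch of the CMI quantity referenced above, following the supersample setup of Steinke and Zakynthinou (2020); the notation here is a standard rendering and should be checked against the full text. Given a data distribution \(\mathcal{D}\), draw a supersample \(\tilde{Z} \sim \mathcal{D}^{n \times 2}\) and an independent selector \(S \sim \mathrm{Unif}(\{0,1\}^n)\), and let \(\tilde{Z}_S = (\tilde{Z}_{1,S_1}, \dots, \tilde{Z}_{n,S_n})\) be the training sample that \(S\) selects from \(\tilde{Z}\). The CMI of a learning algorithm \(A\) is then

\[
\mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\big(A(\tilde{Z}_S);\, S \mid \tilde{Z}\big),
\]

and, for losses bounded in \([0,1]\), the framework yields an expected generalization-gap bound of roughly

\[
\mathbb{E}\big[\,L_{\mathcal{D}}(A(\tilde{Z}_S)) - L_{\tilde{Z}_S}(A(\tilde{Z}_S))\,\big] \;\le\; \sqrt{\frac{2\,\mathrm{CMI}_{\mathcal{D}}(A)}{n}}.
\]

The conjecture addressed in this paper concerns whether the \(d \log n + 2\) upper bound on \(\mathrm{CMI}_{\mathcal{D}}(A)\) for proper learners of a VC class of dimension \(d\) can be replaced by a bound of order \(O(d)\), independent of \(n\).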

