
Average-Case Information Complexity of Learning

11/25/2018
by Ido Nachum et al.
Technion

How many bits of information are revealed by a learning algorithm for a concept class of VC-dimension d? Previous works have shown that even for d=1 the amount of information revealed may be unbounded (it tends to ∞ with the size of the universe). Can it be that all concepts in the class require leaking a large amount of information? We show that typically concepts do not require such leakage: there exists a proper learning algorithm that reveals O(d) bits of information for most concepts in the class. This result is a special case of a more general phenomenon we explore: if there is a low-information learner when the algorithm knows the underlying distribution on inputs, then there is a learner that reveals little information about an average concept without knowing the distribution on inputs.
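The quantity discussed here can be illustrated with a toy computation. For a deterministic learner A, the information revealed about the sample S satisfies I(S; A(S)) = H(A(S)), so one can enumerate all samples and measure the entropy of the output hypothesis, averaged over concepts. The sketch below does this for threshold functions on a small universe (a VC-dimension-1 class) with a simple consistent learner; the class, universe size, and learner are illustrative choices, not the paper's construction.

```python
import itertools
import math
from collections import Counter

def entropy_bits(counts):
    """Shannon entropy (in bits) of an empirical distribution."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

n, m = 8, 3  # universe {0,...,n-1}, sample size m (toy parameters)

avg_info = 0.0
for t in range(n + 1):  # each threshold t defines a concept c_t(x) = [x >= t]
    outputs = Counter()
    # Enumerate all m-point samples drawn from the universe.
    for sample in itertools.product(range(n), repeat=m):
        positives = [x for x in sample if x >= t]
        # Consistent proper learner: smallest positive example,
        # or n if no positive example was seen.
        t_hat = min(positives) if positives else n
        outputs[t_hat] += 1
    # Deterministic learner: I(S; A(S)) = H(A(S)).
    avg_info += entropy_bits(outputs)

avg_info /= (n + 1)  # average over concepts in the class
print(f"average information revealed: {avg_info:.3f} bits")
```

The average stays bounded by log2(n+1) by construction, and for this class it remains a small constant number of bits, matching the flavor (though not the proof) of the O(d) average-case bound.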


Related research

A Direct Sum Result for the Information Complexity of Learning (04/16/2018)
How many bits of information are required to PAC learn a class of hypoth...

Quantum Boosting (02/12/2020)
Suppose we have a weak learning algorithm A for a Boolean-valued problem...

Simple Algorithms for Learning from Random Counterexamples (10/01/2018)
This work describes two simple and efficient algorithms for exactly lear...

Extractor-Based Time-Space Lower Bounds for Learning (08/08/2017)
A matrix M: A × X → {-1,1} corresponds to the following learning problem:...

Multi-Instance Learning with Any Hypothesis Class (07/11/2011)
In the supervised learning setting termed Multiple-Instance Learning (MI...

One-Shot Induction of Generalized Logical Concepts via Human Guidance (12/15/2019)
We consider the problem of learning generalized first-order representati...

Linear Guardedness and its Implications (10/18/2022)
Previous work on concept identification in neural representations has fo...