Falsification and future performance
We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC entropy and empirical Rademacher complexity. We show that these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary, we show that empirical VC entropy quantifies the message length of the true hypothesis in the optimal code of a particular probability distribution, the so-called actual repertoire.
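To make the capacity notion in the abstract concrete, here is a minimal sketch (not from the paper; the repertoire and data are illustrative toys) of empirical VC entropy for a finite classifier repertoire: it is the logarithm of the number of distinct labelings the repertoire can induce on the observed sample, so each distinct labeling is one hypothesis about the data that the learner could falsify.

```python
import math

def labelings(repertoire, xs):
    """Distinct label vectors the repertoire induces on the sample xs."""
    return {tuple(h(x) for x in xs) for h in repertoire}

def empirical_vc_entropy(repertoire, xs):
    """log2 of the number of distinct labelings on xs (empirical VC entropy)."""
    return math.log2(len(labelings(repertoire, xs)))

# Toy repertoire: one-dimensional threshold classifiers h_t(x) = [x > t].
thresholds = [0.5, 1.5, 2.5]
repertoire = [lambda x, t=t: int(x > t) for t in thresholds]

xs = [0.0, 1.0, 2.0, 3.0]
# The three thresholds induce three distinct labelings on xs:
# (0,1,1,1), (0,0,1,1), (0,0,0,1)
print(empirical_vc_entropy(repertoire, xs))  # log2(3) ≈ 1.585
```

On this sample the three threshold classifiers produce three distinct labelings, so the empirical VC entropy is log2(3) bits; choosing the empirical risk minimizer against observed labels would falsify the other candidate labelings, which is the counting interpretation the abstract describes.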