Predictive PAC Learning and Process Decompositions

09/19/2013
by Cosma Rohilla Shalizi, et al.

We informally call a stochastic process learnable if it admits a generalization error approaching zero in probability for any concept class with finite VC-dimension (IID processes are the simplest example). A mixture of learnable processes need not be learnable itself, and certainly its generalization error need not decay at the same rate. In this paper, we argue that it is natural in predictive PAC to condition not on the past observations but on the mixture component of the sample path. This definition not only matches what a realistic learner might demand, but also allows us to sidestep several otherwise grave problems in learning from dependent data. In particular, we give a novel PAC generalization bound for mixtures of learnable processes with a generalization error that is not worse than that of each mixture component. We also provide a characterization of mixtures of absolutely regular (β-mixing) processes, of independent probability-theoretic interest.
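As a concrete illustration of the abstract's central point, below is a minimal numerical sketch (not from the paper) of a two-component mixture of IID Bernoulli processes. The component parameters 0.2 and 0.8 and the mean-estimation task are illustrative assumptions; by the classical IID VC argument (with probability at least 1 − δ, sup_{h∈H} |R(h) − R̂_n(h)| ≤ C·sqrt((d·log n + log(1/δ))/n) for a universal constant C), each component is learnable on its own. The sketch shows that the empirical mean of a sample path concentrates on its own component's mean, while its distance to the mixture's marginal mean (0.5) does not shrink: this is the sense in which conditioning on the mixture component, rather than on the marginal law, restores a meaningful generalization guarantee.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_path(n):
        # Choose the latent mixture component once per path (a toy
        # stand-in for the mixture decomposition discussed in the
        # abstract), then draw the whole path IID given that component.
        p = rng.choice([0.2, 0.8])
        return p, rng.random(n) < p

    for n in (100, 1_000, 10_000):
        cond_err, marg_err = [], []
        for _ in range(500):
            p, x = sample_path(n)
            m = x.mean()
            cond_err.append(abs(m - p))    # error against the component's mean
            marg_err.append(abs(m - 0.5))  # error against the marginal mixture mean
        print(f"n={n:6d}  conditional: {np.mean(cond_err):.4f}  "
              f"marginal: {np.mean(marg_err):.4f}")

As n grows, the conditional error decays at roughly the 1/sqrt(n) rate of each component, while the marginal error stays near 0.3, mirroring the paper's claim that the generalization bound for a mixture should be stated per component.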


research · 02/12/2019
VC Classes are Adversarially Robustly Learnable, but Only Improperly
We study the question of learning an adversarially robust predictor. We ...

research · 08/11/2023
On the equivalence of Occam algorithms
Blumer et al. (1987, 1989) showed that any concept class that is learnab...

research · 06/09/2020
Probably Approximately Correct Constrained Learning
As learning solutions reach critical applications in social, industrial,...

research · 03/09/2023
Computably Continuous Reinforcement-Learning Objectives are PAC-learnable
In reinforcement learning, the classic objectives of maximizing discount...

research · 03/30/2023
Online Learning and Disambiguations of Partial Concept Classes
In a recent article, Alon, Hanneke, Holzman, and Moran (FOCS '21) introd...

research · 02/06/2023
Find a witness or shatter: the landscape of computable PAC learning
This paper contributes to the study of CPAC learnability – a computable ...

research · 09/07/2023
Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples
We study the problem of estimating mixtures of Gaussians under the const...
