Learning Whenever Learning is Possible: Universal Learning under General Stochastic Processes

06/05/2017
by   Steve Hanneke, et al.

This work initiates a general study of learning and generalization without the i.i.d. assumption, starting from first principles. While the standard approach to statistical learning theory is based on assumptions chosen largely for their convenience (e.g., i.i.d. or stationary ergodic), in this work we are interested in developing a theory of learning based only on the most fundamental and natural assumptions implicit in the requirements of the learning problem itself. We specifically study universally consistent function learning, where the objective is to obtain low long-run average loss for any target function, when the data follow a given stochastic process. We are then interested in the question of whether there exist learning rules guaranteed to be universally consistent given only the assumption that universally consistent learning is possible for the given data process. The reasoning that motivates this criterion emanates from a kind of optimist's decision theory, and so we refer to such learning rules as being optimistically universal. We study this question in three natural learning settings: inductive, self-adaptive, and online. Remarkably, as our strongest positive result, we find that optimistically universal learning rules do indeed exist in the self-adaptive learning setting. Establishing this fact requires us to develop new approaches to the design of learning algorithms. Along the way, we also identify concise characterizations of the family of processes under which universally consistent learning is possible in the inductive and self-adaptive settings. We additionally pose a number of enticing open problems, particularly for the online learning setting.
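The objective stated in the abstract — low long-run average loss for any target function — can be written out as a consistency criterion. The following is a minimal sketch of that criterion; the symbols (predictor f̂_t, instance X_t, target f*, loss ℓ) are assumed notation, not taken verbatim from the paper:

```latex
% Universal consistency of a sequence of predictors \hat f_t on a data
% process X_1, X_2, \ldots: for every measurable target function f^\star,
% the long-run average loss vanishes almost surely.
\limsup_{n \to \infty} \; \frac{1}{n} \sum_{t=1}^{n}
  \ell\bigl( \hat f_t(X_t),\, f^\star(X_t) \bigr) \;=\; 0
  \quad \text{a.s.}
```

A process admits universally consistent learning when some learning rule achieves this for all targets; an optimistically universal rule achieves it on every process for which any rule could.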


Related research

- Universal Online Learning: an Optimistically Universal Learning Rule (01/16/2022). We study the subject of universal online learning with non-i.i.d. proces...
- Universal Online Learning with Unbounded Losses: Memory Is All You Need (01/21/2022). We resolve an open problem of Hanneke on the subject of universally cons...
- On Learnability under General Stochastic Processes (05/15/2020). Statistical learning theory under independent and identically distribute...
- Universally Consistent Online Learning with Arbitrarily Dependent Responses (03/11/2022). This work provides an online learning rule that is universally consisten...
- Universal Online Learning with Bounded Loss: Reduction to Binary Classification (12/29/2021). We study universal consistency of non-i.i.d. processes in the context of...
- Covariance-based Dissimilarity Measures Applied to Clustering Wide-sense Stationary Ergodic Processes (01/27/2018). We introduce a new unsupervised learning problem: clustering wide-sense ...
- Universal Regression with Adversarial Responses (03/09/2022). We provide algorithms for regression with adversarial responses under la...
