Online Learning for Distribution-Free Prediction

03/15/2017
by Dave Zachariah et al.

We develop an online learning method for prediction, which is important in problems with large and/or streaming data sets. We formulate the learning approach using a covariance-fitting methodology, and show that the resulting predictor has desirable computational and distribution-free properties: It is implemented online with a runtime that scales linearly in the number of samples; has a constant memory requirement; avoids local minima problems; and prunes away redundant feature dimensions without relying on restrictive assumptions on the data distribution. In conjunction with the split conformal approach, it also produces distribution-free prediction confidence intervals in a computationally efficient manner. The method is demonstrated on both real and synthetic datasets.
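The paper's covariance-fitting predictor is not reproduced here, but the two ingredients the abstract names can be sketched: an online linear predictor with constant memory and per-sample O(d²) updates (recursive least squares is used below as a hypothetical stand-in, not the authors' method), combined with split conformal calibration to produce a distribution-free prediction interval.

```python
import numpy as np

def rls_update(theta, P, x, y):
    """One recursive least-squares step: constant memory, O(d^2) per sample.
    A stand-in online learner, not the paper's covariance-fitting predictor."""
    Px = P @ x
    k = Px / (1.0 + x @ Px)              # gain vector
    theta = theta + k * (y - x @ theta)  # correct prediction error
    P = P - np.outer(k, Px)              # rank-one covariance update
    return theta, P

rng = np.random.default_rng(0)
d = 3
true_w = np.array([1.0, -2.0, 0.5])      # synthetic ground truth

# Process a stream of training samples one at a time
theta = np.zeros(d)
P = 1e3 * np.eye(d)                      # diffuse prior
for _ in range(500):
    x = rng.normal(size=d)
    y = x @ true_w + 0.1 * rng.normal()
    theta, P = rls_update(theta, P, x, y)

# Split conformal: score a held-out calibration set with absolute residuals
alpha = 0.1                              # target miscoverage
cal_scores = []
for _ in range(200):
    x = rng.normal(size=d)
    y = x @ true_w + 0.1 * rng.normal()
    cal_scores.append(abs(y - x @ theta))
n = len(cal_scores)
q = np.quantile(cal_scores, np.ceil((1 - alpha) * (n + 1)) / n)

# Distribution-free ~90% interval for a new point: [pred - q, pred + q]
x_new = rng.normal(size=d)
pred = x_new @ theta
interval = (pred - q, pred + q)
```

Because the calibration scores are computed on data disjoint from training, the interval's coverage guarantee holds without assumptions on the data distribution, which is the property the split conformal approach contributes here.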

Related research

05/19/2017  Model-Robust Counterfactual Prediction Method
06/18/2020  Distribution-free binary classification: prediction sets, confidence intervals and calibration
06/24/2020  AutoNCP: Automated pipelines for accurate confidence intervals
02/15/2023  Improved Online Conformal Prediction via Strongly Adaptive Online Learning
06/13/2011  Efficient Transductive Online Learning via Randomized Rounding
12/30/2017  Parameter-free online learning via model selection
