Bagging Provides Assumption-free Stability

01/30/2023
by Jake A. Soloff, et al.

Bagging is an important technique for stabilizing machine learning models. In this paper, we derive a finite-sample guarantee on the stability of bagging for any model with bounded outputs. Our result places no assumptions on the distribution of the data, on the properties of the base algorithm, or on the dimensionality of the covariates. Our guarantee applies to many variants of bagging and is optimal up to a constant.
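As a concrete illustration of the procedure the result covers, here is a minimal sketch of classical bagging in Python: a base algorithm is refit on bootstrap resamples of the training data and its bounded predictions are averaged. The helper names, the 1-nearest-neighbor base learner, and the choice of 100 bags are illustrative assumptions, not details taken from the paper.

import numpy as np

def bagged_predict(fit, X_train, y_train, x_test, n_bags=100, seed=0):
    # Bootstrap-aggregate a base learner's prediction at x_test.
    # `fit` maps a training set (X, y) to a predictor; `n_bags` and the
    # resampling scheme are illustrative choices, not quoted from the paper.
    rng = np.random.default_rng(seed)
    n = len(y_train)
    preds = []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)  # resample n points with replacement
        model = fit(X_train[idx], y_train[idx])
        preds.append(model(x_test))
    # Averaging bounded outputs over resamples is what smooths out the
    # base algorithm's sensitivity to any single training point.
    return float(np.mean(preds))

# Example: 1-nearest-neighbor, an unstable base learner with outputs in [0, 1].
def fit_1nn(X, y):
    return lambda x: y[np.argmin(np.linalg.norm(X - x, axis=1))]

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(float)
print(bagged_predict(fit_1nn, X, y, X[0]))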

Related research

Finite-sample Efficient Conformal Prediction (04/28/2021)
Conformal prediction is a generic methodology for finite-sample valid di...

Contrasting Identifying Assumptions of Average Causal Effects: Robustness and Semiparametric Efficiency (11/30/2021)
Semiparametric inference about average causal effects from observational...

Provably Auditing Ordinary Least Squares in Low Dimensions (05/28/2022)
Measuring the stability of conclusions derived from Ordinary Least Squar...

Robust Learning with Jacobian Regularization (08/07/2019)
Design of reliable systems must guarantee stability against input pertur...

Pruning nearest neighbor cluster trees (05/03/2011)
Nearest neighbor (k-NN) graphs are widely used in machine learning and d...

PAC Generalization via Invariant Representations (05/30/2022)
One method for obtaining generalizable solutions to machine learning tas...

Local Exchangeability (06/22/2019)
Exchangeability---in which the distribution of an infinite sequence is i...
