MP-Boost: Minipatch Boosting via Adaptive Feature and Observation Sampling

11/14/2020
by Mohammad Taha Toghani, et al.

Boosting methods are among the best general-purpose, off-the-shelf machine learning approaches and have gained widespread popularity. In this paper, we seek to develop a boosting method that yields accuracy comparable to popular AdaBoost and gradient boosting methods, yet is computationally faster and produces a more interpretable solution. We achieve this with MP-Boost, an algorithm loosely based on AdaBoost that learns by adaptively selecting small subsets of instances and features, or what we term minipatches (MPs), at each iteration. By sequentially learning on tiny subsets of the data, our approach is computationally faster than other classic boosting algorithms. Moreover, as it progresses, MP-Boost adaptively learns probability distributions over the features and instances that upweight the most important features and the most challenging instances, hence adaptively selecting the most relevant minipatches for learning. These learned probability distributions also aid in interpreting our method. We empirically demonstrate the interpretability, comparative accuracy, and computational time of our approach on a variety of binary classification tasks.
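To make the adaptive-sampling idea concrete, the sketch below is a minimal, illustrative Python implementation of a minipatch-style boosting loop. It is not the paper's exact MP-Boost algorithm: the function name minipatch_boost, the choice of depth-1 decision trees as base learners, and the particular smoothing and update rules for the observation and feature probabilities are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def minipatch_boost(X, y, n_rounds=100, n_obs=50, n_feat=10, seed=None):
    """Toy minipatch boosting loop; y is assumed to take values in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs_prob = np.full(n, 1.0 / n)    # adaptive distribution over observations
    feat_prob = np.full(p, 1.0 / p)   # adaptive distribution over features
    learners, feat_subsets = [], []
    F = np.zeros(n)                   # running ensemble score on the training data

    for _ in range(n_rounds):
        # Sample a minipatch: a tiny subset of observations and features.
        rows = rng.choice(n, size=min(n_obs, n), replace=False, p=obs_prob)
        cols = rng.choice(p, size=min(n_feat, p), replace=False, p=feat_prob)

        # Fit a weak learner (a decision stump) on the minipatch only.
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X[np.ix_(rows, cols)], y[rows])
        learners.append(stump)
        feat_subsets.append(cols)

        # Update the running ensemble score on all observations.
        F += stump.predict(X[:, cols])

        # Upweight observations the current ensemble still misclassifies.
        miss = (np.sign(F) != y).astype(float) + 1e-3
        obs_prob = 0.5 * obs_prob + 0.5 * miss / miss.sum()

        # Upweight features the stump found useful on this minipatch.
        gain = np.zeros(p)
        gain[cols] = stump.feature_importances_
        gain += 1e-3
        feat_prob = 0.9 * feat_prob + 0.1 * gain / gain.sum()

    def predict(X_new):
        score = sum(h.predict(X_new[:, c]) for h, c in zip(learners, feat_subsets))
        return np.sign(score)

    return predict, obs_prob, feat_prob
```

The design point mirrored here is that the two sampling distributions are learned alongside the ensemble: observations the current ensemble still misclassifies and features that recent base learners found useful become more likely to appear in subsequent minipatches, and the final distributions double as interpretability summaries of hard instances and important features.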
