Boosting Classifiers with Noisy Inference

09/10/2019
by Yongjune Kim, et al.

We present a principled framework for allocating resources when boosting algorithms are realized on substrates with communication or computation noise. Boosting classifiers (e.g., AdaBoost) make a final decision via a weighted vote over the outputs of many base classifiers (weak classifiers). If the base classifiers' outputs are noisy or are communicated over noisy channels, these noisy outputs degrade the final classification accuracy. We show that this degradation can be effectively reduced by allocating more system resources to the more important base classifiers, and we formulate the corresponding resource optimization problems in terms of importance metrics for boosting. Moreover, we show that the optimized noisy boosting classifiers can be more robust than bagging to noise during inference (the test stage). We provide numerical evidence demonstrating the benefits of our approach.
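The core idea can be sketched with a small simulation. The setup below is illustrative, not the paper's exact formulation: an AdaBoost-style ensemble decides via sign(Σ_m α_m·h_m(x)), every base output is assumed correct before transmission, and a noisy channel flips each output with probability p_m. The vote weights, flip probabilities, and the `vote_accuracy` helper are all hypothetical choices for the sketch. Comparing a uniform noise budget against one that gives cleaner channels to the high-weight classifiers shows why importance-aware allocation helps:

```python
import random

random.seed(0)

# Hypothetical ensemble: M base classifiers with descending vote weights.
M = 15
alphas = [1.0 / (m + 1) for m in range(M)]

def vote_accuracy(flip_probs, trials=20000):
    """Empirical probability that the weighted vote recovers the true
    label (+1), when each base output is flipped with probability p_m."""
    correct = 0
    for _ in range(trials):
        total = 0.0
        for a, p in zip(alphas, flip_probs):
            out = 1 if random.random() >= p else -1  # channel flip
            total += a * out
        correct += total > 0
    return correct / trials

# Two allocations of the same total noise budget (flip probs sum to 3.0):
uniform = [0.20] * M
# Importance-aware: cleaner channels for the high-weight classifiers,
# noisier ones for the low-weight classifiers.
aware = [0.05] * 5 + [0.20] * 5 + [0.35] * 5

acc_uniform = vote_accuracy(uniform)
acc_aware = vote_accuracy(aware)
print(f"uniform allocation : {acc_uniform:.4f}")
print(f"importance-aware   : {acc_aware:.4f}")
```

Under this toy model the importance-aware allocation yields a noticeably higher vote accuracy for the same total noise budget, because errors on high-weight classifiers are far more likely to flip the final decision.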


