
Boosting as a Product of Experts

by Narayanan U. Edakunni, et al.

In this paper, we derive a novel probabilistic model of boosting as a Product of Experts. We re-derive the boosting algorithm as a greedy incremental model-selection procedure which ensures that the addition of a new expert to the ensemble does not decrease the likelihood of the data. These learning rules lead to a generic boosting algorithm, POEBoost, which turns out to be similar to the AdaBoost algorithm under certain assumptions on the expert probabilities. The paper then extends POEBoost to POEBoost.CS, which handles hypotheses that produce probabilistic predictions. This new algorithm is shown to have better generalization performance than other state-of-the-art algorithms.
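The abstract notes that POEBoost reduces to AdaBoost under certain assumptions on the expert probabilities. As a point of reference, here is a minimal sketch of the classical AdaBoost loop with decision stumps (this illustrates the AdaBoost baseline only, not the POEBoost derivation itself; all names and the toy data are illustrative):

```python
import numpy as np

def adaboost(X, y, n_rounds=10):
    """Classical AdaBoost with one-feature threshold stumps; y in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                     # example weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        best = None
        # exhaustive search over (feature, threshold, sign) stumps
        for j in range(d):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] >= t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # weight of the new expert
        pred = s * np.where(X[:, j] >= t, 1, -1)
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified examples
        w /= w.sum()
        stumps.append((j, t, s))
        alphas.append(alpha)

    def predict(Xq):
        score = sum(a * s * np.where(Xq[:, j] >= t, 1, -1)
                    for a, (j, t, s) in zip(alphas, stumps))
        return np.sign(score)
    return predict

# toy usage: a 1-D threshold problem the ensemble fits exactly
X = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y = np.array([-1, -1, -1, 1, 1, 1])
predict = adaboost(X, y, n_rounds=5)
print((predict(X) == y).mean())  # training accuracy
```

The greedy structure mirrors the paper's framing: each round adds one expert and assigns it a weight so that the ensemble's fit to the data does not degrade.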


Minimal Variance Sampling in Stochastic Gradient Boosting

Stochastic Gradient Boosting (SGB) is a widely used approach to regulari...

Functional Frank-Wolfe Boosting for General Loss Functions

Boosting is a generic learning method for classification and regression....

Deep Incremental Boosting

This paper introduces Deep Incremental Boosting, a new technique derived...

Distorted English Alphabet Identification: An Application of the Difference Boosting Algorithm

The difference-boosting algorithm is used on letters dataset from the UC...

A Bagging and Boosting Based Convexly Combined Optimum Mixture Probabilistic Model

Unlike previous studies on mixture distributions, a bagging and boosting...

Boosting algorithms for uplift modeling

Uplift modeling is an area of machine learning which aims at predicting ...

Model Agnostic Combination for Ensemble Learning

Ensemble of models is well known to improve single model performance. We...