Misclassification cost-sensitive ensemble learning: A unifying framework

07/14/2020
by   George Petrides, et al.

Over the years, a plethora of cost-sensitive methods have been proposed for learning on data in which different types of misclassification errors incur different costs. Our contribution is a unifying framework that provides a comprehensive and insightful overview of cost-sensitive ensemble methods, pinpointing their differences and similarities via a fine-grained categorisation. Our framework contains natural extensions and generalisations of ideas across methods, be it AdaBoost, Bagging or Random Forest, and as a result it yields not only all methods known to date but also some not previously considered.
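To make the premise concrete: in cost-sensitive classification, predictions are chosen to minimise expected misclassification cost rather than error rate. The sketch below is a hypothetical illustration, not the paper's framework: ensemble members output class probabilities, the ensemble averages them, and the predicted class is the one minimising expected cost under a cost matrix `C` (where `C[i][j]` is the cost of predicting class `j` when the true class is `i`).

```python
# Minimal illustration of cost-sensitive ensemble prediction (hypothetical
# sketch, not the method from the paper): members output class probabilities,
# the ensemble averages them and predicts the minimum-expected-cost class.

def expected_costs(probs, cost_matrix):
    """Expected cost of predicting each class j: sum_i p(i) * C[i][j]."""
    n = len(cost_matrix)
    return [sum(probs[i] * cost_matrix[i][j] for i in range(n))
            for j in range(n)]

def cost_sensitive_predict(member_probs, cost_matrix):
    """Average the members' probability vectors, then pick the class
    with minimum expected misclassification cost."""
    n = len(member_probs[0])
    avg = [sum(p[k] for p in member_probs) / len(member_probs)
           for k in range(n)]
    costs = expected_costs(avg, cost_matrix)
    return min(range(n), key=lambda j: costs[j])

# False negatives (true class 1 predicted as 0) are five times costlier
# than false positives, so class 1 is chosen even though the averaged
# probability favours class 0.
C = [[0, 1],
     [5, 0]]
members = [[0.8, 0.2], [0.7, 0.3], [0.75, 0.25]]
print(cost_sensitive_predict(members, C))  # prints 1
```

With a symmetric cost matrix this reduces to ordinary probability-averaged voting; the asymmetry in `C` is what shifts the decision boundary.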


