Less Is More: A Comprehensive Framework for the Number of Components of Ensemble Classifiers

09/09/2017
by Hamed R. Bonab, et al.

The number of component classifiers chosen for an ensemble has a great impact on its prediction ability. In this paper, we use a geometric framework for determining the ensemble size a priori, applicable to most existing batch and online ensemble classifiers. Only a limited number of studies consider the ensemble size under Majority Voting (MV) and Weighted Majority Voting (WMV), and almost all of them are designed for batch mode, barely addressing online environments. The dimensions of big data and resource limitations in terms of time and memory make determining the ensemble size crucial, especially in online environments. Our framework proves, for the MV aggregation rule, that the more strong components we add to the ensemble, the more accurate its predictions become. For the WMV aggregation rule, on the other hand, we prove the existence of an ideal number of components equal to the number of class labels, under the premise that the components are completely independent of each other and sufficiently strong. While giving an exact definition of a strong and independent classifier in the context of an ensemble is a challenging task, our proposed geometric framework provides a theoretical explanation of diversity and its impact on prediction accuracy. We conduct an experimental evaluation with two different scenarios to show the practical value of our theorems.
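To make the two aggregation rules discussed above concrete, here is a minimal sketch (not the paper's implementation) of plain Majority Voting versus Weighted Majority Voting over the label predictions of an ensemble's component classifiers; the vote and weight values are illustrative assumptions:

```python
import numpy as np

def majority_vote(votes):
    """MV: each component casts one unweighted vote; the most-voted label wins."""
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]

def weighted_majority_vote(votes, weights):
    """WMV: each component's vote counts proportionally to its weight
    (e.g. an accuracy-based weight); the label with the largest total wins."""
    scores = {}
    for v, w in zip(votes, weights):
        scores[v] = scores.get(v, 0.0) + w
    return max(scores, key=scores.get)

# Illustrative example: three components predict label 1, but the single
# strong component (weight 0.9) predicts label 0, so MV and WMV disagree.
votes = [0, 1, 1, 1]
weights = [0.9, 0.2, 0.3, 0.3]
print(majority_vote(np.array(votes)))          # 1 (three of four vote 1)
print(weighted_majority_vote(votes, weights))  # 0 (weighted mass 0.9 vs 0.8)
```

The example shows why the two rules can behave differently as components are added: under MV every additional component shifts the outcome equally, whereas under WMV a few strong, independent components can dominate many weak ones.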


Related research

06/16/2021 - A Dataset-Level Geometric Framework for Ensemble Classifiers
Ensemble classifiers have been investigated by many in the artificial in...

08/27/2023 - Leveraging Linear Independence of Component Classifiers: Optimizing Size and Prediction Accuracy for Online Ensembles
Ensembles, which employ a set of classifiers to enhance classification a...

07/09/2021 - Specialists Outperform Generalists in Ensemble Classification
Consider an ensemble of k individual classifiers whose accuracies are kn...

02/27/2019 - Unifying Ensemble Methods for Q-learning via Social Choice Theory
Ensemble methods have been widely applied in Reinforcement Learning (RL)...

04/23/2021 - Selecting a number of voters for a voting ensemble
For a voting ensemble that selects an odd-sized subset of the ensemble c...

05/14/2019 - Resource-aware Elastic Swap Random Forest for Evolving Data Streams
Continual learning based on data stream mining deals with ubiquitous sou...

04/08/2019 - Optimizing Majority Voting Based Systems Under a Resource Constraint for Multiclass Problems
Ensemble-based approaches are very effective in various fields in raisin...
