A New Technique for Combining Multiple Classifiers using The Dempster-Shafer Theory of Evidence

06/30/2011
by A. Al-Ani, et al.

This paper presents a new classifier combination technique based on the Dempster-Shafer theory of evidence, a powerful method for combining measures of evidence from different classifiers. However, since each of the available methods for estimating the evidence of classifiers has its own limitations, we propose a new implementation that adapts to the training data so that the overall mean square error is minimized. The proposed technique is shown to outperform most available classifier combination methods when tested on three different classification problems.
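At the core of any Dempster-Shafer combination scheme is Dempster's rule of combination, which fuses the mass functions produced for each classifier and renormalizes away conflicting evidence. The sketch below illustrates the standard rule (not the paper's adaptive, MSE-minimizing variant); the mass values and class labels are illustrative assumptions only.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule of combination.

    Each mass function is a dict mapping a frozenset of class labels
    (a hypothesis) to its assigned mass; masses sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    norm = 1.0 - conflict  # renormalize over non-conflicting mass
    return {h: m / norm for h, m in combined.items()}

# Hypothetical evidence from two classifiers over classes {A, B};
# mass on the full frame {A, B} represents a classifier's uncertainty.
m1 = {frozenset({"A"}): 0.6, frozenset({"A", "B"}): 0.4}
m2 = {frozenset({"A"}): 0.5, frozenset({"B"}): 0.3, frozenset({"A", "B"}): 0.2}
combined = dempster_combine(m1, m2)
```

With these example masses the combined belief concentrates on class A, since both sources lean toward it and their conflict (mass 0.18) is discounted by renormalization.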


