Untangling AdaBoost-based Cost-Sensitive Classification. Part II: Empirical Analysis

07/15/2015
by   Iago Landesa-Vázquez, et al.

Many approaches, each following a different strategy, have been proposed in the literature to endow AdaBoost with cost-sensitive properties. In the first part of this two-paper series, we presented these algorithms in a homogeneous notational framework, proposed a clustering scheme for them, and performed a thorough theoretical analysis of those approaches resting on a full theoretical foundation. To complete that analysis, the present paper focuses on an empirical study of all the previously presented algorithms over a wide range of heterogeneous classification problems. The results of our experiments, confirming the theoretical conclusions, reveal that the simplest approach, based only on cost-sensitive weight initialization, yields the best and soundest results, despite having been recurrently overlooked in the literature.
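To make the "cost-sensitive weight initialization" idea concrete, the sketch below shows a minimal discrete AdaBoost loop in which the only change from the standard algorithm is that the initial sample weights are set proportional to per-example misclassification costs instead of uniformly. The function name, the `costs` argument, and the use of decision stumps are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cost_sensitive_adaboost(X, y, costs, n_rounds=50):
    """Discrete AdaBoost (labels in {-1, +1}) with cost-sensitive
    weight initialization. `costs` is a hypothetical per-example
    misclassification-cost vector; everything after the initialization
    is the standard AdaBoost update."""
    # Cost-sensitive initialization: weights proportional to costs,
    # instead of the uniform 1/n used by standard AdaBoost.
    w = np.asarray(costs, dtype=float)
    w = w / w.sum()

    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()
        if err <= 0.0 or err >= 0.5:   # weak learner exhausted
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        # Usual exponential reweighting; only the starting point changed.
        w = w * np.exp(-alpha * y * pred)
        w = w / w.sum()
        stumps.append(stump)
        alphas.append(alpha)

    def predict(X_new):
        scores = sum(a * s.predict(X_new) for a, s in zip(alphas, stumps))
        return np.sign(scores)

    return predict
```

In practice one could, for instance, set `costs` to a class-dependent value (e.g. a higher cost for positive examples) so that the boosting process starts biased toward the costly class while the rest of the algorithm remains untouched.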
