The Success of AdaBoost and Its Application in Portfolio Management

03/23/2021
by Yijian Chuan, et al.

We develop a novel approach to explaining why AdaBoost is a successful classifier. By introducing a measure of the influence of the noise points (ION) in the training data for the binary classification problem, we prove that there is a strong connection between the ION and the test error. We further show that the ION of AdaBoost decreases as the iteration number or the complexity of the base learners increases. We confirm that in some complicated settings it is impossible to obtain a consistent classifier unless deep trees are used as the base learners of AdaBoost. We apply AdaBoost to portfolio management via empirical studies in the Chinese market, which corroborate our theoretical propositions.

