Trade Selection with Supervised Learning and OCA

12/09/2018
by David Saltiel, et al.

In recent years, state-of-the-art methods for supervised learning have increasingly relied on gradient boosting, with efficient mainstream implementations such as XGBoost and LightGBM. A key ingredient of proficient models is Feature Selection (FS): choosing the subset of features that carries real predictive value. When facing hundreds of candidate features, selecting the best ones becomes critical. While filter and wrapper methods have reached some maturity, embedded methods, hybrids that combine feature filtering and wrapping, are needed to find the best feature set. In this work, we tackle the problem of identifying, through machine learning, the a priori best trades of an algorithmic strategy. We derive a new method based on coordinate ascent optimization over block variables. We compare it to Recursive Feature Elimination (RFE) and Binary Coordinate Ascent (BCA), and show on a real-life example its capacity to select good trades a priori. Not only does this method outperform the initial trading strategy by avoiding losing trades, it also surpasses the other methods, achieving the smallest feature set and the highest score at the same time. The interest of this method goes beyond this simple trade-classification problem, as it is a very general way to determine an optimal feature set, exploiting information about feature relationships together with coordinate ascent optimization.
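The idea of coordinate ascent over blocks of features can be illustrated with a minimal sketch. Everything below is an assumption for illustration only, not the authors' implementation: the block grouping, the `score` callable (which in practice would be a cross-validated model score, e.g. from XGBoost or LightGBM), and the toy scoring function are all hypothetical.

```python
import itertools

def block_coordinate_ascent(blocks, score, max_rounds=10):
    """Greedy block coordinate ascent over feature subsets.

    blocks: list of lists of feature indices, grouped by block.
    score:  callable mapping a frozenset of selected indices to a float
            (in practice, a cross-validated model score on those features).
    Within each block, every on/off combination of that block's features
    is tried while all other blocks are held fixed; the best is kept.
    """
    # Start from the full feature set.
    selected = set(itertools.chain.from_iterable(blocks))
    best = score(frozenset(selected))
    for _ in range(max_rounds):
        improved = False
        for block in blocks:
            others = selected - set(block)
            # Enumerate all subsets of this block (blocks assumed small).
            for r in range(len(block) + 1):
                for combo in itertools.combinations(block, r):
                    cand = others | set(combo)
                    s = score(frozenset(cand))
                    if s > best:
                        best, selected, improved = s, cand, True
        if not improved:
            break  # a full pass over all blocks brought no gain
    return selected, best

# Toy example: features 0, 1, 4 are informative, the rest are noise.
relevant = {0, 1, 4}
def toy_score(feats):
    return sum(1 for f in feats if f in relevant) \
        - 0.5 * sum(1 for f in feats if f not in relevant)

blocks = [[0, 1, 2], [3, 4, 5]]
sel, s = block_coordinate_ascent(blocks, toy_score)
print(sel)  # {0, 1, 4}
```

Searching block by block rather than feature by feature lets the optimizer exploit known relationships between features (correlated features land in the same block), which is the structural information the abstract refers to.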


