Optimization of Oblivious Decision Tree Ensembles Evaluation for CPU

11/01/2022
by   Alexey Mironov, et al.

CatBoost is a popular machine learning library. CatBoost models are based on oblivious decision trees, making training and evaluation rapid. CatBoost has many applications, some of which require low-latency and high-throughput evaluation. This paper investigates the possibilities for improving CatBoost's performance in single-core CPU computations. We explore the new features provided by the AVX instruction sets to optimize evaluation. We increase performance by 20-40% using AVX2 instructions without quality impact. We also introduce a new trade-off between speed and quality: using float16 for leaf values and AVX-512 instructions, we achieve a 50-70% speedup.
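
To make the evaluation structure concrete, here is a minimal sketch of how an oblivious decision tree can be evaluated on a CPU. In an oblivious tree every node at a given depth shares the same feature and threshold, so a document's leaf is addressed by concatenating the per-depth comparison bits into an index. This is an illustration only, not CatBoost's actual code; the ObliviousTree layout and function names are assumptions.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical layout of one oblivious tree of depth D: every node at
// depth d uses the same feature and threshold, so a document's leaf is
// addressed by a D-bit index built from the D comparison results.
struct ObliviousTree {
    std::vector<int>   feature;    // feature[d]   : feature used at depth d
    std::vector<float> threshold;  // threshold[d] : split border at depth d
    std::vector<float> leaf;       // 2^D leaf values
};

float EvaluateTree(const ObliviousTree& tree, const float* doc_features) {
    std::uint32_t leaf_index = 0;
    for (std::size_t d = 0; d < tree.feature.size(); ++d) {
        // Each comparison contributes one bit of the leaf index.
        const std::uint32_t bit = doc_features[tree.feature[d]] > tree.threshold[d];
        leaf_index |= bit << d;
    }
    return tree.leaf[leaf_index];
}

float EvaluateEnsemble(const std::vector<ObliviousTree>& trees,
                       const float* doc_features) {
    float sum = 0.0f;
    for (const auto& tree : trees) {
        sum += EvaluateTree(tree, doc_features);
    }
    return sum;
}
```

The bit-building loop vectorizes naturally, which is where the AVX2 gains discussed in the abstract come from. The sketch below (again an assumption about the general technique, not the paper's kernel) compares eight documents against one border per depth and gathers their leaf values with _mm256_i32gather_ps.

```cpp
#include <immintrin.h>

// Evaluate one oblivious tree of the given depth for 8 documents at once.
// features[d] points to the 8 values of the feature used at depth d for
// this block of documents; leaves holds the 2^depth leaf values.
void EvaluateTreeAvx2(const float* const* features, const float* thresholds,
                      int depth, const float* leaves, float* out /* 8 results */) {
    __m256i leaf_index = _mm256_setzero_si256();
    for (int d = 0; d < depth; ++d) {
        __m256 values = _mm256_loadu_ps(features[d]);
        __m256 border = _mm256_set1_ps(thresholds[d]);
        // All-ones lanes where value > border, zero lanes otherwise.
        __m256i gt = _mm256_castps_si256(_mm256_cmp_ps(values, border, _CMP_GT_OQ));
        // Turn each lane's mask into the d-th bit of that document's leaf index.
        __m256i bit = _mm256_and_si256(gt, _mm256_set1_epi32(1 << d));
        leaf_index = _mm256_or_si256(leaf_index, bit);
    }
    // Gather the 8 leaf values addressed by the per-document indices.
    __m256 result = _mm256_i32gather_ps(leaves, leaf_index, /*scale=*/4);
    _mm256_storeu_ps(out, result);
}
```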

