Neural Attention Forests: Transformer-Based Forest Improvement

04/12/2023
by   Andrei V. Konstantinov, et al.

A new approach called NAF (Neural Attention Forest) is proposed for solving regression and classification tasks on tabular data. The main idea behind the proposed NAF model is to introduce the attention mechanism into the random forest by assigning attention weights, computed by neural networks of a specific form, to the data in the leaves of decision trees and to the random forest itself, in the framework of Nadaraya-Watson kernel regression. In contrast to available models such as the attention-based random forest, the attention weights and the Nadaraya-Watson regression are represented as neural networks whose weights can be regarded as trainable parameters. The first part of the neural networks, with weights shared across all trees, computes the attention weights of the data in the leaves. The second part aggregates the outputs of the tree networks and aims to minimize the difference between the random forest prediction and the true target value from the training set. The neural network is trained in an end-to-end manner. The combination of the random forest and the neural networks implementing the attention mechanism forms a transformer that enhances the forest predictions. Numerical experiments with real datasets illustrate the proposed method. The code implementing the approach is publicly available.
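The core idea of weighting leaf data with Nadaraya-Watson kernel regression and averaging over trees can be sketched as follows. This is a simplified illustration, not the authors' implementation: NAF's trainable neural-network attention is replaced here by a fixed Gaussian kernel, and the function name `naf_predict` and the kernel temperature `tau` are assumptions of this sketch, built on scikit-learn's `RandomForestRegressor`.

```python
# Sketch: Nadaraya-Watson attention over random-forest leaves.
# Trainable attention networks from NAF are approximated by a fixed
# Gaussian kernel over distances within each leaf.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def naf_predict(forest, X_train, y_train, x, tau=1.0):
    """Predict for a single point x: in each tree, attention-weight the
    training examples that fall in the same leaf as x, then average the
    per-tree Nadaraya-Watson estimates over the forest."""
    x = x.reshape(1, -1)
    tree_preds = []
    for tree in forest.estimators_:
        leaf_of_x = tree.apply(x)[0]              # leaf index x falls into
        leaves_train = tree.apply(X_train)        # leaf index per training point
        mask = leaves_train == leaf_of_x          # training points sharing the leaf
        if not mask.any():
            continue
        d = np.sum((X_train[mask] - x) ** 2, axis=1)
        w = np.exp(-d / tau)                      # Gaussian kernel attention scores
        w /= w.sum()                              # normalize (softmax-like)
        tree_preds.append(np.dot(w, y_train[mask]))
    return float(np.mean(tree_preds))

# Toy usage on synthetic regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X[:, 0] + 0.1 * rng.normal(size=200)
rf = RandomForestRegressor(n_estimators=20, random_state=0).fit(X, y)
print(naf_predict(rf, X, y, X[0]))
```

In the full NAF model, the kernel above is replaced by a first network (shared across trees) that produces the leaf-level attention weights, and a second network that aggregates the per-tree outputs, with both trained end-to-end against the true targets.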
