Bayesian Neural Tree Models for Nonparametric Regression

09/02/2019
by Tanujit Chakraborty, et al.

Frequentist and Bayesian methods differ in many aspects, but share some basic optimal properties. In real-life classification and regression problems, there are situations in which a model based on one of the two paradigms is preferable according to some subjective criterion. Nonparametric classification and regression techniques, such as decision trees and neural networks, have frequentist (classification and regression trees (CART) and artificial neural networks) as well as Bayesian (Bayesian CART and Bayesian neural networks) approaches to learning from data. In this work, we present two hybrid models combining the Bayesian and frequentist versions of CART and neural networks, which we call the Bayesian neural tree (BNT) models. Both models exploit the architecture of decision trees and have fewer parameters to tune than advanced neural networks. Such models can simultaneously perform feature selection and prediction, are highly flexible, and generalize well in settings with a limited number of training observations. We study the consistency of the proposed models, and derive the optimal value of an important model parameter. We also provide illustrative examples using a wide variety of real-life regression data sets.
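For intuition, the sketch below shows one generic way a tree-plus-network hybrid regressor can be wired together with scikit-learn: a CART is fit first, its impurity-based feature importances are used to select inputs, and a small neural network is then trained on the selected features. This is only an illustrative sketch of the tree-then-network idea under assumed hyperparameters (tree depth, importance threshold, network size), not the authors' BNT construction.

# Illustrative sketch only: a generic tree-then-network hybrid regressor.
# This is NOT the BNT model from the paper; the importance threshold and
# all hyperparameters below are assumptions chosen for demonstration.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Synthetic regression data stands in for a real-life data set.
X, y = make_regression(n_samples=500, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Step 1: fit a CART and use its feature importances for feature selection.
tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X_tr, y_tr)
selected = tree.feature_importances_ > 0.01   # assumed threshold

# Step 2: train a small neural network on the tree-selected features only.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_tr[:, selected], y_tr)

print("selected features:", np.flatnonzero(selected))
print("test MSE:", mean_squared_error(y_te, net.predict(X_te[:, selected])))

In this toy pipeline the tree plays the role of a low-parameter feature selector and the network refines the fit, which mirrors the general motivation (few tuning parameters, joint feature selection and prediction) described in the abstract.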


Related research

04/29/2018  A Nonparametric Ensemble Binary Classifier and its Statistical Properties
02/21/2023  Variational Boosted Soft Trees
10/01/2018  Neural Regression Trees
08/30/2015  X-TREPAN: a multi class regression and adapted extraction of comprehensible decision tree in artificial neural networks
04/25/2016  Neural Random Forests
03/29/2021  One Network Fits All? Modular versus Monolithic Task Formulations in Neural Networks
11/10/2021  Classification of the Chess Endgame problem using Logistic Regression, Decision Trees, and Neural Networks
