Joints in Random Forests

06/25/2020
by Alvaro H. C. Correia, et al.

Decision Trees (DTs) and Random Forests (RFs) are powerful discriminative learners and tools of central importance to the everyday machine learning practitioner and data scientist. Due to their discriminative nature, however, they lack principled methods to process inputs with missing features or to detect outliers, which requires pairing them with imputation techniques or a separate generative model. In this paper, we demonstrate that DTs and RFs can naturally be interpreted as generative models, by drawing a connection to Probabilistic Circuits, a prominent class of tractable probabilistic models. This reinterpretation equips them with a full joint distribution over the feature space and leads to Generative Decision Trees (GeDTs) and Generative Forests (GeFs), a family of novel hybrid generative-discriminative models. This family of models retains the overall characteristics of DTs and RFs while additionally being able to handle missing features by means of marginalisation. Under certain assumptions frequently made for Bayes consistency results, we show that the consistency of GeDTs and GeFs extends to any pattern of input features missing at random. Empirically, we show that our models often outperform common routines for treating missing data, such as K-nearest neighbour imputation, and moreover, that our models can naturally detect outliers by monitoring the marginal probability of the input features.
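To make the marginalisation idea concrete, below is a minimal Python sketch (not the authors' implementation) of a decision tree reinterpreted as a probabilistic circuit. It assumes fully factorised Gaussian leaf distributions, which is one possible choice; the key point is that a missing feature is handled simply by dropping its factor from the leaf likelihood, and the resulting evidence term doubles as an outlier score.

```python
import numpy as np

# Hypothetical sketch of a "generative decision tree": each leaf stores
# (i) its weight (fraction of training data reaching it), (ii) the class
# distribution p(y | leaf), and (iii) per-feature Gaussian densities.
# A leaf then contributes
#     weight * prod_{i observed} p(x_i | leaf) * p(y | leaf),
# so missing features are marginalised out by skipping their factors.

class Leaf:
    def __init__(self, weight, class_probs, means, stds):
        self.weight = weight                       # p(leaf)
        self.class_probs = np.asarray(class_probs) # p(y | leaf)
        self.means = np.asarray(means)             # per-feature Gaussian means
        self.stds = np.asarray(stds)               # per-feature Gaussian std devs

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def predict_proba(leaves, x):
    """Class posterior for an input x that may contain np.nan (missing values)."""
    joint = np.zeros_like(leaves[0].class_probs, dtype=float)
    for leaf in leaves:
        lik = leaf.weight
        for i, xi in enumerate(x):
            if not np.isnan(xi):          # missing features: factor dropped (marginalised)
                lik *= gaussian_pdf(xi, leaf.means[i], leaf.stds[i])
        joint += lik * leaf.class_probs
    evidence = joint.sum()                # p(x_obs): low values flag potential outliers
    return joint / evidence, evidence
```

With such leaves, an input with all features missing falls back to the prior class distribution, and thresholding the returned evidence term gives the kind of marginal-probability outlier check described above.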

