FACT: High-Dimensional Random Forests Inference

07/04/2022
by Chien-Ming Chi, et al.

Random forests are among the most widely used machine learning methods of the past decade, thanks to their outstanding empirical performance. Yet, because of their black-box nature, results produced by random forests can be hard to interpret in many big data applications. Quantifying the usefulness of individual features in random forests learning can greatly enhance interpretability. Existing studies have shown that some popularly used feature importance measures for random forests suffer from bias, and comprehensive size and power analyses are lacking for most existing methods. In this paper, we approach the problem via hypothesis testing and suggest a framework, the self-normalized feature-residual correlation test (FACT), for evaluating the significance of a given feature in the random forests model with a bias-resistance property, where the null hypothesis is that the feature is conditionally independent of the response given all other features. This endeavor in random forests inference is empowered by recent developments on high-dimensional random forests consistency. The vanilla version of the FACT test can suffer from bias in the presence of feature dependency; we exploit the techniques of imbalancing and conditioning for bias correction, and further incorporate the ensemble idea into the FACT statistic through feature transformations for enhanced power. Under a fairly general high-dimensional nonparametric model setting with dependent features, we formally establish through nonasymptotic analyses that FACT provides theoretically justified random forests feature p-values and enjoys appealing power. The theoretical results and finite-sample advantages of the newly suggested method are illustrated with several simulation examples and an economic forecasting application related to COVID-19.
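To make the core idea concrete, the following is a minimal sketch of a vanilla feature-residual correlation test: fit a random forest on all features except the candidate, correlate the out-of-bag residuals with the candidate feature, and compare a self-normalized statistic against a standard normal reference. This is an illustrative simplification, not the paper's FACT procedure — the function name and the normal approximation are assumptions, and the bias corrections (imbalancing, conditioning) and ensemble feature transformations described above are omitted.

```python
# Illustrative sketch only; NOT the paper's exact FACT implementation.
# Assumes a normal reference for the self-normalized statistic and omits
# the imbalancing/conditioning bias corrections.
import numpy as np
from scipy import stats
from sklearn.ensemble import RandomForestRegressor

def feature_residual_corr_test(X, y, j, seed=0):
    """Test H0: feature j is conditionally independent of y given the rest.

    Fits a random forest on all features except j, correlates the
    out-of-bag residuals with feature j, and returns a self-normalized
    statistic with a heuristic two-sided normal p-value.
    """
    X_rest = np.delete(X, j, axis=1)
    rf = RandomForestRegressor(n_estimators=200, oob_score=True,
                               random_state=seed)
    rf.fit(X_rest, y)
    resid = y - rf.oob_prediction_          # out-of-bag residuals
    r = np.corrcoef(resid, X[:, j])[0, 1]   # feature-residual correlation
    t = np.sqrt(len(y)) * r                 # self-normalized statistic
    p = 2 * stats.norm.sf(abs(t))           # two-sided normal p-value
    return t, p

# Synthetic check: feature 0 drives the response, feature 4 is null.
rng = np.random.default_rng(0)
n = 400
X = rng.standard_normal((n, 5))
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + 0.5 * rng.standard_normal(n)
t0, p0 = feature_residual_corr_test(X, y, 0)   # relevant feature: small p
t4, p4 = feature_residual_corr_test(X, y, 4)   # null feature: large p
```

As the abstract notes, this vanilla statistic can be biased when features are dependent (the residuals then pick up signal from correlated covariates), which is exactly what the imbalancing and conditioning corrections in FACT are designed to address.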
