IPAD: Stable Interpretable Forecasting with Knockoffs Inference

09/06/2018
by Yingying Fan, et al.

Interpretability and stability are two features that are highly desirable in many contemporary big data applications arising in economics and finance. While many existing forecasting approaches enjoy the former to some extent, the latter, in the sense of controlling the fraction of wrongly discovered features (which can greatly enhance interpretability), remains largely underdeveloped in econometric settings. To this end, in this paper we exploit the general framework of model-X knockoffs introduced recently in Candès, Fan, Janson and Lv (2018), which is unconventional for reproducible large-scale inference in that it is completely free of p-values for significance testing, and suggest a new method of intertwined probabilistic factors decoupling (IPAD) for stable interpretable forecasting with knockoffs inference in high-dimensional models. The recipe of the method is to construct the knockoff variables by assuming a latent factor model, widely used in economics and finance, for the association structure of the covariates. Our method and work are distinct from the existing literature in several respects: we estimate the covariate distribution from data instead of assuming it is known when constructing the knockoff variables; our procedure does not require any sample splitting; we provide theoretical justification for asymptotic false discovery rate control; and we also establish the theory for the power analysis. Several simulation examples and a real data analysis further demonstrate that the newly suggested method has appealing finite-sample performance with the desired interpretability and stability compared with some popularly used forecasting methods.
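To make the factor-model-based construction concrete, below is a minimal sketch (not the authors' implementation) of the general idea: estimate the common component of the covariates by a rank-r PCA fit, resample the idiosyncratic part to form knockoff copies, and then run a knockoff+ filter with a Lasso coefficient-difference statistic. It assumes Gaussian idiosyncratic errors and a known number of factors r; the function names construct_ipad_knockoffs and knockoff_filter, and the use of numpy and scikit-learn, are illustrative choices only.

```python
import numpy as np
from sklearn.linear_model import LassoCV

def construct_ipad_knockoffs(X, r, rng):
    """Knockoff copies of X under an approximate factor model (illustrative).

    Keeps the estimated common component from a rank-r SVD of X and replaces
    the idiosyncratic part with fresh Gaussian noise whose column-wise
    standard deviations are estimated from the residuals.
    """
    n, p = X.shape
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    C_hat = U[:, :r] * s[:r] @ Vt[:r, :]      # estimated common component
    E_hat = X - C_hat                          # estimated idiosyncratic errors
    sigma_hat = E_hat.std(axis=0, ddof=1)      # Gaussian assumption for resampling
    return C_hat + rng.standard_normal((n, p)) * sigma_hat

def knockoff_filter(X, X_knockoff, y, q=0.2):
    """Lasso coefficient-difference statistics with the knockoff+ threshold."""
    p = X.shape[1]
    aug = np.hstack([X, X_knockoff])
    lasso = LassoCV(cv=5).fit(aug, y)
    W = np.abs(lasso.coef_[:p]) - np.abs(lasso.coef_[p:])
    # Knockoff+ threshold: smallest t with estimated FDP <= q.
    threshold = np.inf
    for t in np.sort(np.abs(W[W != 0])):
        fdp_est = (1 + np.sum(W <= -t)) / max(np.sum(W >= t), 1)
        if fdp_est <= q:
            threshold = t
            break
    return np.where(W >= threshold)[0]

# Toy usage: n = 200 observations, p = 50 covariates, r = 3 latent factors.
rng = np.random.default_rng(0)
n, p, r = 200, 50, 3
F = rng.standard_normal((n, r))
Lam = rng.standard_normal((p, r))
X = F @ Lam.T + rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + rng.standard_normal(n)
X_ko = construct_ipad_knockoffs(X, r, rng)
print("selected features:", knockoff_filter(X, X_ko, y, q=0.2))
```

The key design point this sketch tries to convey is that the covariate distribution is estimated from the data themselves (via the fitted factor structure and residuals) rather than assumed known, and no sample splitting is involved; the threshold is the standard knockoff+ rule controlling the false discovery rate at level q.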
