Controlling the False Discovery Rate in Structural Sparsity: Split Knockoffs
Controlling the false discovery rate (FDR) in variable selection is critical for reproducible discoveries and has been studied extensively in sparse linear models. In many scenarios, however, the sparsity constraint is not imposed directly on the parameters but on a linear transformation of the parameters to be estimated. Examples include total variation, wavelet transforms, the fused LASSO, and trend filtering. In this paper, we propose a data-adaptive FDR control method for this structural sparsity setting, the Split Knockoff method. The proposed scheme relaxes the linear subspace constraint to a neighborhood of it, a device often known as variable splitting in optimization, and this relaxation brings new statistical benefits. It yields orthogonal design and split knockoff matrices, which exhibit the desired FDR control empirically in structural sparsity discovery and improve the power to select strong features by enhancing the incoherence condition for model selection consistency. However, the split knockoff statistics fail to satisfy exchangeability, a property crucial to the provable FDR control of the classical knockoff method. To address this challenge, we introduce an almost-supermartingale construction under a perturbation of exchangeability, which enables us to establish FDR control up to an arbitrarily small inflation that vanishes as the relaxed neighborhood enlarges. Simulation experiments show the effectiveness of split knockoffs, with possible improvements over knockoffs in both FDR control and power. An application to an Alzheimer's disease study with MRI data demonstrates that the Split Knockoff method can disclose important lesion regions in the brain associated with the disease, as well as connections between neighboring regions of high contrast variation during disease progression.
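As a minimal sketch of the setting described above (the notation and scaling here are assumptions for illustration, not taken verbatim from the paper): with design matrix $X$, response $y$, and a linear transform $D$ (e.g., a graph difference operator for total variation), sparsity is sought on $\gamma = D\beta$ rather than on $\beta$ itself, and variable splitting relaxes the exact constraint $D\beta = \gamma$ to a quadratic penalty whose strength is governed by $\nu > 0$:

% Structural sparsity: the l1 penalty acts on gamma = D * beta, not on beta.
% Variable splitting replaces the hard constraint D*beta = gamma by an
% l2 coupling term; a larger nu enlarges the relaxed neighborhood.
\min_{\beta \in \mathbb{R}^p,\; \gamma \in \mathbb{R}^m}
  \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
  \;+\; \frac{1}{2\nu}\,\lVert D\beta - \gamma \rVert_2^2
  \;+\; \lambda\,\lVert \gamma \rVert_1

In this hedged formulation, the FDR inflation mentioned above shrinks as $\nu$ grows, at the cost of a looser coupling between $\beta$ and the structural variables $\gamma$.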