Moderate-Dimensional Inferences on Quadratic Functionals in Ordinary Least Squares

10/02/2018 ∙ by Xiao Guo, et al.

Statistical inference on quadratic functionals of the linear regression parameter has found wide applications, including signal detection, one-/two-sample global testing, inference on the fraction of variance explained, and genetic co-heritability. Conventional theory based on the ordinary least squares estimator works perfectly in the fixed-dimensional regime, but fails when the parameter dimension p_n grows proportionally to the sample size n; in some cases, its performance is unsatisfactory even when n ≥ 5p_n. The main contribution of this paper is to illustrate that the signal-to-noise ratio (SNR) plays a crucial role in moderate-dimensional inference, where lim_{n→∞} p_n/n = τ ∈ (0, 1). In the case of weak SNR, as often occurs in the moderate-dimensional regime, both the bias and the variance of the traditional inference procedures need to be corrected. The amount of correction depends mainly on the SNR and τ, and can be fairly large as τ → 1. However, the classical fixed-dimensional results continue to hold if and only if the SNR is large enough, say when p_n diverges but more slowly than n. Our general theory holds, in particular, without Gaussian design/error or structural assumptions on the parameter, and applies to a broad class of quadratic functionals, covering all the aforementioned applications. The mathematical arguments are based on random matrix theory and the leave-one-out method. Extensive numerical results demonstrate the satisfactory performance of the proposed methodology even when p_n ≥ 0.9n in some extreme cases.
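The bias the abstract refers to is easy to see in simulation. Below is a minimal illustrative sketch (not the paper's actual procedure): for the quadratic functional ‖β‖², the naive plug-in estimator ‖β̂‖² based on OLS is biased upward by σ² tr((XᵀX)⁻¹), a term of order p/(n − p) that blows up as p_n/n → τ → 1. Subtracting an estimate of that term is the classical correction; all parameter choices here (n = 200, p = 100, unit-SNR β) are assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma2 = 200, 100, 1.0          # moderate-dimensional: p/n = 0.5
beta = rng.normal(size=p) / np.sqrt(p)  # scaled so that ||beta||^2 ~ 1 (SNR ~ 1)
true_q = beta @ beta                    # the quadratic functional ||beta||^2

naive_err, corrected_err = [], []
for _ in range(200):
    X = rng.normal(size=(n, p))
    y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)

    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y        # OLS estimator
    resid = y - X @ beta_hat
    sigma2_hat = resid @ resid / (n - p)  # unbiased noise-variance estimate

    naive = beta_hat @ beta_hat                             # plug-in, biased up
    corrected = naive - sigma2_hat * np.trace(XtX_inv)      # bias-corrected
    naive_err.append(abs(naive - true_q))
    corrected_err.append(abs(corrected - true_q))

print("mean |error|, naive:    ", np.mean(naive_err))
print("mean |error|, corrected:", np.mean(corrected_err))
```

With p/n = 0.5 the plug-in bias σ² tr((XᵀX)⁻¹) is already about the same size as ‖β‖² itself, so the naive estimator is far off while the corrected one tracks the truth; pushing p toward 0.9n makes the gap dramatically larger, matching the τ → 1 behavior described in the abstract.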





