Sliced Inverse Regression with Large Structural Dimensions
The central space of a joint distribution (X, Y) is the minimal subspace 𝒮 such that Y ⊥⊥ X | P_𝒮 X, where P_𝒮 is the projection onto 𝒮. Sliced inverse regression (SIR), one of the most popular methods for estimating the central space, often performs poorly when the structural dimension d = dim(𝒮) is large (e.g., d ≥ 5). In this paper, we demonstrate that the generalized signal-to-noise ratio (gSNR) tends to be extremely small for a general multiple-index model when d is large. We then determine the minimax rate for estimating the central space over a large class of high-dimensional distributions with a large structural dimension d (i.e., with no constant upper bound on d) in the low-gSNR regime. This result not only extends the existing minimax rate results for estimating the central space of distributions with fixed d to distributions with large d, but also clarifies that the degradation in SIR performance is caused by the decay of signal strength. The technical tools developed here may be of independent interest for studying other methods of estimating the central space.
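For readers unfamiliar with SIR, the following is a minimal sketch of the classical estimator (slice the response, average the standardized predictors within each slice, and take the top eigenvectors of the covariance of the slice means). It is not the estimator analyzed in the paper; the function name `sir_directions` and the parameters `n_slices` and `d` are illustrative choices.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=2):
    """Minimal sliced inverse regression (SIR) sketch.

    X : (n, p) predictor matrix; y : (n,) response.
    Returns a (p, d) matrix whose columns span the estimated central space.
    """
    n, p = X.shape

    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response into n_slices groups of roughly equal size
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Top-d eigenvectors of M, mapped back to the original X scale
    w, V = np.linalg.eigh(M)
    top = V[:, np.argsort(w)[::-1][:d]]
    return Sigma_inv_sqrt @ top
```

In this sketch, the eigenvalues of the slice-mean covariance M play the role of the signal strength; the paper's point is that for a general multiple-index model this signal (the gSNR) decays rapidly as the structural dimension d grows, which is what degrades SIR's performance.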