Asymptotic conditional inference via a Steining of selection probabilities

11/07/2018
by Snigdha Panigrahi, et al.

Many scientific studies are modeled as hierarchical procedures in which the starting point of the data analysis is a pilot sample used to determine the parameters of interest. As more data become available, the scientist is tasked with conducting a meta-analysis on the augmented data set, combining the explorations from the pilot stage with a confirmatory study. Casting these two-stage procedures into a conditional framework, inference is based on a carved likelihood. Such a likelihood is obtained by conditioning the law of the augmented data (from both stages) on the selection carried out on the first-stage data. In fact, conditional inference in hierarchically modeled investigations, or equivalently in settings where some samples are reserved for inference, is asymptotically equivalent to a Gaussian randomization scheme. Observing that the probabilistic behavior of the selection event under Gaussian perturbation is very different from that under the heavy-tailed randomizations of Tian and Taylor (2018), the current work validates carved inference in a model-free asymptotic regime for a broad class of parameters. Our bounds provide significant improvements over existing results on the rate of weak convergence of pivots under Gaussian randomization schemes.
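As a minimal illustration of the conditioning described above (the notation here, including the split sizes and the selection rule, is our own reading of the abstract and not necessarily the paper's exact formulation), a carved likelihood for a parameter \theta can be sketched as

\[
\ell_{\mathrm{carved}}(\theta) \;\propto\;
\frac{ f_\theta\!\left(D_{1:n}\right)\,
       \mathbf{1}\!\left\{ \widehat{S}\!\left(D_{1:n_1}\right) = s \right\} }
     { \mathbb{P}_\theta\!\left\{ \widehat{S}\!\left(D_{1:n_1}\right) = s \right\} },
\]

where D_{1:n} denotes the augmented data, D_{1:n_1} the pilot (first-stage) subsample, \widehat{S} the selection rule applied to the pilot data, and s the selection actually observed. Inference then proceeds from the law of the full data conditioned on the event that the pilot-stage analysis produced the observed selection.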
