Partial Conditioning for Inferential Models

01/11/2023
by Jiasen Yang, et al.

Inferential models have been proposed as a framework for valid and efficient prior-free probabilistic inference. As the theory has gradually gained popularity, it continues to be developed for practically challenging problems. This paper considers the many-normal-means problem with the means constrained to lie in a neighborhood of each other. A new method, called partial conditioning, is proposed to generate valid and efficient marginal inference about the individual means. It is shown that the method outperforms a fiducial counterpart in terms of validity and a conservative counterpart in terms of efficiency.
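To make the setting concrete, the sketch below simulates a many-normal-means setup in which the means are constrained to lie within a small neighborhood of a common center; the exact form of the constraint, the variable names, and the radius delta are assumptions made only for illustration. It then evaluates the baseline inferential-model marginal plausibility for a single mean, pl_x(theta) = 1 - |2*Phi(x - theta) - 1|, which ignores the constraint. The paper's partial-conditioning construction is not reproduced here; this is just an unconstrained point of reference.

```python
import numpy as np
from scipy.stats import norm

# Assumed formalization (for illustration only):
#   X_i ~ N(theta_i, 1), i = 1..n, with all theta_i within delta of a
#   common center, i.e. the means are "in a neighborhood of each other".
rng = np.random.default_rng(0)
n, delta, center = 10, 0.5, 2.0
theta = center + rng.uniform(-delta, delta, size=n)   # constrained means
x = rng.normal(theta, 1.0)                            # observed data

def baseline_plausibility(x_i, theta_grid):
    """Baseline IM marginal plausibility for one normal mean,
    ignoring the neighborhood constraint."""
    return 1.0 - np.abs(2.0 * norm.cdf(x_i - theta_grid) - 1.0)

grid = np.linspace(center - 4.0, center + 4.0, 801)
pl = baseline_plausibility(x[0], grid)

# The 95% plausibility region {theta : pl(theta) > 0.05} coincides with
# the usual x_0 +/- 1.96 interval in this unconstrained baseline.
lo, hi = grid[pl > 0.05][[0, -1]]
print(f"x_0 = {x[0]:.3f}, 95% plausibility interval ~ ({lo:.3f}, {hi:.3f})")
```

Thresholding this baseline plausibility simply recovers the familiar x ± 1.96 interval for each mean; as described in the abstract, partial conditioning aims to exploit the neighborhood constraint to give sharper marginal inference while remaining valid.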

