Ryan R. Martin, Chuanhai Liu
Posterior probabilistic statistical inference without priors is an important but so far elusive goal. Fisher's fiducial inference, Dempster-Shafer theory of belief functions, and Bayesian inference with default priors are attempts to achieve this goal but, to date, none has given a completely satisfactory picture. This article presents a new framework for probabilistic inference, based on inferential models (IMs), which not only provides data-dependent probabilistic measures of uncertainty about the unknown parameter, but also does so with an automatic long-run frequency-calibration property. The key to this new approach is the identification of an unobservable auxiliary variable associated with observable data and unknown parameter, and the prediction of this auxiliary variable with a random set before conditioning on data. Here we present a three-step IM construction, and prove a frequency-calibration property of the IM's belief function under mild conditions. A corresponding optimality theory is developed, which helps to resolve the nonuniqueness issue. Several examples are presented to illustrate this new approach.
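The three-step construction described above (associate data and parameter with an auxiliary variable, predict that variable with a random set, then condition on the data) can be illustrated on a toy problem. The Python sketch below assumes a single observation from a N(theta, 1) model with the association x = theta + u and a symmetric predictive random set for u; these particular choices are illustrative assumptions and are not spelled out in this abstract.

```python
# A minimal sketch of a three-step IM construction for a toy problem:
# a single observation x from N(theta, 1). The association and the
# predictive random set used here are illustrative assumptions, not
# details given in the abstract.
from scipy.stats import norm

def plausibility(theta, x):
    """Plausibility of the singleton assertion {theta} given data x.

    A-step (association): x = theta + u, with u ~ N(0, 1) the
        unobservable auxiliary variable.
    P-step (prediction): predict the unobserved u with the symmetric
        random set S = {u' : |u'| <= |U|}, U ~ N(0, 1).
    C-step (combination): theta is plausible iff x - theta is covered
        by S, so pl_x(theta) = P(|U| >= |x - theta|).
    """
    return 2 * (1 - norm.cdf(abs(x - theta)))

x = 1.5
print(plausibility(0.0, x))  # ~0.13: theta = 0 is not implausible
print(plausibility(4.0, x))  # ~0.01: theta = 4 is implausible at the 5% level
```

Under these assumptions, the set {theta : pl_x(theta) >= 0.05} works out to x +/- 1.96, a 95% plausibility interval whose long-run coverage illustrates the frequency-calibration property claimed in the abstract.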