# Bayesian predictive inference without a prior

Let (X_n : n ≥ 1) be a sequence of random observations. Let σ_n(·) = P(X_{n+1} ∈ · | X_1, …, X_n) be the n-th predictive distribution and σ_0(·) = P(X_1 ∈ ·) the marginal distribution of X_1. In a Bayesian framework, to make predictions about (X_n), one only needs the collection σ = (σ_n : n ≥ 0). By the Ionescu-Tulcea theorem, σ can be assigned directly, without passing through the usual prior/posterior scheme. One main advantage is that no prior probability has to be selected. In this paper, σ is subjected to two requirements: (i) the resulting sequence (X_n) is conditionally identically distributed, in the sense of Berti, Pratelli and Rigo (2004); (ii) each σ_{n+1} is a simple recursive update of σ_n. Various new σ satisfying (i)-(ii) are introduced and investigated. For such σ, the asymptotic behavior of σ_n, as n → ∞, is determined. In some cases, the probability distribution of (X_n) is also evaluated.
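The best-known example of a σ satisfying both requirements is the classical Pólya sequence (the predictives of a Dirichlet process), where σ_{n+1} = (1 − a_n) σ_n + a_n δ_{X_{n+1}} with a_n = 1/(θ + n + 1). The sketch below simulates such a sequence directly from its predictives, with no prior sampled, to illustrate the recursive-update idea; the concentration θ and the two-point base measure are illustrative choices, not taken from the paper, whose new constructions go beyond this exchangeable case.

```python
import random

def simulate_polya_sequence(n, theta=2.0, base=(0, 1), seed=0):
    """Simulate X_1, ..., X_n from a Polya sequence by drawing each
    X_{n+1} from the current predictive sigma_n and then applying the
    recursive update
        sigma_{n+1} = (1 - a_n) * sigma_n + a_n * delta_{X_{n+1}},
    with a_n = 1 / (theta + n + 1).  (theta and the uniform discrete
    base measure on `base` are illustrative assumptions.)
    """
    rng = random.Random(seed)
    xs = []
    # sigma_0: uniform over the base atoms
    sigma = {b: 1.0 / len(base) for b in base}
    for n_obs in range(n):
        # draw X_{n_obs + 1} from the current predictive sigma_{n_obs}
        atoms = list(sigma)
        x = rng.choices(atoms, weights=[sigma[b] for b in atoms])[0]
        xs.append(x)
        # recursive update: shrink sigma_n and add mass at the new point
        a = 1.0 / (theta + n_obs + 1)
        sigma = {b: (1 - a) * p for b, p in sigma.items()}
        sigma[x] = sigma.get(x, 0.0) + a
    return xs, sigma

xs, sigma_n = simulate_polya_sequence(200)
```

Each update is O(size of support), and the whole trajectory is generated without ever drawing a random measure from a prior, which is exactly the Ionescu-Tulcea route described above.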
