What is a Posterior Predictive Distribution?
The posterior predictive is a distribution for predicting future, unobserved data values based on the data already collected. In Bayesian inference, a prior probability distribution over the model parameters is updated with new observations from a sample via the likelihood function, producing a posterior distribution over those parameters. The posterior predictive distribution is then obtained by averaging the likelihood of a new observation over this posterior, so uncertainty about the parameters is carried directly into the prediction. Confidence in the prediction therefore depends both on the posterior and on the form of the likelihood, which is determined by the type of probability distribution chosen for the data.
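As a concrete illustration, the sketch below works through a hypothetical coin-flip example with a Beta(1, 1) prior and Bernoulli observations; the data values and prior parameters are assumptions chosen only for clarity.

```python
import numpy as np
from scipy import stats

# Hypothetical coin-flip data: 1 = heads, 0 = tails.
observations = np.array([1, 0, 1, 1, 0, 1, 1, 1])

# Assumed prior: Beta(1, 1), i.e. a uniform belief about the heads probability.
prior_a, prior_b = 1.0, 1.0

# Bayesian update: a Beta prior combined with Bernoulli observations
# gives a Beta posterior with updated shape parameters.
heads = observations.sum()
tails = len(observations) - heads
post_a, post_b = prior_a + heads, prior_b + tails
posterior = stats.beta(post_a, post_b)

# Posterior predictive: probability that the *next* flip is heads, obtained by
# averaging the Bernoulli likelihood over the posterior. For the Beta-Bernoulli
# pair this integral has the closed form a / (a + b).
p_next_heads = post_a / (post_a + post_b)

print(f"Posterior mean of p:                      {posterior.mean():.3f}")
print(f"Posterior predictive P(next flip = heads): {p_next_heads:.3f}")
```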
The posterior predictive has a simple closed form when the likelihood function and prior probability are conjugate, i.e. when the posterior belongs to the same family of distributions as the prior; otherwise it is typically approximated numerically. Frequentist inference also produces predictive distributions (for example, plug-in predictions based on estimated parameters), but these do not average over parameter uncertainty in the same way.
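For instance, the Gamma prior is conjugate to the Poisson likelihood, and the resulting posterior predictive for a new count is a negative binomial distribution in closed form. The sketch below assumes illustrative count data and a Gamma(2, 1) prior; both are hypothetical values chosen for demonstration.

```python
import numpy as np
from scipy import stats

# Hypothetical count data (e.g. events per hour); values chosen for illustration.
counts = np.array([3, 5, 2, 4, 6, 3])

# Assumed conjugate prior on the Poisson rate: Gamma(shape=2, rate=1).
prior_shape, prior_rate = 2.0, 1.0

# Conjugate update: Gamma prior + Poisson likelihood -> Gamma posterior.
post_shape = prior_shape + counts.sum()
post_rate = prior_rate + len(counts)

# Because the pair is conjugate, the posterior predictive for a new count
# is available in closed form as a negative binomial distribution.
predictive = stats.nbinom(n=post_shape, p=post_rate / (post_rate + 1.0))

print(f"Posterior predictive mean: {predictive.mean():.2f}")
print(f"P(next count = 4):         {predictive.pmf(4):.3f}")
```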
Common Probability Distribution Parameterizations in Machine Learning:
Whether a model is fit with Bayesian or frequentist inference, the results can differ substantially depending on which distribution family, and therefore which parameterization, is chosen. Common choices are listed below, with a code sketch after the list.
- Bernoulli distribution – one parameter (success probability p)
- Beta distribution – two parameters (shape parameters α and β)
- Binomial distribution – two parameters (number of trials n and success probability p)
- Exponential distribution – one parameter (rate λ)
- Gamma distribution – two parameters (shape and rate)
- Geometric distribution – one parameter (success probability p)
- Gaussian (normal) distribution – two parameters (mean μ and standard deviation σ)
- Lognormal distribution – two parameters (μ and σ of the underlying normal)
- Negative binomial distribution – two parameters (number of successes r and success probability p)
- Poisson distribution – one parameter (rate λ)
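The sketch below shows one way these parameterizations look in code, using scipy.stats; the numeric parameter values are placeholders chosen only to make the parameter counts visible.

```python
from math import exp
from scipy import stats

# Each distribution, instantiated with its parameters as scipy.stats expects them.
# The numeric values are placeholders, not fitted estimates.
distributions = {
    "Bernoulli (p)":            stats.bernoulli(p=0.3),
    "Beta (alpha, beta)":       stats.beta(a=2.0, b=5.0),
    "Binomial (n, p)":          stats.binom(n=10, p=0.3),
    "Exponential (rate)":       stats.expon(scale=1 / 1.5),         # scipy uses scale = 1/rate
    "Gamma (shape, rate)":      stats.gamma(a=2.0, scale=1 / 1.5),  # scale = 1/rate
    "Geometric (p)":            stats.geom(p=0.3),
    "Gaussian (mean, std)":     stats.norm(loc=0.0, scale=1.0),
    "Lognormal (mu, sigma)":    stats.lognorm(s=0.5, scale=exp(0.0)),  # scale = exp(mu)
    "Negative binomial (r, p)": stats.nbinom(n=5, p=0.3),
    "Poisson (rate)":           stats.poisson(mu=2.0),
}

for name, dist in distributions.items():
    print(f"{name:28s} mean = {dist.mean():.3f}")
```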