What are Empirical Bayes Methods?
Empirical Bayes methods are a collection of techniques for estimating the parameters of a prior distribution from the data itself before computing the posterior distribution. This approach still follows the general Bayesian model, but turns the choice of initial assumptions (the prior) into a two-step procedure. Empirical Bayes estimation is used in place of the maximum entropy principle when something is known about the parameters, but not enough to specify a prior distribution fully without subjective guesswork.
How Do Empirical Bayes Methods Work?
There are different empirical Bayes estimation techniques for each type of probability distribution, but all share the same basic format:
Step 1: Instead of assigning fixed values to each parameter of the prior, describe those parameters with hyperparameters (that is, a distribution over the parameters).
Step 2: Estimate the hyperparameters from a sample of the data, which turns them into approximate values for each parameter of the prior.
Step 3: Use this updated prior, which is technically a posterior distribution, as the prior when running the model on the full data set.
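The steps above can be sketched with a common special case: a beta-binomial model where the Beta prior's hyperparameters are estimated from the observed group rates via the method of moments. The group counts and variable names below are hypothetical, chosen only to illustrate the procedure.

```python
import statistics

# Hypothetical data: success counts and trial counts for six groups
# (e.g., per-site conversion data). Values are illustrative only.
successes = [12, 30, 7, 45, 22, 9]
trials = [100, 250, 80, 400, 200, 90]

# Steps 1-2: treat the Beta(alpha, beta) prior's parameters as unknowns
# and estimate them from the data itself, using the method of moments
# on the per-group success rates.
rates = [s / n for s, n in zip(successes, trials)]
m = statistics.mean(rates)
v = statistics.variance(rates)
common = m * (1 - m) / v - 1  # assumes v < m * (1 - m)
alpha = m * common
beta = (1 - m) * common

# Step 3: use the fitted Beta as the prior for each group. Beta is
# conjugate to the binomial, so each posterior mean is a simple ratio,
# shrinking noisy small-sample rates toward the overall mean.
posterior_means = [(alpha + s) / (alpha + beta + n)
                   for s, n in zip(successes, trials)]
```

Note the shrinkage effect: groups with few trials end up with posterior means pulled toward the overall average, which is the practical payoff of fitting the prior to the data.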
One advantage of this approach is that the conjugate prior/posterior relationship is preserved even if the hyperparameters and the prior's parameters follow different distributions, since only the final fitted prior enters the model.
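To see why conjugacy survives, note that however the prior's hyperparameters were obtained, a Beta prior combined with binomial data still yields a Beta posterior by simple parameter addition. The numbers below are hypothetical, purely to illustrate the update rule.

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: Beta prior + binomial data -> Beta posterior."""
    return alpha + successes, beta + failures

# Prior fitted in an earlier empirical Bayes step (illustrative values).
prior = (2.0, 8.0)

# Full data set: 30 successes and 70 failures.
posterior = beta_binomial_update(*prior, 30, 70)
# posterior is (32.0, 78.0) -- still a Beta distribution.
```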