
Three Approaches to Probability Model Selection

by William B. Poland, et al.

This paper compares three approaches to the problem of selecting among probability models to fit data: (1) use of statistical criteria such as Akaike's information criterion and Schwarz's "Bayesian information criterion," (2) maximization of the posterior probability of the model, and (3) maximization of an effectiveness ratio, trading off accuracy and computational cost. The unifying characteristic of the approaches is that all can be viewed as maximizing a penalized likelihood function. The second approach with suitable prior distributions has been shown to reduce to the first. This paper shows that the third approach reduces to the second for a particular form of the effectiveness ratio, and illustrates all three approaches with the problem of selecting the number of components in a mixture of Gaussian distributions. Unlike the first two approaches, the third can be used even when the candidate models are chosen for computational efficiency, without regard to physical interpretation, so that the likelihood and the prior distribution over models cannot be interpreted literally. As the most general and computationally oriented of the approaches, it is especially useful for artificial intelligence applications.
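The abstract's illustration, selecting the number of components in a Gaussian mixture by penalized likelihood, can be sketched as follows. This is a minimal illustration of the first approach only (statistical criteria), not the paper's own code: it fits 1-D mixtures by EM with a simple quantile-based initialization (an assumption of this sketch) and scores each candidate model with AIC and BIC.

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """Normal density; broadcasts over mixture components."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def fit_gmm_1d(x, k, iters=300):
    """Fit a k-component 1-D Gaussian mixture by EM; return its log-likelihood."""
    n = len(x)
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread-out deterministic init
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = pi * gauss_pdf(x[:, None], mu, sigma)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, standard deviations
        nk = np.maximum(r.sum(axis=0), 1e-12)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        sigma = np.maximum(sigma, 1e-6)  # guard against component collapse
    return np.log((pi * gauss_pdf(x[:, None], mu, sigma)).sum(axis=1)).sum()

def penalized_scores(loglik, k, n):
    """AIC and BIC, viewed as penalized (negative) log-likelihoods."""
    p = 3 * k - 1  # k means + k std devs + (k - 1) free weights
    aic = 2 * p - 2 * loglik
    bic = p * np.log(n) - 2 * loglik
    return aic, bic

# Synthetic data from two well-separated components
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(6.0, 1.0, 200)])

scores = {k: penalized_scores(fit_gmm_1d(x, k), k, len(x)) for k in (1, 2, 3)}
best_bic = min(scores, key=lambda k: scores[k][1])  # BIC favours k = 2 here
```

Both criteria are instances of the paper's unifying penalized-likelihood form; they differ only in the penalty weight on the parameter count, with BIC's log n factor penalizing extra components more heavily than AIC's constant 2.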



