Probability Update: Conditioning vs. Cross-Entropy

by Adam J. Grove, et al.

Conditioning is the generally agreed-upon method for updating probability distributions when one learns that an event is certainly true. But it has been argued that we need other rules, in particular the rule of cross-entropy minimization, to handle updates that involve uncertain information. In this paper we re-examine such a case: van Fraassen's Judy Benjamin problem, which in essence asks how one might update given the value of a conditional probability. We argue that -- contrary to the suggestions in the literature -- it is possible to use simple conditionalization in this case, and thereby obtain answers that agree fully with intuition. This contrasts with proposals such as cross-entropy, which are easier to apply but can give unsatisfactory answers. Based on the lessons from this example, we speculate on some general philosophical issues concerning probability update.
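To make the contrast concrete, here is a minimal sketch of the cross-entropy update in a standard three-cell rendering of the Judy Benjamin problem. The specific prior (1/2, 1/4, 1/4) over Blue, Red-HQ, and Red-2nd-Company territory, and the reported conditional probability q = 3/4, are illustrative assumptions, not figures taken from the paper. The sketch minimizes the KL divergence to the prior subject to the constraint P(Red-HQ | Red) = q, and exhibits the kind of counterintuitive behavior the abstract alludes to: the probability of the unconstrained Blue cell shifts away from its prior value.

```python
import math

# Assumed toy Judy Benjamin setup: three cells with a given prior.
prior = {"blue": 0.5, "red_hq": 0.25, "red_2nd": 0.25}
q = 0.75  # reported conditional probability P(red_hq | red)

def kl(p, pr):
    """Kullback-Leibler divergence D(p || pr)."""
    return sum(pi * math.log(pi / pr[k]) for k, pi in p.items() if pi > 0)

def constrained(b):
    """Distribution satisfying P(red_hq | red) = q, parameterized
    by the one remaining free quantity b = P(blue)."""
    return {"blue": b, "red_hq": (1 - b) * q, "red_2nd": (1 - b) * (1 - q)}

# Cross-entropy update: minimize KL to the prior over the free
# parameter b, here by a fine grid search for transparency.
best_b = min((i / 10000 for i in range(1, 10000)),
             key=lambda b: kl(constrained(b), prior))

# P(blue) moves above its prior value of 0.5, even though the new
# information was solely about the red zone.
print(round(best_b, 3))
```

With q = 1/2 the constraint is already satisfied by the prior and the update leaves P(blue) at 0.5; with q = 3/4 it rises to roughly 0.533, which is the sort of answer the authors argue is unsatisfactory.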


