The Optimal 'AND'

05/24/2020
by Richard Rohwer et al.

The joint distribution P(X,Y) cannot be determined from its marginals P(X) and P(Y) alone; one also needs one of the conditionals P(X|Y) or P(Y|X). But is there a best guess, given only the marginals? Here we answer this question in the affirmative, obtaining in closed form the function of the marginals that has the lowest expected Kullback-Leibler (KL) divergence between the unknown "true" joint probability and the function value. The expectation is taken with respect to Jeffreys' non-informative prior over the possible joint probability values, given the marginals. This distribution can also be used to obtain the expected information loss for any other "aggregation operator", as such estimators are often called in fuzzy logic, for any given pair of marginal input values. This enables such operators, including ours, to be compared according to their expected loss under the minimal-knowledge conditions we assume. We go on to develop a method for evaluating the expected accuracy of any aggregation operator in the absence of knowledge of its inputs. This requires averaging the expected loss over all possible input pairs, weighted by an appropriate distribution. We obtain this distribution by marginalizing Jeffreys' prior over the possible joint distributions (over the three functionally independent coordinates of the space of joint distributions over two Boolean variables) onto a joint distribution over the pair of marginal distributions, a two-dimensional space with one parameter for each marginal. We report the resulting input-averaged expected losses for a few commonly used operators, as well as for the optimal operator. Finally, we discuss the potential to develop our methodology into a principled risk-management approach to replace the often rather arbitrary conditional-independence assumptions made in probabilistic graphical models.
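The abstract does not reproduce the closed-form operator, but the setup is concrete enough to sketch numerically. Given marginals p = P(X=1) and q = P(Y=1), the 2x2 joint has one free cell r = P(X=1, Y=1), constrained by the Fréchet bounds max(0, p+q-1) <= r <= min(p, q), and Jeffreys' prior (Dirichlet(1/2, 1/2, 1/2, 1/2) over the four cells) induces a conditional density over r proportional to [r (p-r) (q-r) (1-p-q+r)]^(-1/2). The Python sketch below is our loose reading, not the paper's method: it assumes the loss is KL(true || estimate), whose expectation over r is minimized by the conditional mean of the joint, and all function names, grid sizes, and the Monte Carlo setup are ours.

```python
# Sketch only: the paper's closed-form operator is not shown in the abstract,
# and the KL direction used below (true || estimate) is our assumption.
import numpy as np

def r_bounds(p, q):
    # Frechet bounds: every cell of the 2x2 joint must be non-negative.
    return max(0.0, p + q - 1.0), min(p, q)

def cells(r, p, q):
    # 2x2 joint written in terms of the marginals and the free cell r = P(1,1).
    return np.stack([r, p - r, q - r, 1.0 - p - q + r])

def jeffreys_weight(r, p, q):
    # Jeffreys' prior over the joint is Dirichlet(1/2,...,1/2): density
    # proportional to the product of the four cells, each raised to the -1/2.
    return np.prod(cells(r, p, q), axis=0) ** -0.5

def conditional_grid(p, q, n=4000):
    # Midpoint grid over [lo, hi]; midpoints sidestep the integrable
    # endpoint singularities of the r^{-1/2}-type factors.
    lo, hi = r_bounds(p, q)
    edges = np.linspace(lo, hi, n + 1)
    r = 0.5 * (edges[:-1] + edges[1:])
    w = jeffreys_weight(r, p, q)
    return r, w / w.sum()

def optimal_and(p, q):
    # Under the assumed loss E[KL(true || estimate)], the cross-entropy term
    # is minimized by the conditional prior mean of the joint, i.e. E[r | p, q].
    r, w = conditional_grid(p, q)
    return float((r * w).sum())

def expected_kl_loss(op, p, q):
    # Expected KL(true || operator-implied joint) at fixed marginals. Operators
    # that saturate a Frechet bound (e.g. min) zero out a cell and diverge here.
    r, w = conditional_grid(p, q)
    true, est = cells(r, p, q), cells(np.full_like(r, op(p, q)), p, q)
    kl = (true * (np.log(true) - np.log(est))).sum(axis=0)
    return float((kl * w).sum())

# Input-averaged losses: sample joints from Jeffreys' prior, project each onto
# its pair of marginals, and average the operators' expected losses over inputs.
rng = np.random.default_rng(0)
ops = {"product": lambda p, q: p * q, "optimal (this sketch)": optimal_and}
totals = dict.fromkeys(ops, 0.0)
n_samples = 200
for _ in range(n_samples):
    joint = rng.dirichlet([0.5] * 4)          # cells (11, 10, 01, 00)
    p, q = joint[0] + joint[1], joint[0] + joint[2]
    for name, op in ops.items():
        totals[name] += expected_kl_loss(op, p, q)
for name, total in totals.items():
    print(f"{name}: {total / n_samples:.4f}")
```

One design note on the sketch: under the assumed KL direction, an operator that saturates a Fréchet bound, such as min(p, q), assigns zero probability to a cell the true joint can occupy, so its expected loss is infinite; that is a property of this sketch's assumption, not a claim about the comparisons reported in the paper.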
