On the relative asymptotic expressivity of inference frameworks
Let σ be a first-order signature and let W_n be the set of all σ-structures with domain [n] = {1, …, n}. By an inference framework we mean a class F of pairs (P, L), where P = (P_n : n = 1, 2, 3, …), each P_n is a probability distribution on W_n, and L is a logic with truth values in the unit interval [0, 1]. The inference frameworks we consider contain pairs (P, L) where P is determined by a probabilistic graphical model and L expresses statements about, for example, (conditional) probabilities or (arithmetic or geometric) averages. We define asymptotic expressivity of inference frameworks: F′ is asymptotically at least as expressive as F if for every (P, L) ∈ F there is (P′, L′) ∈ F′ such that P is asymptotically total-variation-equivalent to P′ and for every φ(x̄) ∈ L there is φ′(x̄) ∈ L′ such that φ′(x̄) is asymptotically equivalent to φ(x̄) with respect to P. This relation is a preorder, and we describe a (strict) partial order on the equivalence classes of some inference frameworks that seem natural in the context of machine learning and artificial intelligence. Our analysis includes Conditional Probability Logic (CPL) and Probability Logic with Aggregation functions (PLA), introduced in earlier work. We also define a sublogic coPLA of PLA in which the aggregation functions satisfy additional continuity constraints, and show that coPLA gives rise to asymptotically strictly less expressive inference frameworks than PLA.
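The two equivalence notions invoked above can be sketched in symbols. This is a plausible reading assuming the standard definitions (total variation distance tending to 0, and uniform closeness of truth values with high probability); the paper's exact formulation may differ.

```latex
% Sketch only; assumes standard definitions, not quoted from the paper.
% P = (P_n) and P' = (P'_n) are asymptotically total-variation-equivalent if
\[
  \lim_{n \to \infty} \, \sup_{X \subseteq \mathbf{W}_n}
    \bigl| \, \mathbb{P}_n(X) - \mathbb{P}'_n(X) \, \bigr| = 0 .
\]
% phi'(xbar) is asymptotically equivalent to phi(xbar) with respect to P if,
% for every epsilon > 0, structures in which the truth values of phi and phi'
% differ by more than epsilon at some tuple become vanishingly rare:
\[
  \lim_{n \to \infty} \, \mathbb{P}_n \Bigl( \bigl\{ \mathcal{A} \in \mathbf{W}_n :
    \bigl| \varphi(\bar a) - \varphi'(\bar a) \bigr| \le \varepsilon
    \text{ for all } \bar a \in [n]^{|\bar x|} \bigr\} \Bigr) = 1 .
\]
```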