Representing Pareto optima in preordered spaces: from Shannon entropy to injective monotones

07/30/2021
by Pedro Hack, et al.

Shannon entropy is the most widely used measure of uncertainty. It is used, for example, in Jaynes' maximum entropy principle, which is considered a basis for statistical inference and serves as a justification for many regularization techniques that appear throughout machine learning and decision theory. Entropy is, however, only one possible monotone and does not fully capture the more fundamental notion of uncertainty, considered as a preorder on the space of probability distributions, also known as majorization. While the maximum entropy principle therefore cannot yield all Pareto optima of the uncertainty preorder in general, it has the appealing property that its solutions are unique Pareto optima, since it maximizes a strictly concave functional over a convex subset. Here, we investigate a class of monotones on general preordered spaces that preserve this uniqueness property (up to order equivalence) on any subset, without asking for the additional vector space structure required for convexity. We show that the class of preorders for which these so-called injective monotones exist lies between the class of preorders with strict monotones and the class of preorders with utility functions. We extend several well-known results for strict monotones (Richter-Peleg functions) to injective monotones, provide a construction of injective monotones from countable multi-utilities, and relate injective monotones to classic results concerning Debreu denseness and order separability. Along the way, we connect our results to Shannon entropy and the uncertainty preorder, obtaining new insights into how they are related.
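The abstract's central observation — that Shannon entropy is a monotone for the majorization preorder but does not fully capture it — can be checked numerically. The sketch below (an illustration of the standard definitions, not code from the paper) verifies that when one distribution majorizes another its entropy is no larger, and exhibits a pair of distributions that are incomparable under majorization yet are still ordered by entropy:

```python
import math

def majorizes(p, q, tol=1e-12):
    """True if p majorizes q (p is 'less uncertain' than q):
    every partial sum of the decreasingly sorted p dominates that of q."""
    ps = sorted(p, reverse=True)
    qs = sorted(q, reverse=True)
    return all(sum(ps[:k]) >= sum(qs[:k]) - tol for k in range(1, len(p) + 1))

def shannon_entropy(p):
    """Shannon entropy in nats, with the convention 0 * log 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Entropy is a monotone (Schur-concave function) for majorization:
# if p majorizes q, then H(p) <= H(q).
p = [0.7, 0.2, 0.1]    # more concentrated, hence less uncertain
q = [0.4, 0.35, 0.25]  # more spread out
assert majorizes(p, q)
assert shannon_entropy(p) <= shannon_entropy(q)

# But entropy does not capture the full preorder: these two distributions
# are incomparable under majorization, yet entropy still ranks them.
a = [0.6, 0.2, 0.2]
b = [0.5, 0.4, 0.1]
assert not majorizes(a, b) and not majorizes(b, a)
assert shannon_entropy(a) != shannon_entropy(b)
```

This is exactly why maximizing entropy can miss Pareto optima of the preorder: a total numerical ranking necessarily collapses incomparable pairs.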

