On Sibson's α-Mutual Information

02/08/2022
by Amedeo Roberto Esposito et al.

We explore a family of information measures that stems from Rényi's α-divergences with α < 0. In particular, we extend the definition of Sibson's α-mutual information to negative values of α and establish several properties of the resulting objects. Moreover, we show how this family of information measures connects to functional inequalities with applications in a variety of fields, including lower bounds on the risk in Bayesian estimation procedures.
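For context, the abstract does not restate the definitions involved; the following is a standard formulation of Sibson's α-mutual information for discrete alphabets and α ∈ (0,1) ∪ (1,∞), the regime the paper extends to α < 0:

```latex
% Rényi divergence of order \alpha (\alpha \neq 0, 1):
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_x P(x)^\alpha \, Q(x)^{1-\alpha}.

% Sibson's \alpha-mutual information minimizes this divergence between
% the joint distribution and any product of P_X with a marginal Q_Y:
I_\alpha(X;Y) = \min_{Q_Y} D_\alpha\!\left(P_{XY} \,\|\, P_X \times Q_Y\right)
              = \frac{\alpha}{\alpha - 1} \log \sum_y
                \Big( \sum_x P_X(x) \, P_{Y|X}(y \mid x)^\alpha \Big)^{1/\alpha}.
```

As α → 1 this recovers the ordinary mutual information; the paper's contribution concerns what happens when α is taken negative, which this sketch does not cover.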


Related research

02/05/2022
Lower-bounds on the Bayesian Risk in Estimation Procedures via f-Divergences
We consider the problem of parameter estimation in a Bayesian setting an...

02/08/2022
From Generalisation Error to Transportation-cost Inequalities and Back
In this work, we connect the problem of bounding the expected generalisa...

03/15/2022
On Suspicious Coincidences and Pointwise Mutual Information
Barlow (1985) hypothesized that the co-occurrence of two events A and B ...

03/22/2023
Lower Bounds on the Bayesian Risk via Information Measures
This paper focuses on parameter estimation and introduces a new method f...

12/01/2022
Mutual Information-based Generalized Category Discovery
We introduce an information-maximization approach for the Generalized Ca...

04/28/2019
Entropy inequalities and exponential decay of correlations for unique Gibbs measures on trees
In a recent paper by A. Backhausz, B. Gerencsér and V. Harangi, it was s...

07/10/2019
The Design of Mutual Information
We derive the functional form of mutual information (MI) from a set of d...
