On conditional Sibson's α-Mutual Information

02/01/2021
by Amedeo Roberto Esposito, et al.

In this work, we analyse how to define a conditional version of Sibson's α-Mutual Information. Several such definitions can be advanced, and they lead to distinct information measures with different (but related) operational meanings. We analyse one such definition in detail, compute a closed-form expression for it, and endow it with an operational meaning, while also considering some applications. The alternative definitions are also presented and compared.
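Since the abstract does not reproduce the definition, a minimal numerical sketch of the standard (unconditional) Sibson α-mutual information for finite alphabets may serve as a reference point; the function name and discrete parameterisation below are illustrative, and the conditional variants studied in the paper are not implemented here.

```python
import math

def sibson_alpha_mi(p_x, p_y_given_x, alpha):
    """Sibson's alpha-mutual information for finite alphabets.

    I_alpha(X;Y) = alpha/(alpha-1) * log sum_y ( sum_x P_X(x) * P_{Y|X}(y|x)^alpha )^(1/alpha)

    p_x          : list of input probabilities P_X(x)
    p_y_given_x  : row-stochastic matrix, p_y_given_x[x][y] = P_{Y|X}(y|x)
    alpha        : Renyi order, alpha > 0 and alpha != 1
    """
    assert alpha > 0 and alpha != 1
    n_y = len(p_y_given_x[0])
    total = 0.0
    for y in range(n_y):
        # Inner expectation over X of the channel raised to the power alpha
        inner = sum(px * row[y] ** alpha for px, row in zip(p_x, p_y_given_x))
        total += inner ** (1.0 / alpha)
    return alpha / (alpha - 1.0) * math.log(total)
```

As a sanity check, the measure vanishes when X and Y are independent, and for a noiseless binary channel with uniform input it equals log 2, in line with its role as a generalisation of mutual information.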


Related research

06/10/2020  On the Maximum Mutual Information Capacity of Neural Architectures
We derive the closed-form expression of the maximum mutual information -...

07/25/2018  Channel Dependent Mutual Information in Index Modulations
Mutual Information is the metric that is used to perform link adaptation...

12/15/2019  Relaxed Wyner's Common Information
A natural relaxation of Wyner's Common Information is studied. Specifica...

12/30/2021  Studying the Interplay between Information Loss and Operation Loss in Representations for Classification
Information-theoretic measures have been widely adopted in the design of...

05/02/2020  Conditional Rényi entropy and the relationships between Rényi capacities
The analogues of Arimoto's definition of conditional Rényi entropy and R...

07/10/2019  The Design of Mutual Information
We derive the functional form of mutual information (MI) from a set of d...

08/04/2018  Implementation and Analysis of Stable PUFs Using Gate Oxide Breakdown
We implement and analyze highly stable PUFs using two random gate oxide ...
