Codivergences and information matrices

03/14/2023
by Alexis Derumigny, et al.

We propose a new concept of codivergence, which quantifies the similarity between two probability measures P_1, P_2 relative to a reference probability measure P_0. In the neighborhood of the reference measure P_0, a codivergence behaves like an inner product between the measures P_1 - P_0 and P_2 - P_0. Two specific codivergences, the χ^2-codivergence and the Hellinger codivergence, are introduced and studied. We derive explicit expressions for several common parametric families of probability distributions. Moreover, for a codivergence we introduce the divergence matrix as an analogue of the Gram matrix. It is shown that the χ^2-divergence matrix satisfies a data-processing inequality.
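As an illustrative sketch for discrete distributions, one can compute a bilinear quantity of this kind numerically. The specific form used below, ∑_x (p1(x) − p0(x))(p2(x) − p0(x)) / p0(x), is our assumption of how a χ²-codivergence could look; it is chosen so that it reduces to the ordinary χ²-divergence when P_1 = P_2, and the corresponding divergence matrix is then the Gram-type matrix of pairwise values.

```python
import numpy as np

def chi2_codivergence(p1, p2, p0):
    """Sketch of a chi^2-codivergence of p1 and p2 relative to p0
    (discrete case, all probabilities assumed strictly positive).

    Assumed form: sum_x (p1(x) - p0(x)) * (p2(x) - p0(x)) / p0(x).
    This is a hypothetical illustration, not the paper's definition;
    it reduces to the usual chi^2-divergence when p1 == p2.
    """
    p1, p2, p0 = (np.asarray(p, dtype=float) for p in (p1, p2, p0))
    return float(np.sum((p1 - p0) * (p2 - p0) / p0))

def divergence_matrix(ps, p0):
    """Gram-type matrix with entries chi2_codivergence(p_i, p_j | p0)."""
    n = len(ps)
    return np.array([[chi2_codivergence(ps[i], ps[j], p0)
                      for j in range(n)] for i in range(n)])

# Two small perturbations of a uniform reference measure
p0 = np.array([0.25, 0.25, 0.25, 0.25])
p1 = np.array([0.30, 0.20, 0.25, 0.25])
p2 = np.array([0.25, 0.25, 0.30, 0.20])

print(chi2_codivergence(p1, p1, p0))  # ordinary chi^2-divergence of p1 from p0
print(chi2_codivergence(p1, p2, p0))  # "inner-product" of the two perturbations
print(divergence_matrix([p1, p2], p0))
```

Note that the quantity is symmetric in p1 and p2 by construction, so the divergence matrix is a symmetric positive semidefinite matrix, in analogy with a Gram matrix of the perturbations p_i − p_0.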


