Factorized Mutual Information Maximization

06/13/2019
by Thomas Merkh, et al.

We investigate the sets of joint probability distributions that maximize the average multi-information over a collection of margins. These functionals serve as proxies for maximizing the multi-information of a set of variables, or the mutual information of two subsets of variables, at lower computational and estimation complexity. We describe the maximizers and their relations to the maximizers of the multi-information and the mutual information.
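To make the objective concrete: the multi-information (total correlation) of variables X_1, ..., X_n is the sum of the marginal entropies minus the joint entropy, and the factorized proxy averages this quantity over a chosen collection of margins (subsets of variables). A minimal sketch, assuming a joint distribution given as a dense probability array (the function names and the choice of pairwise margins below are illustrative, not from the paper):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def multi_information(joint):
    """Multi-information (total correlation) of an n-dimensional
    joint probability array:
    I(X_1, ..., X_n) = sum_i H(X_i) - H(X_1, ..., X_n)."""
    n = joint.ndim
    marginal_entropies = sum(
        entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n)
    )
    return marginal_entropies - entropy(joint)

def average_margin_multi_information(joint, margins):
    """Average multi-information over a collection of margins,
    where each margin is a tuple of variable indices."""
    n = joint.ndim
    values = []
    for margin in margins:
        drop = tuple(j for j in range(n) if j not in margin)
        values.append(multi_information(joint.sum(axis=drop)))
    return float(np.mean(values))
```

For instance, for three perfectly correlated binary variables (probability 1/2 each on (0,0,0) and (1,1,1)), the full multi-information is 2 bits, while the average over the three pairwise margins is 1 bit, since each pair carries 1 bit of mutual information.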


