Factorized Mutual Information Maximization

06/13/2019
by Thomas Merkh, et al.

We investigate the sets of joint probability distributions that maximize the average multi-information over a collection of margins. These functionals serve as proxies for maximizing the multi-information of a set of variables, or the mutual information of two subsets of variables, at lower computational and estimation complexity. We describe the maximizers and their relations to the maximizers of the multi-information and of the mutual information.
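To make the quantities in the abstract concrete, here is a minimal sketch of the multi-information I(X_1; ...; X_n) = Σ_i H(X_i) − H(X_1, ..., X_n) of a joint distribution given as a probability tensor, together with one illustrative factorized proxy: the average multi-information over all pairwise margins. The choice of pairwise margins as the collection is an assumption for illustration, not necessarily the collection studied in the paper.

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a probability array; zero entries are ignored."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def multi_information(joint):
    """Multi-information I(X_1; ...; X_n) = sum_i H(X_i) - H(X_1, ..., X_n),
    where `joint` is an n-dimensional probability tensor."""
    n = joint.ndim
    marginal_entropies = sum(
        entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n)
    )
    return marginal_entropies - entropy(joint)

def avg_pairwise_multi_information(joint):
    """Average multi-information over all two-variable margins.
    Pairwise margins are one hypothetical choice of 'collection of margins';
    each margin requires only a small marginal table, which is the source of
    the lower computational and estimation cost."""
    n = joint.ndim
    pairs = list(itertools.combinations(range(n), 2))
    values = []
    for i, j in pairs:
        margin = joint.sum(axis=tuple(k for k in range(n) if k not in (i, j)))
        values.append(multi_information(margin))
    return sum(values) / len(pairs)
```

For three perfectly correlated binary variables (p(0,0,0) = p(1,1,1) = 1/2), the multi-information is 2 ln 2 while each pairwise margin contributes ln 2, so the averaged proxy is ln 2; for an independent product distribution both quantities vanish.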

