Information Theoretic Structured Generative Modeling

10/12/2021
by Bo Hu, et al.

Rényi's information provides a theoretical foundation for tractable and data-efficient non-parametric density estimation, based on pair-wise evaluations in a reproducing kernel Hilbert space (RKHS). This paper extends that framework to parametric probabilistic modeling, motivated by the fact that Rényi's information can be estimated in closed form for Gaussian mixtures. Based on this special connection, a novel generative framework called the structured generative model (SGM) is proposed that makes optimization straightforward: the costs are scale-invariant, avoiding high gradient variance, and impose fewer restrictions on absolute continuity, which is a major advantage in parametric information-theoretic optimization. The implementation employs a single neural network driven by an orthonormal input appended to a single white-noise source, adapted to learn an infinite Gaussian mixture model (IMoG), which provides an empirically tractable model distribution in low dimensions. To train SGM, we provide three novel variational cost functions, based on Rényi's second-order entropy and divergence, that implement minimization of cross-entropy, minimization of variational representations of f-divergence, and maximization of the evidence lower bound (conditional probability). We test the framework on mutual information estimation, comparing against the mutual information neural estimation (MINE), as well as on density estimation, conditional probability estimation in Markov models, and the training of adversarial networks. Our preliminary results show that SGM significantly improves upon MINE in data efficiency and variance, outperforms conventional and variational Gaussian mixture models, and improves the performance of generative adversarial networks.
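The key technical fact the abstract relies on, that Rényi's second-order entropy has a closed form for Gaussian mixtures, follows from the standard Gaussian product-integral identity: for p(x) = Σ_i π_i N(x; μ_i, Σ_i), the information potential ∫ p(x)² dx equals Σ_{i,j} π_i π_j N(μ_i; μ_j, Σ_i + Σ_j). The sketch below (not from the paper; the mixture parameters and the Monte Carlo check are illustrative assumptions) shows this closed form and verifies it numerically.

```python
# Minimal sketch: closed-form Renyi second-order entropy of a Gaussian mixture.
# H_2(p) = -log sum_{i,j} pi_i pi_j N(mu_i; mu_j, Sigma_i + Sigma_j)
# The mixture below is a hypothetical example, not taken from the paper.
import numpy as np
from scipy.stats import multivariate_normal


def renyi2_entropy_gmm(weights, means, covs):
    """Closed-form Renyi second-order entropy of a Gaussian mixture."""
    ip = 0.0  # information potential V = integral of p(x)^2 dx
    for wi, mi, ci in zip(weights, means, covs):
        for wj, mj, cj in zip(weights, means, covs):
            # Integral of a product of two Gaussians is a Gaussian evaluation.
            ip += wi * wj * multivariate_normal.pdf(mi, mean=mj, cov=ci + cj)
    return -np.log(ip)


# Example: a 2-component mixture in 2-D (illustrative parameters).
weights = [0.4, 0.6]
means = [np.zeros(2), np.array([2.0, 0.0])]
covs = [np.eye(2), 0.5 * np.eye(2)]
print("H2 (closed form):", renyi2_entropy_gmm(weights, means, covs))

# Sanity check: Monte Carlo estimate of V = E_p[p(x)].
rng = np.random.default_rng(0)
idx = rng.choice(len(weights), size=5000, p=weights)
samples = np.array([rng.multivariate_normal(means[k], covs[k]) for k in idx])
px = sum(w * multivariate_normal.pdf(samples, mean=m, cov=c)
         for w, m, c in zip(weights, means, covs))
print("H2 (Monte Carlo):", -np.log(px.mean()))
```

Because the entropy of the mixture reduces to pair-wise Gaussian evaluations, no numerical integration or sampling is needed during training, which is what makes the SGM costs tractable.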
