A series of maximum entropy upper bounds of the differential entropy

12/09/2016
by Frank Nielsen, et al.

We present a series of closed-form maximum entropy upper bounds for the differential entropy of a continuous univariate random variable and study the properties of that series. We then show how to use these generic bounds to upper-bound the differential entropy of Gaussian mixture models. This requires calculating the raw moments and raw absolute moments of Gaussian mixtures in closed form, which may also be handy in statistical machine learning and information theory. We report on our experiments and discuss the tightness of those bounds.
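To illustrate the flavor of these bounds, here is a minimal sketch of the simplest (second-order) member of such a series: among all densities with a fixed variance, the Gaussian maximizes differential entropy, so h(X) ≤ ½ log(2πe Var[X]). The closed-form raw moments of a univariate Gaussian mixture make the variance immediate. This is only an illustrative assumption-laden sketch, not the paper's full series, which also involves higher-order absolute moments; the function names are hypothetical.

```python
import math

def mixture_raw_moments(weights, mus, sigmas):
    """First two raw moments of a univariate Gaussian mixture, in closed form:
    E[X]   = sum_i w_i * mu_i
    E[X^2] = sum_i w_i * (mu_i^2 + sigma_i^2)
    """
    m1 = sum(w * mu for w, mu in zip(weights, mus))
    m2 = sum(w * (mu ** 2 + s ** 2) for w, mu, s in zip(weights, mus, sigmas))
    return m1, m2

def maxent_upper_bound(weights, mus, sigmas):
    """Variance-based maximum entropy upper bound on differential entropy (in nats):
    h(X) <= 0.5 * log(2 * pi * e * Var[X]),
    since the Gaussian is the maximum entropy density for a fixed variance.
    """
    m1, m2 = mixture_raw_moments(weights, mus, sigmas)
    var = m2 - m1 ** 2
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

# Example: an equal-weight two-component mixture.
bound = maxent_upper_bound([0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
```

For a single standard Gaussian the bound is tight and recovers h = ½ log(2πe) ≈ 1.4189 nats; for genuine mixtures the bound is strict, and the paper's higher-order bounds tighten it further.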

