A discrete complement of Lyapunov's inequality and its information theoretic consequences

11/13/2021
by James Melbourne, et al.

We establish a reversal of Lyapunov's inequality for monotone log-concave sequences, settling a conjecture of Havrilla-Tkocz and Melbourne-Tkocz. A strengthened version of the same conjecture is disproved by counterexample. We also derive several information-theoretic inequalities as consequences. In particular, sharp bounds are derived for the varentropy, Rényi entropies, and the concentration of information of monotone log-concave random variables. Moreover, the majorization approach utilized in the proof of the main theorem is applied to derive analogous information-theoretic results in the symmetric setting, where the Lyapunov reversal is known to fail.
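For readers wanting the statement being complemented, the following is a minimal LaTeX sketch of the classical Lyapunov inequality, together with the standard definitions of the Rényi entropy and varentropy that the abstract refers to. The discrete monotone log-concave hypotheses and the sharp constants of the reversal are those of the paper and are not reproduced here.

% Classical Lyapunov inequality: the moments of a random variable are
% log-convex in their order.  For a nonnegative random variable X and
% exponents 0 < s < t < u,
\[
  \left(\mathbb{E}\, X^{t}\right)^{u-s}
  \;\le\;
  \left(\mathbb{E}\, X^{s}\right)^{u-t}
  \left(\mathbb{E}\, X^{u}\right)^{t-s}.
\]
% The paper's complement reverses this, up to a sharp constant, for
% integer-valued X whose probability mass function p is monotone and
% log-concave, i.e. p(k)^2 >= p(k-1) p(k+1) for all k.

% Renyi entropy of order \alpha (\alpha > 0, \alpha \neq 1) of a
% discrete random variable X with pmf p:
\[
  H_{\alpha}(X) \;=\; \frac{1}{1-\alpha}\,
  \log \sum_{k} p(k)^{\alpha},
\]
% which recovers the Shannon entropy H(X) = -\sum_k p(k) \log p(k)
% as \alpha \to 1.  The varentropy is the variance of the information
% content -\log p(X):
\[
  V(X) \;=\; \mathrm{Var}\bigl(-\log p(X)\bigr).
\]

Here "concentration of information" refers, in the usual sense, to tail bounds for the information content -\log p(X) around the entropy H(X).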


Related research

02/12/2022  An Information-Theoretic Proof of the Kac–Bernstein Theorem
A short, information-theoretic proof of the Kac–Bernstein theorem, which...

02/22/2019  A Family of Bayesian Cramér-Rao Bounds, and Consequences for Log-Concave Priors
Under minimal regularity assumptions, we establish a family of informati...

12/18/2022  Entropy-variance inequalities for discrete log-concave random variables via degree of freedom
We utilize a discrete version of the notion of degree of freedom to prov...

10/23/2022  Entropic exercises around the Kneser-Poulsen conjecture
We develop an information-theoretic approach to study the Kneser–Poulsen...

01/30/2019  Support Recovery in the Phase Retrieval Model: Information-Theoretic Fundamental Limits
The support recovery problem consists of determining a sparse subset of ...

08/28/2022  A Reformulation of Gaussian Completely Monotone Conjecture: A Hodge Structure on the Fisher Information along Heat Flow
In the past decade, J. Huh solved several long-standing open problems on...
