Entropic Independence in High-Dimensional Expanders: Modified Log-Sobolev Inequalities for Fractionally Log-Concave Polynomials and the Ising Model

by Nima Anari et al.

We introduce a notion called entropic independence for distributions μ defined on pure simplicial complexes, i.e., subsets of size k of a ground set of elements. Informally, we call a background measure μ entropically independent if for any (possibly randomly chosen) set S, the relative entropy of an element of S drawn uniformly at random carries at most an O(1/k) fraction of the relative entropy of S, a constant multiple of its “share of entropy.” Entropic independence is the natural analog of spectral independence, another recently established notion, if one replaces variance by entropy. In our main result, we show that μ is entropically independent exactly when a transformed version of the generating polynomial of μ can be upper bounded by its linear tangent, a property implied by concavity of this transformation. We further show that this concavity is equivalent to spectral independence under arbitrary external fields, an assumption that also goes by the name of fractional log-concavity. Our result can be seen as a new tool to establish entropy contraction from the much simpler variance contraction inequalities. A key differentiating feature of our result is that we make no assumptions on marginals of μ or the degrees of the underlying graphical model when μ is based on one. We leverage our results to derive tight modified log-Sobolev inequalities for multi-step down-up walks on fractionally log-concave distributions. As our main application, we establish the tight mixing time of O(n log n) for Glauber dynamics on Ising models with interaction matrix of operator norm smaller than 1, improving upon the prior quadratic dependence on n.
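To make the main application concrete, the following is a minimal sketch of single-site Glauber dynamics on an Ising model with an interaction matrix of operator norm below 1, the regime covered by the abstract. All specifics here (the rescaling factor 0.9, the zero external field, the number of steps) are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def glauber_step(s, J, h, rng):
    """One Glauber update: pick a site uniformly at random and
    resample its spin from its conditional distribution given the rest.
    J is assumed symmetric with zero diagonal."""
    n = len(s)
    i = rng.integers(n)
    field = J[i] @ s + h[i]  # local field at site i (J[i, i] == 0)
    p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
    s[i] = 1 if rng.random() < p_plus else -1
    return s

def sample_ising(J, h, steps, rng):
    """Run Glauber dynamics for a fixed number of steps from a
    uniformly random spin configuration."""
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    for _ in range(steps):
        glauber_step(s, J, h, rng)
    return s

# Toy instance: symmetric J with zero diagonal, rescaled so ||J||_op < 1.
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))
J = (A + A.T) / 2
np.fill_diagonal(J, 0.0)
J *= 0.9 / np.linalg.norm(J, 2)  # operator norm now 0.9 < 1
h = np.zeros(n)

# O(n log n) steps, matching the mixing-time bound in the abstract.
s = sample_ising(J, h, steps=10 * n * int(np.log(n) + 1), rng=rng)
print(s)
```

In this regime, the paper's result says the chain mixes in O(n log n) steps; the constant 10 above is an arbitrary illustrative choice.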






