Parameter estimation for Gibbs distributions

by David G. Harris, et al.

We consider Gibbs distributions, which are families of probability distributions over a discrete space Ω with probability mass function of the form μ_β^Ω(ω) ∝ e^{β H(ω)} for β in an interval [β_min, β_max] and H(ω) ∈ {0} ∪ [1, n]. The partition function is the normalization factor Z(β) = ∑_{ω∈Ω} e^{β H(ω)}. Two important parameters of these distributions are the partition ratio q = log(Z(β_max)/Z(β_min)) and the counts c_x = |H^{-1}(x)|. These are correlated with system parameters in a number of physical applications and sampling algorithms. Our first main result is to estimate the values c_x using roughly Õ(q/ε²) samples for general Gibbs distributions and Õ(n²/ε²) samples for integer-valued distributions (ignoring some second-order terms and parameters), and we show this is optimal up to logarithmic factors. We illustrate with improved algorithms for counting connected subgraphs and perfect matchings in a graph. A key subroutine we develop is to estimate the partition function Z; specifically, we generate a data structure capable of estimating Z(β) for all values β, without further samples. Constructing the data structure requires Õ(q/ε²) samples for general Gibbs distributions and Õ(n²/ε²) samples for integer-valued distributions. This improves over a prior algorithm of Kolmogorov (2018) which computes the single point estimate Z(β_max) using Õ(q/ε²) samples. We show matching lower bounds, demonstrating that this complexity is optimal as a function of n and q up to logarithmic terms.
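To make the definitions concrete, here is a small illustrative sketch (not the paper's algorithm) of an integer-valued Gibbs distribution with hypothetical counts c_x. It computes the partition function Z(β) exactly, the partition ratio q, and a naive sample-based estimate of Z(β_max)/Z(β_min) via the identity E_{β_min}[e^{(β_max−β_min)H}] = Z(β_max)/Z(β_min):

```python
import math
import random

# Toy integer-valued Gibbs distribution: H takes values in {0} ∪ {1,...,n},
# with counts c_x = |H^{-1}(x)| chosen arbitrarily for illustration.
counts = {0: 1, 1: 3, 2: 5, 3: 2, 4: 1}  # hypothetical counts c_x

def Z(beta):
    """Exact partition function Z(beta) = sum_x c_x * e^{beta * x}."""
    return sum(c * math.exp(beta * x) for x, c in counts.items())

beta_min, beta_max = 0.0, 1.5
q = math.log(Z(beta_max) / Z(beta_min))  # partition ratio

def sample_H(beta, rng):
    """Draw H(omega) with probability c_x * e^{beta * x} / Z(beta)."""
    weights = [c * math.exp(beta * x) for x, c in counts.items()]
    return rng.choices(list(counts), weights=weights, k=1)[0]

# Naive single-point estimator of Z(beta_max)/Z(beta_min) from samples
# at beta_min, using E[e^{(beta_max - beta_min) H}] = Z(beta_max)/Z(beta_min).
rng = random.Random(0)
m = 200_000
est = sum(math.exp((beta_max - beta_min) * sample_H(beta_min, rng))
          for _ in range(m)) / m
print(q, math.log(est))
```

This single-point estimator has variance growing with the gap β_max − β_min; the paper's contribution is a data structure answering Z(β) for all β at once with near-optimal sample complexity, which this toy estimator does not capture.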






