Estimating Ising Models from One Sample

04/20/2020
by Yuval Dagan, et al.

Given one sample X ∈ {±1}^n from an Ising model Pr[X = x] ∝ exp(x^⊤ J x / 2), whose interaction matrix satisfies J := ∑_{i=1}^k β_i J_i for some known matrices J_i and some unknown parameters β_i, we study whether J can be estimated to high accuracy. Assuming that each node of the Ising model has bounded total interaction with the other nodes, i.e. ‖J‖_∞ ≤ O(1), we provide a computationally efficient estimator Ĵ with the high-probability guarantee ‖Ĵ − J‖_F ≤ O(√k), whereas ‖J‖_F can be as high as Ω(√n). Our guarantee is tight when the interaction strengths are sufficiently low. An example application of our result is in social networks, where nodes make binary choices x_1, …, x_n that may be influenced at varying strengths β_i by the different networks J_i to which these nodes belong. By observing a single snapshot of the nodes' behaviors, the goal is to learn the combined correlation structure. When k = 1 and a single parameter is to be inferred, we further show |β̂_1 − β_1| ≤ O(F(β_1 J_1)^{−1/2}), where F(β_1 J_1) is the log-partition function of the model; this was proved in prior work under additional assumptions, and we generalize it to all settings. While our guarantees apply in both the high- and low-temperature regimes, our proof relies on sparsifying the correlation network by conditioning on subsets of the variables so that the unconditioned variables satisfy Dobrushin's condition, i.e. a high-temperature condition, which allows us to apply stronger concentration inequalities. We use this to prove concentration and anti-concentration properties of the Ising model, and we believe this sparsification result has applications beyond the scope of this paper as well.
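To make the one-sample setting concrete, here is a minimal sketch, assuming a Curie-Weiss-style interaction matrix J_1 and a Glauber-dynamics sampler; both are illustrative choices, not the paper's construction. It draws one approximate sample X from Pr[X = x] ∝ exp(x^⊤ J x / 2) with J = β_1 J_1, then recovers β_1 by maximizing the pseudo-likelihood, a standard single-sample estimator for Ising models; the paper's exact estimator and analysis may differ.

```python
# Illustrative sketch: one-sample estimation of a single Ising parameter.
# J_1 below and the sampler are assumptions for demonstration only.
import numpy as np
from scipy.optimize import minimize_scalar

def glauber_sample(J, n_steps=20_000, seed=0):
    """Draw one approximate sample X in {-1,+1}^n from Pr[X = x] ∝ exp(x^T J x / 2)
    via single-site heat-bath (Glauber) updates; mixing is not analyzed here."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    x = rng.choice([-1.0, 1.0], size=n)
    for _ in range(n_steps):
        i = rng.integers(n)
        field = J[i] @ x - J[i, i] * x[i]             # local field on node i
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))   # P(x_i = +1 | x_-i)
        x[i] = 1.0 if rng.random() < p_plus else -1.0
    return x

def neg_log_pseudolikelihood(beta, x, J1):
    """-sum_i log P_beta(x_i | x_-i) for the one-parameter model J = beta * J_1."""
    field = beta * (J1 @ x - np.diag(J1) * x)         # exclude self-interaction
    # -log P(x_i | x_-i) = log(1 + exp(-2 * x_i * field_i))
    return np.sum(np.log1p(np.exp(-2.0 * x * field)))

n = 200
J1 = (np.ones((n, n)) - np.eye(n)) / n                # bounded row sums, as the paper assumes
beta_true = 0.5
x = glauber_sample(beta_true * J1)
res = minimize_scalar(neg_log_pseudolikelihood, args=(x, J1),
                      bounds=(0.0, 2.0), method="bounded")
print(f"true beta = {beta_true}, estimated beta = {res.x:.3f}")
```

The pseudo-likelihood is concave in β (a sum of log-sigmoids of linear functions of β), so a bounded scalar search suffices; the paper's contribution is showing that such single-sample estimation remains accurate beyond the Dobrushin high-temperature regime.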

