Lower bounds for testing graphical models: colorings and antiferromagnetic Ising models

01/22/2019
by Ivona Bezakova et al.

We study the identity testing problem in the context of spin systems, or undirected graphical models, where it takes the following form: given the parameter specification of a model M and a sampling oracle for the distribution μ_M̂ of an unknown model M̂, can we efficiently determine whether M and M̂ are the same? We consider identity testing for both soft-constraint and hard-constraint systems. In particular, we prove hardness results for two prototypical cases, the Ising model and proper colorings, and explore whether identity testing is any easier than structure learning.

For the ferromagnetic (attractive) Ising model, Daskalakis et al. (2018) presented a polynomial-time algorithm for identity testing. We prove hardness results in the antiferromagnetic (repulsive) setting in the same regime of parameters where structure learning is known to require a super-polynomial number of samples. Specifically, for n-vertex graphs of maximum degree d, we prove that if |β| d = ω(log n), where β is the inverse temperature parameter, then there is no polynomial-time identity testing algorithm unless RP = NP. We also establish computational lower bounds for a broader range of parameters under the (randomized) exponential time hypothesis. Our proofs exploit the random-graph gadgets developed in recent works on the hardness of approximate counting, notably Sly (2010).

In the hard-constraint setting, we present hardness results for identity testing of proper colorings. These results rest on the presumed hardness of #BIS, the problem of (approximately) counting independent sets in bipartite graphs. In particular, we prove that identity testing is hard in the same range of parameters where structure learning is known to be hard.
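To make the problem setup concrete, here is a toy sketch (not from the paper; all function names are hypothetical) of the Ising Gibbs distribution μ(σ) ∝ exp(β Σ_{(u,v)∈E} σ_u σ_v) and a naive sampling-based identity tester. It enumerates all 2^n spin configurations, so it runs in exponential time; the paper asks whether polynomial-time testers can exist in the hard regimes.

```python
import itertools, math, random

def ising_dist(edges, n, beta):
    """Exact Gibbs distribution mu(sigma) ∝ exp(beta * sum over edges of s_u * s_v),
    computed by brute-force enumeration of all 2^n spin configurations (toy sizes only).
    beta < 0 is the antiferromagnetic (repulsive) case."""
    configs = list(itertools.product([-1, 1], repeat=n))
    weights = [math.exp(beta * sum(s[u] * s[v] for u, v in edges)) for s in configs]
    Z = sum(weights)  # partition function
    return configs, [w / Z for w in weights]

def naive_identity_test(edges_M, beta_M, oracle, n, samples=20000, tol=0.05):
    """Toy tester: draw samples from the unknown model's oracle and compare empirical
    frequencies against the exact distribution of M via a total-variation estimate.
    Exponential in n, so this only illustrates the task, not an efficient algorithm."""
    configs, probs = ising_dist(edges_M, n, beta_M)
    counts = {s: 0 for s in configs}
    for _ in range(samples):
        counts[oracle()] += 1
    tv = 0.5 * sum(abs(counts[s] / samples - p) for s, p in zip(configs, probs))
    return tv <= tol  # accept "same model" iff empirically close

# Usage: the hidden model is antiferromagnetic (beta = -0.5) on a 4-cycle.
random.seed(0)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
cfgs, ps = ising_dist(edges, 4, -0.5)
def oracle():
    return random.choices(cfgs, weights=ps, k=1)[0]

print(naive_identity_test(edges, -0.5, oracle, 4))  # matching beta: accepts
print(naive_identity_test(edges, 0.5, oracle, 4))   # wrong sign of beta: rejects
```

Note the brutal cost: both the exact distribution and the frequency comparison need all 2^n configurations, which is exactly why the question of sample- and time-efficient identity testing from oracle access alone is nontrivial.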
