Learning a Tree-Structured Ising Model in Order to Make Predictions

04/22/2016
by Guy Bresler, et al.

We study the problem of learning a tree graphical model from samples such that low-order marginals are accurate. We define a distance ("small set TV" or ssTV) between distributions P and Q by taking the maximum, over all subsets S of a given size, of the total variation between the marginals of P and Q on S. Approximating a distribution to within small ssTV allows making predictions based on partial observations. Focusing on pairwise marginals and tree-structured Ising models on p nodes with maximum edge strength β, we prove that max{e^{2β} log p, η^{-2} log(p/η)} i.i.d. samples suffice to obtain a distribution (from the same class) with ssTV at most η from the one generating the samples.
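The ssTV distance defined above can be computed directly for small examples. The sketch below is illustrative only and not from the paper: it assumes both distributions are given as explicit probability vectors over all 2^p sign configurations (feasible only for small p, since it enumerates every state) and brute-forces the maximum over all size-k subsets.

```python
from itertools import combinations, product

def marginal(joint, states, subset):
    """Marginal of `joint` (probs aligned with `states`) on the coordinates in `subset`."""
    marg = {}
    for x, prob in zip(states, joint):
        key = tuple(x[i] for i in subset)
        marg[key] = marg.get(key, 0.0) + prob
    return marg

def ss_tv(p_joint, q_joint, n_vars, k=2):
    """Small-set TV: max over all size-k subsets S of TV(P_S, Q_S)."""
    states = list(product([-1, +1], repeat=n_vars))
    best = 0.0
    for S in combinations(range(n_vars), k):
        pm = marginal(p_joint, states, S)
        qm = marginal(q_joint, states, S)
        keys = set(pm) | set(qm)
        tv = 0.5 * sum(abs(pm.get(x, 0.0) - qm.get(x, 0.0)) for x in keys)
        best = max(best, tv)
    return best

# Hypothetical example on {-1,+1}^2: P uniform, Q perfectly correlated.
P = [0.25, 0.25, 0.25, 0.25]
Q = [0.5, 0.0, 0.0, 0.5]  # all mass on (-1,-1) and (+1,+1)
print(ss_tv(P, Q, n_vars=2, k=2))  # 0.5
print(ss_tv(P, Q, n_vars=2, k=1))  # 0.0 (single-site marginals agree)
```

Note that with k=1 the two distributions are indistinguishable even though their joint behavior differs completely, which is why the paper's guarantee targets pairwise (k=2) marginals.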


Related research

11/23/2022 · Learning and Testing Latent-Tree Ising Models Efficiently
We provide time- and sample-efficient algorithms for learning and testin...

06/07/2021 · Chow-Liu++: Optimal Prediction-Centric Learning of Tree Ising Models
We consider the problem of learning a tree-structured Ising model from d...

11/09/2020 · Near-Optimal Learning of Tree-Structured Distributions by Chow-Liu
We provide finite sample guarantees for the classical Chow-Liu algorithm...

06/14/2022 · On Approximating Total Variation Distance
Total variation distance (TV distance) is a fundamental notion of distan...

12/05/2019 · On the Sample Complexity of Learning Sum-Product Networks
Sum-Product Networks (SPNs) can be regarded as a form of deep graphical ...

01/16/2014 · Learning to Make Predictions In Partially Observable Environments Without a Generative Model
When faced with the problem of learning a model of a high-dimensional en...

10/28/2020 · Sample-Optimal and Efficient Learning of Tree Ising Models
We show that n-variable tree-structured Ising models can be learned comp...
