Outlier-Robust Learning of Ising Models Under Dobrushin's Condition

02/03/2021
by   Ilias Diakonikolas, et al.

We study the problem of learning Ising models satisfying Dobrushin's condition in the outlier-robust setting, where a constant fraction of the samples are adversarially corrupted. Our main result is the first computationally efficient robust learning algorithm for this problem, with near-optimal error guarantees. Our algorithm can be viewed as a special case of an algorithm for robustly learning a distribution from a general exponential family. To prove its correctness for Ising models, we establish new anti-concentration results for degree-2 polynomials of Ising models that may be of independent interest.
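The abstract does not spell out the algorithm, but a standard primitive in this line of robust-statistics work (reducing robust learning of an exponential family to robustly estimating the mean of its sufficient statistics) is iterative spectral filtering: repeatedly find the direction of largest empirical variance and down-weight the samples responsible for it. The sketch below is an illustrative NumPy implementation of that generic filter, not the paper's actual procedure; the function name, stopping threshold, and soft-reweighting rule are assumptions chosen for clarity.

```python
import numpy as np

def filtered_mean(X, max_iters=20, eig_threshold=2.0):
    """Illustrative iterative spectral filter for robust mean estimation.

    Down-weights points with large projections onto the top eigenvector
    of the weighted empirical covariance, stopping once the top
    eigenvalue is small (i.e., no direction looks corrupted).
    Thresholds here are illustrative, not from the paper.
    """
    n = len(X)
    w = np.ones(n) / n
    for _ in range(max_iters):
        mu = w @ X                              # current weighted mean
        Xc = X - mu
        cov = (w[:, None] * Xc).T @ Xc          # weighted covariance
        eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
        if eigvals[-1] <= eig_threshold:
            break                               # covariance looks clean
        v = eigvecs[:, -1]                      # worst (most inflated) direction
        tau = (Xc @ v) ** 2                     # outlier score per sample
        w = w * (1.0 - tau / tau.max())         # soft-remove worst offenders
        w = w / w.sum()
    return w @ X
```

As a sanity check, mixing 10% gross outliers into Gaussian samples moves the naive empirical mean far from the truth, while the filtered mean stays close.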


Related research

- Learning Halfspaces with Tsybakov Noise (06/11/2020): We study the efficient PAC learnability of halfspaces in the presence of...
- Outlier Robust Mean Estimation with Subgaussian Rates via Stability (07/30/2020): We study the problem of outlier robust high-dimensional mean estimation ...
- Robust Learning of Fixed-Structure Bayesian Networks (06/23/2016): We investigate the problem of learning Bayesian networks in an agnostic ...
- Efficiently Learning Adversarially Robust Halfspaces with Noise (05/15/2020): We study the problem of learning adversarially robust halfspaces in the ...
- Outlier-Robust High-Dimensional Sparse Estimation via Iterative Filtering (11/19/2019): We study high-dimensional sparse estimation tasks in a robust setting wh...
- Robust Mean Estimation on Highly Incomplete Data with Arbitrary Outliers (08/18/2020): We study the problem of robustly estimating the mean of a d-dimensional ...
- Robust Learning of Fixed-Structure Bayesian Networks in Nearly-Linear Time (05/12/2021): We study the problem of learning Bayesian networks where an ϵ-fraction o...
