
Resilience: A Criterion for Learning in the Presence of Arbitrary Outliers

by Jacob Steinhardt et al.

We introduce a criterion, resilience, which allows properties of a dataset (such as its mean or best low-rank approximation) to be robustly computed, even in the presence of a large fraction of arbitrary additional data. Resilience is a weaker condition than most other properties considered so far in the literature, and yet enables robust estimation in a broader variety of settings. We provide new information-theoretic results on robust distribution learning, robust estimation of stochastic block models, and robust mean estimation under bounded k-th moments. We also provide new algorithmic results on robust distribution learning, as well as robust mean estimation in ℓ_p-norms. Among our proof techniques is a method for pruning a high-dimensional distribution with bounded 1st moments to a stable "core" with bounded 2nd moments, which may be of independent interest.
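To make the resilience criterion concrete: a dataset is (σ, ε)-resilient around its mean if deleting any ε-fraction of the points moves the mean by at most σ. The sketch below, a toy illustration and not an algorithm from the paper, computes this worst-case mean shift exactly in one dimension (where the adversary's best move is to delete points from one end of the sorted sample); the function name is a hypothetical helper.

```python
import numpy as np

def resilience_1d(x, eps):
    """Empirical one-dimensional resilience: the largest shift in
    the sample mean achievable by deleting up to an eps-fraction
    of the points. (Toy illustration, not the paper's method.)"""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    k = int(np.floor(eps * n))      # number of points the adversary may delete
    mu = x.mean()
    hi = x[k:].mean()               # delete the k smallest -> mean moves up most
    lo = x[:n - k].mean()           # delete the k largest  -> mean moves down most
    return max(hi - mu, mu - lo)
```

For example, nine points at 0 and one at 10 give resilience 1.0 at ε = 0.1 (deleting the outlier drags the mean from 1 to 0), while a well-concentrated Gaussian sample has small resilience, which is what makes its mean robustly estimable.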




Related papers:

Recent Advances in Algorithmic High-Dimensional Robust Statistics

Learning from Untrusted Data

Private Robust Estimation by Stabilizing Convex Relaxations

High-Dimensional Robust Mean Estimation via Gradient Descent

Outlier-robust moment-estimation via sum-of-squares

Robust Mean Estimation under Coordinate-level Corruption