
Resilience: A Criterion for Learning in the Presence of Arbitrary Outliers

03/15/2017
by Jacob Steinhardt, et al.

We introduce a criterion, resilience, which allows properties of a dataset (such as its mean or best low rank approximation) to be robustly computed, even in the presence of a large fraction of arbitrary additional data. Resilience is a weaker condition than most other properties considered so far in the literature, and yet enables robust estimation in a broader variety of settings. We provide new information-theoretic results on robust distribution learning, robust estimation of stochastic block models, and robust mean estimation under bounded kth moments. We also provide new algorithmic results on robust distribution learning, as well as robust mean estimation in ℓ_p-norms. Among our proof techniques is a method for pruning a high-dimensional distribution with bounded 1st moments to a stable "core" with bounded 2nd moments, which may be of independent interest.
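To make the robust mean estimation setting concrete, here is a minimal 1-D sketch (not the paper's algorithm) of a trimmed-mean estimator in the spirit of resilience: when an ε-fraction of the data is arbitrarily corrupted, the naive mean can be dragged arbitrarily far, while the mean of a pruned "core" of the data remains close to the true mean. The function name `trimmed_mean` and the 10% corruption level are illustrative assumptions.

```python
import random


def trimmed_mean(xs, eps):
    """Drop the eps-fraction smallest and largest points, then average.

    A simple robust estimator: the mean of any large subset of the
    remaining "core" cannot be moved far by the trimmed outliers.
    """
    xs = sorted(xs)
    k = int(len(xs) * eps)
    core = xs[k:len(xs) - k] if k > 0 else xs
    return sum(core) / len(core)


random.seed(0)
inliers = [random.gauss(0.0, 1.0) for _ in range(900)]  # true mean is 0
outliers = [100.0] * 100                                # 10% arbitrary corruptions
data = inliers + outliers

naive = sum(data) / len(data)      # pulled toward 100 by the outliers
robust = trimmed_mean(data, 0.1)   # stays near the true mean 0
```

The naive mean lands near 10 here, while the trimmed mean stays within a small bias of 0. In high dimensions this coordinate-wise trick breaks down, which is the regime where criteria such as resilience and the pruning-to-a-stable-core technique from the abstract become relevant.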


Related research:
- 11/14/2019 — Recent Advances in Algorithmic High-Dimensional Robust Statistics
- 11/07/2016 — Learning from Untrusted Data
- 12/07/2021 — Private Robust Estimation by Stabilizing Convex Relaxations
- 05/04/2020 — High-Dimensional Robust Mean Estimation via Gradient Descent
- 11/30/2017 — Outlier-Robust Moment Estimation via Sum-of-Squares
- 02/10/2020 — Robust Mean Estimation under Coordinate-level Corruption