Bayesian Robustness: A Nonasymptotic Viewpoint

07/27/2019
by Kush Bhatia, et al.

We study the problem of robustly estimating the posterior distribution in the setting where observed data can be contaminated with potentially adversarial outliers. We propose Rob-ULA, a robust variant of the Unadjusted Langevin Algorithm (ULA), and provide a finite-sample analysis of its sampling distribution. In particular, we show that after T = Õ(d/ε_acc) iterations, we can sample from p_T such that dist(p_T, p^*) ≤ ε_acc + Õ(ϵ), where ϵ is the fraction of corruptions. We corroborate our theoretical analysis with experiments on both synthetic and real-world data sets for mean estimation, regression and binary classification.
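The abstract does not specify Rob-ULA's robust gradient estimator, so the sketch below is only an illustration of the general idea: run the standard Langevin update, but replace the full-batch gradient (which a single adversarial point can dominate) with a coordinate-wise trimmed mean of per-sample gradients. The function name `rob_ula_sketch`, the trimmed-mean aggregator, and all parameter choices are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def rob_ula_sketch(data, grad_fn, eps, step=1e-3, n_iter=2000, d=1, rng=None):
    """Langevin sampling with a robust (trimmed-mean) gradient estimate.

    data:    (n,) array of observations, up to an eps fraction corrupted
    grad_fn: grad_fn(theta, x_i) -> per-sample gradient of the negative
             log-likelihood, shape (d,)
    eps:     assumed corruption fraction; eps*n points trimmed per side
    NOTE: the trimmed mean is a hypothetical stand-in for the paper's
    robust gradient estimator.
    """
    rng = np.random.default_rng(rng)
    n = len(data)
    k = int(np.ceil(eps * n))              # points trimmed on each side
    theta = np.zeros(d)
    samples = []
    for _ in range(n_iter):
        # Per-sample gradients, shape (n, d).
        grads = np.stack([grad_fn(theta, x) for x in data])
        # Coordinate-wise trimmed mean: drop the k largest and k smallest
        # values in each coordinate, then rescale to a full-sum gradient.
        g_sorted = np.sort(grads, axis=0)
        g_robust = g_sorted[k:n - k].mean(axis=0) * n
        # Unadjusted Langevin step: gradient descent plus Gaussian noise.
        theta = theta - step * g_robust + np.sqrt(2 * step) * rng.standard_normal(d)
        samples.append(theta.copy())
    return np.array(samples)
```

As a usage example in the spirit of the paper's mean-estimation experiment: for a unit-variance Gaussian likelihood, the per-sample negative log-likelihood gradient is `theta - x`, and the sampler recovers a posterior concentrated near the true mean even when a tenth of the data is replaced by gross outliers.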


Related research

05/04/2020
Robust M-Estimation Based Bayesian Cluster Enumeration for Real Elliptically Symmetric Distributions
Robustly determining the optimal number of clusters in a data set is an ...

05/21/2019
Robustness Against Outliers For Deep Neural Networks By Gradient Conjugate Priors
We analyze a new robust method for the reconstruction of probability dis...

10/22/2012
Reducing statistical time-series problems to binary classification
We show how binary classification methods developed to work on i.i.d. da...

04/07/2021
Prediction with Missing Data
Missing information is inevitable in real-world data sets. While imputat...

02/07/2018
Gradient conjugate priors and deep neural networks
The paper deals with learning the probability distribution of the observ...

07/06/2017
Simple Classification using Binary Data
Binary, or one-bit, representations of data arise naturally in many appl...

05/04/2018
Estimating Learnability in the Sublinear Data Regime
We consider the problem of estimating how well a model class is capable ...
