Safe-Bayesian Generalized Linear Regression

10/21/2019
by Rianne de Heide, et al.

We study generalized Bayesian inference under misspecification, i.e., when the model is "wrong but useful". Generalized Bayes equips the likelihood with a learning rate η. We show that for generalized linear models (GLMs), η-generalized Bayes concentrates around the best approximation of the truth within the model for specific η ≠ 1, even under severely misspecified noise, as long as the tails of the true distribution are exponential. We then derive MCMC samplers for generalized Bayesian lasso and logistic regression, and give examples on both simulated and real-world data in which generalized Bayes outperforms standard Bayes by a vast margin.
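The η-generalized posterior described above is proportional to prior(β) · likelihood(β)^η, i.e., the likelihood is tempered by the learning rate before Bayesian updating. The sketch below is a minimal illustration of that idea with a random-walk Metropolis sampler on a 1-D Gaussian linear regression with a N(0, 10²) prior; it is not the paper's lasso or logistic sampler, and all names, priors, and tuning constants here are illustrative assumptions.

```python
import numpy as np

# Illustrative eta-generalized Bayes: target is
#   log prior(beta) + eta * log likelihood(beta),
# sampled with random-walk Metropolis (assumed settings, not the paper's).

rng = np.random.default_rng(0)

# Simulated data from y = 2*x + Gaussian noise
n = 200
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.5, size=n)

def log_generalized_posterior(beta, eta, sigma=0.5):
    # Gaussian log-likelihood (up to an additive constant)
    log_lik = -0.5 * np.sum((y - beta * x) ** 2) / sigma**2
    # N(0, 10^2) prior on the regression coefficient
    log_prior = -0.5 * beta**2 / 10.0**2
    return eta * log_lik + log_prior  # likelihood tempered by eta

def sample(eta, n_iter=5000, step=0.05):
    beta, cur = 0.0, log_generalized_posterior(0.0, eta)
    draws = []
    for _ in range(n_iter):
        prop = beta + step * rng.normal()
        new = log_generalized_posterior(prop, eta)
        # Metropolis accept/reject on the tempered posterior
        if np.log(rng.uniform()) < new - cur:
            beta, cur = prop, new
        draws.append(beta)
    return np.array(draws[1000:])  # discard burn-in

draws = sample(eta=0.5)
print(draws.mean())  # concentrates near the best coefficient in the model
```

With η < 1 the likelihood's influence is damped, widening the posterior; this is what protects generalized Bayes against bad misspecification at the cost of slower concentration when the model happens to be correct.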


Related research:

- A comparison of learning rate selection methods in generalized Bayesian inference (12/21/2020): Generalized Bayes posterior distributions are formed by putting a fracti...
- A General Method for Robust Bayesian Modeling (10/17/2015): Robust Bayesian models are appealing alternatives to standard models, pr...
- Diagnosing model misspecification and performing generalized Bayes' updates via probabilistic classifiers (12/12/2019): Model misspecification is a long-standing enigma of the Bayesian inferen...
- Using bagged posteriors for robust inference and model criticism (12/15/2019): Standard Bayesian inference is known to be sensitive to model misspecifi...
- Approximate Bayes Optimal Pseudo-Label Selection (02/17/2023): Semi-supervised learning by self-training heavily relies on pseudo-label...
- Bayesian Lasso: Concentration and MCMC Diagnosis (02/22/2018): Using posterior distribution of Bayesian LASSO we construct a semi-norm ...
- Hierarchical Bayes Modeling for Large-Scale Inference (08/22/2019): Bayesian modeling is now ubiquitous in problems of large-scale inference...
