Weighted Bayesian Bootstrap for Scalable Bayes

03/12/2018
by Michael Newton, et al.

We develop a weighted Bayesian Bootstrap (WBB) for machine learning and statistics. WBB provides uncertainty quantification by sampling from a high-dimensional posterior distribution. WBB is computationally fast and scalable, using only off-the-shelf optimization software such as TensorFlow. We provide regularity conditions which apply to a wide range of machine learning and statistical models. We illustrate our methodology in regularized regression, trend filtering and deep learning. Finally, we conclude with directions for future research.
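At a high level, a weighted bootstrap of this kind draws random weights for each observation (and for the regularization term, which plays the role of a prior), re-solves the weighted penalized optimization problem many times, and treats the resulting optimizers as approximate posterior draws. The sketch below illustrates that general recipe for ridge regression, assuming the common choice of i.i.d. exponential(1) weights; it is a minimal illustration under those assumptions, not the authors' implementation, and the function wbb_ridge and its arguments are hypothetical.

```python
import numpy as np

def wbb_ridge(X, y, lam=1.0, n_boot=1000, seed=None):
    """Approximate posterior draws of ridge coefficients via a weighted bootstrap."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    draws = np.empty((n_boot, p))
    for b in range(n_boot):
        w = rng.exponential(1.0, size=n)   # random weight per observation
        w0 = rng.exponential(1.0)          # random weight on the penalty (prior) term
        XtWX = X.T @ (w[:, None] * X)      # X' W X with W = diag(w)
        XtWy = X.T @ (w * y)
        # Solve (X'WX + w0*lam*I) theta = X'Wy, the weighted ridge optimum.
        draws[b] = np.linalg.solve(XtWX + w0 * lam * np.eye(p), XtWy)
    return draws

# Toy usage: interval estimates for the coefficients of a simulated linear model.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
theta_true = np.array([2.0, 0.0, -1.0, 0.5, 0.0])
y = X @ theta_true + rng.normal(scale=0.5, size=200)
samples = wbb_ridge(X, y, lam=0.5, n_boot=2000, seed=1)
print(np.percentile(samples, [2.5, 97.5], axis=0))
```

Because each replicate is an independent optimization, the loop parallelizes trivially, and when no closed form exists the inner solve can be replaced by any off-the-shelf gradient-based optimizer.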

Related research

Deep Learning Partial Least Squares (06/26/2021)
High dimensional data reduction techniques are provided by using partial...

Centroid Approximation for Bootstrap (10/17/2021)
Bootstrap is a principled and powerful frequentist statistical tool for ...

A generalized likelihood based Bayesian approach for scalable joint regression and covariance selection in high dimensions (01/14/2022)
The paper addresses joint sparsity selection in the regression coefficie...

Merging Two Cultures: Deep and Statistical Learning (10/22/2021)
Merging the two cultures of deep and statistical learning provides insig...

Scalable Uncertainty Quantification via Generative Bootstrap Sampler (06/01/2020)
It has been believed that the virtue of using statistical procedures is ...

Quantum Bayes AI (08/17/2022)
Quantum Bayesian AI (Q-B) is an emerging field that levers the computati...

Deep Fundamental Factor Models (03/18/2019)
Deep fundamental factor models are developed to interpret and capture no...
