Asymptotic Consistency of Loss-Calibrated Variational Bayes

11/04/2019
by   Prateek Jaiswal, et al.

This paper establishes the asymptotic consistency of the loss-calibrated variational Bayes (LCVB) method. LCVB was proposed in <cit.> as a method for approximately computing Bayesian posteriors in a `loss-aware' manner, and the methodology is also highly relevant to general data-driven decision-making. We establish the asymptotic consistency not only of the calibrated approximate posterior but also of the resulting decision rules, and we further establish the asymptotic consistency of decision rules obtained from a `naive' variational Bayesian procedure.
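To make the decision-theoretic setting concrete, the following is a minimal sketch (not the paper's algorithm) of the `naive' variational Bayes pipeline the abstract refers to: first fit a Gaussian variational approximation to the posterior, then pick the action minimizing expected loss under it. The conjugate Gaussian model and the asymmetric piecewise-linear loss are illustrative assumptions; in this conjugate case the Gaussian variational optimum coincides with the exact posterior, and the expected-loss-minimizing action is a quantile of the approximate posterior, which concentrates at the true parameter as the sample size grows.

```python
import math
import random
from statistics import NormalDist

def vb_posterior(data, prior_mean=0.0, prior_var=1.0, noise_var=1.0):
    """Gaussian variational approximation q = N(m, s2) for a conjugate
    normal-mean model; here the variational optimum is the exact posterior."""
    n = len(data)
    s2 = 1.0 / (1.0 / prior_var + n / noise_var)
    m = s2 * (prior_mean / prior_var + sum(data) / noise_var)
    return m, s2

def decision(m, s2, c=3.0):
    """Under the asymmetric loss l(a, t) = c*max(t - a, 0) + max(a - t, 0),
    the action minimizing E_q[l(a, theta)] is the c/(c+1) quantile of q."""
    return NormalDist(m, math.sqrt(s2)).inv_cdf(c / (c + 1.0))

if __name__ == "__main__":
    random.seed(0)
    theta_true = 2.0
    # As n grows, q concentrates at theta_true, the quantile correction
    # shrinks, and the decision rule converges to theta_true.
    for n in (10, 1000, 100000):
        data = [theta_true + random.gauss(0.0, 1.0) for _ in range(n)]
        m, s2 = vb_posterior(data)
        print(n, round(decision(m, s2), 3))
```

LCVB would instead optimize the variational approximation and the action jointly against a loss-calibrated objective; the sketch above only illustrates the naive two-step procedure whose decision-rule consistency the paper also establishes.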


Related research:

- Frequentist Consistency of Variational Bayes (05/09/2017)
- Loss-calibrated expectation propagation for approximate Bayesian decision-making (01/10/2022)
- Asymptotic Consistency of α-Rényi-Approximate Posteriors (02/05/2019)
- Risk-Sensitive Variational Bayes: Formulations and Bounds (03/12/2019)
- Data-driven Piecewise Affine Decision Rules for Stochastic Programming with Covariate Information (04/26/2023)
- The E-Posterior (01/03/2023)
- Adaptive sampling-based quadrature rules for efficient Bayesian prediction (07/19/2019)
