Nonasymptotic Laplace approximation under model misspecification

05/16/2020
by Anirban Bhattacharya et al.

We present non-asymptotic two-sided bounds on the log-marginal likelihood in Bayesian inference, with the classical Laplace approximation recovered as the leading term. Our derivation permits model misspecification and allows the parameter dimension to grow with the sample size. We make no assumptions about the asymptotic shape of the posterior; instead, we require certain regularity conditions on the likelihood ratio and that the posterior be sufficiently concentrated.
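For context, the classical Laplace approximation referenced above takes the following standard form; the notation here (L_n for the likelihood, π for the prior, θ̂ for the maximizer of the log-posterior, d for the parameter dimension, and H_n for the negative Hessian) is illustrative and not taken from the paper itself:

\[
\log \int_{\mathbb{R}^d} L_n(\theta)\,\pi(\theta)\,d\theta
\;\approx\;
\log L_n(\hat\theta) + \log \pi(\hat\theta)
+ \frac{d}{2}\log(2\pi)
- \frac{1}{2}\log \det H_n(\hat\theta),
\qquad
H_n(\hat\theta) = -\nabla^2 \big(\log L_n + \log \pi\big)(\hat\theta).
\]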


Related research

05/28/2023  Tight Dimension Dependence of the Laplace Approximation
10/17/2022  On the Tightness of the Laplace Approximation for Statistical Inference
01/14/2021  Bayesian inference with tmbstan for a state-space model with VAR(1) state equation
06/12/2023  Tight skew adjustment to the Laplace approximation in high dimensions
04/23/2022  Dimension free non-asymptotic bounds on the accuracy of high dimensional Laplace approximation
02/27/2013  Laplace's Method Approximations for Probabilistic Inference in Belief Networks with Continuous Variables
01/17/2022  Finite samples inference and critical dimension for stochastically linear models
