
Fast and Accurate Variational Inference for Models with Many Latent Variables
Models with a large number of latent variables are often used to fully u...

Forward Amortized Inference for Likelihood-Free Variational Marginalization
In this paper, we introduce a new form of amortized variational inferenc...

Variational approximations using Fisher divergence
Modern applications of Bayesian inference involve models that are suffic...

Try Depth Instead of Weight Correlations: Mean-field is a Less Restrictive Assumption for Deeper Networks
We challenge the longstanding assumption that the mean-field approximati...

Conditional Variational Inference with Adaptive Truncation for Bayesian Nonparametric Models
The scalable inference for Bayesian nonparametric models with big data i...

α-Variational Inference with Statistical Guarantees
We propose a variational approximation to Bayesian posterior distributio...

The Ising distribution as a latent variable model
We show that the Ising distribution can be treated as a latent variable ...
Statistical Inference in Mean-Field Variational Bayes
We conduct a non-asymptotic analysis of mean-field variational inference for approximating posterior distributions in complex Bayesian models that may involve latent variables. We show that the mean-field approximation to the posterior can be well approximated, relative to the Kullback-Leibler divergence discrepancy measure, by a normal distribution whose center is the maximum likelihood estimator (MLE). In particular, our results imply that the center of the mean-field approximation matches the MLE up to higher-order terms, and that there is essentially no loss of efficiency in using it as a point estimator for the parameter in any regular parametric model with latent variables. We also propose a new class of variational weighted likelihood bootstrap (VWLB) methods for quantifying the uncertainty in mean-field variational inference. The proposed VWLB can be viewed as a new sampling scheme that produces independent samples for approximating the posterior. Compared with traditional sampling algorithms such as Markov chain Monte Carlo, VWLB can be implemented in parallel and is free of tuning.
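To make the bootstrap idea concrete, here is a minimal sketch of the classical weighted likelihood bootstrap (Newton and Raftery, 1994) that VWLB builds on, applied to the mean of a normal model with known variance: each draw re-weights the log-likelihood with random weights and returns the weighted MLE, yielding independent approximate posterior samples with no tuning. The function name `wlb_samples` and the toy data are illustrative assumptions, not part of the paper; the authors' VWLB operates on the variational objective rather than this simple weighted MLE.

```python
import numpy as np

def wlb_samples(data, num_draws=1000, seed=0):
    """Illustrative weighted likelihood bootstrap for a normal mean.

    Each draw assigns i.i.d. exponential weights (mean 1) to the
    observations and maximizes the weighted log-likelihood; for the
    normal mean the weighted MLE is just the weighted average.  Draws
    are independent, so they can be computed in parallel.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    samples = np.empty(num_draws)
    for i in range(num_draws):
        w = rng.exponential(scale=1.0, size=data.size)  # random weights
        samples[i] = np.average(data, weights=w)        # weighted MLE
    return samples

data = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0]
post = wlb_samples(data)
```

The samples in `post` concentrate around the sample mean, with spread reflecting posterior-like uncertainty; because no draw depends on any other, the scheme parallelizes trivially, which is the property the abstract highlights for VWLB.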